Monday, 20 January 2025
Show HN: Ergonomically call LLM in bulk from CLI https://bit.ly/42gfabg
Hi! I've found myself repeatedly writing little scripts to make bulk calls to LLMs for various tasks, for example running some analysis over a large list of records. There are a few gotchas to doing this: some service providers impose rate limits, and some models will not reliably return JSON when you ask for it. So I've written a command for this.

What I've tried to do here is let the user break up prompts and configuration as they see fit. For example, you can have a single prompt file that keeps the API key, rate limit, and other settings together, split these up into multiple files, keep some parts local, or override individual parameters. This solves the problem of sharing settings between activities while keeping prompts in simple, committable files of narrow scope.

I hope this can be of use to someone. Thanks for reading.

https://bit.ly/4hoDjAM

January 20, 2025 at 11:22PM
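The two gotchas mentioned above (provider rate limits and unreliable JSON output) are exactly what those throwaway scripts have to handle by hand. A minimal sketch of that pattern in Python, where `call_llm` is a hypothetical stand-in for a real provider call (not part of the tool being announced):

```python
import json
import time

def call_llm(prompt):
    # Hypothetical stand-in: a real script would call an LLM provider here.
    return '{"sentiment": "positive"}'

def call_with_json_retry(prompt, retries=3):
    # Gotcha 1: models don't reliably return JSON, so retry until it parses.
    for _ in range(retries):
        raw = call_llm(prompt)
        try:
            return json.loads(raw)
        except json.JSONDecodeError:
            continue
    raise ValueError("model never returned valid JSON")

def bulk_call(records, rate_limit_per_sec=2):
    # Gotcha 2: providers impose rate limits, so pace the requests.
    min_interval = 1.0 / rate_limit_per_sec
    results = []
    for record in records:
        start = time.monotonic()
        results.append(call_with_json_retry(f"Analyze: {record}"))
        elapsed = time.monotonic() - start
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)
    return results
```

Folding this boilerplate into one command, with the rate limit and retry policy read from committable config files, is the ergonomics win the post describes.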
Labels: Hacker News