Issues: simonw/llm
#507: Ability to execute prompts against an asyncio (non-blocking) API (opened Jun 10, 2024 by simonw; open, 22 comments)
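Issue #507 asks for a way to execute prompts without blocking the event loop. Below is a minimal sketch of what such usage could look like; `llm.get_async_model()` and the awaitable `response.text()` accessor are assumed names for illustration, not details taken from this listing.

```python
# Hypothetical sketch of non-blocking prompt execution (issue #507).
# llm.get_async_model() and the awaitable text() accessor are assumed
# names, not confirmed by this issue listing.
import asyncio

import llm


async def main() -> None:
    # Assumed async counterpart of llm.get_model()
    model = llm.get_async_model("gpt-4o-mini")
    response = await model.prompt("Say hello in one short sentence.")
    print(await response.text())


asyncio.run(main())
```

The point of a non-blocking API is that other coroutines can keep running while the HTTP request to the model is in flight.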
Issues list
#624: Add guidance for supporting a customized OpenAI URL and key (opened Nov 12, 2024 by MonolithFoundation)
#617: Store prompts and system prompts longer than X characters in de-duped table [enhancement] (opened Nov 6, 2024 by simonw)
#610: Abstract out token usage numbers [enhancement] (opened Nov 6, 2024 by simonw)
#602: Ability to configure attachment support for models in extra-openai-models.yaml [attachments, enhancement] (opened Nov 3, 2024 by NightMachinery)
#592: Error uninstall-no-record-file when trying to upgrade to 0.17 using llm install -U llm (opened Oct 29, 2024 by numenbit)
#585: Error in llm chat due to invalid key \e[d from case conversion in key bindings (Windows 11) (opened Oct 24, 2024 by gerardantoun)
#576: Encoding error: 'ascii' codec can't encode character '\xe0' in position 41: ordinal not in range(128) (opened Sep 18, 2024 by SamuelDevdas; a minimal reproduction appears after this list)
#557: Research issue: gather examples of multi-modal API calls from different LLMs [research] (opened Aug 26, 2024 by simonw)
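The message quoted in #576 is Python's standard UnicodeEncodeError, raised when text containing a non-ASCII character such as '\xe0' ("à") is encoded with the ASCII codec, commonly when output is written to a stream whose encoding is ASCII. A minimal reproduction, independent of llm itself:

```python
# Minimal reproduction of the error message quoted in #576.
# Encoding text that contains U+00E0 ("à") with the ASCII codec fails;
# the character is placed at index 41 to match the quoted message.
text = "x" * 41 + "\xe0"

try:
    text.encode("ascii")
except UnicodeEncodeError as err:
    # 'ascii' codec can't encode character '\xe0' in position 41:
    # ordinal not in range(128)
    print(err)
```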