fix(community): allow support for disabling max_tokens args (#21534)
This PR fixes an issue where the LiteLLM provider could not use the
unlimited/infinite token allowance of the underlying provider, because
`max_tokens` always defaulted to a hard cap.

This matters in agent environments, where token usage can grow far
beyond the initially configured value and cause unexpected behavior.
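The effect of the change can be sketched with a minimal, hypothetical params class (not the actual `ChatLiteLLM` implementation; the `request_kwargs` helper is an illustration of the pattern, assuming the cap is simply omitted from the request when unset):

```python
from dataclasses import dataclass
from typing import Any, Dict, Optional


@dataclass
class ChatModelParams:
    """Hypothetical sketch mirroring the patched field defaults."""

    n: int = 1
    max_tokens: Optional[int] = None  # None -> defer to the provider's own limit

    def request_kwargs(self) -> Dict[str, Any]:
        kwargs: Dict[str, Any] = {"n": self.n}
        if self.max_tokens is not None:  # only send a cap when explicitly set
            kwargs["max_tokens"] = self.max_tokens
        return kwargs


# With the old default (256), every request carried a hard cap;
# with None, no max_tokens key is sent at all.
print(ChatModelParams().request_kwargs())                # {'n': 1}
print(ChatModelParams(max_tokens=256).request_kwargs())  # {'n': 1, 'max_tokens': 256}
```

Under this assumption, leaving `max_tokens` unset no longer imposes an arbitrary 256-token ceiling on completions.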
thehapyone committed Jun 27, 2024
1 parent 2a0d678 commit c6f700b
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion libs/community/langchain_community/chat_models/litellm.py
@@ -191,7 +191,7 @@ class ChatLiteLLM(BaseChatModel):
     n: int = 1
     """Number of chat completions to generate for each prompt. Note that the API may
     not return the full n completions if duplicates are generated."""
-    max_tokens: int = 256
+    max_tokens: Optional[int] = None

     max_retries: int = 6
