server: allow filtering llama server response fields #10940

Merged
merged 7 commits on Dec 24, 2024
Changes from 1 commit
clarify docs
ngxson committed Dec 24, 2024
commit 4cf1fef3207d57b2ab39a6956b0acf77777e4bb5
2 changes: 1 addition & 1 deletion in examples/server/README.md
@@ -450,7 +450,7 @@ These words will not be included in the completion, so make sure to add them to

`post_sampling_probs`: Returns the probabilities of the top `n_probs` tokens after applying the sampling chain.

- `requested_fields`: A list of required response fields, for example : `"requested_fields": ["content", "generation_settings/n_predict"]` If there is no field, return an empty json for that field.
+ `requested_fields`: A list of response fields, for example: `"requested_fields": ["content", "generation_settings/n_predict"]`. If the specified field is missing, it will simply be omitted from the response without triggering an error.

**Response format**

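For illustration, here is a minimal client-side sketch of how `requested_fields` might be used against a running llama-server instance. The host, port, endpoint path (`/completion`), and prompt values are assumptions for the example, not taken from this diff:

```python
import json
import urllib.request

# Assumed: a llama-server instance listening on http://localhost:8080 whose
# /completion endpoint accepts the `requested_fields` parameter described above.
payload = {
    "prompt": "Hello",
    "n_predict": 16,
    # Ask the server to return only these fields; nested fields use "/" paths.
    "requested_fields": ["content", "generation_settings/n_predict"],
}

req = urllib.request.Request(
    "http://localhost:8080/completion",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read().decode("utf-8"))

# Per the documented behavior, fields not listed are filtered out, and a
# requested field that does not exist is simply omitted rather than causing
# an error.
print(json.dumps(body, indent=2))
```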