It looks like these prompt templates are largely encoding the model's chat template. For most models I've seen, the chat template is actually stored in the model metadata and applied server-side when using the chat/completions API. Has there been any thought about using the chat API rather than applying the template client-side and calling the lower-level completions API?
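To illustrate the distinction being raised, here is a minimal sketch of the two request shapes. The model name and the template markers are hypothetical; in practice the real chat template ships in the model's metadata and the server applies it for chat/completions requests.

```python
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Generate a question about gardening."},
]

def render_prompt(messages):
    """Client-side templating for the legacy /v1/completions API: the
    caller must know and apply the model's chat template itself.
    (Hypothetical template markers, for illustration only.)"""
    parts = [f"<|{m['role']}|>\n{m['content']}" for m in messages]
    return "\n".join(parts) + "\n<|assistant|>\n"

# Legacy completions request: template baked in client-side.
completions_payload = {
    "model": "example-model",
    "prompt": render_prompt(messages),
}

# Chat completions request: the server applies the template recorded
# in the model metadata, so the client stays template-agnostic.
chat_payload = {
    "model": "example-model",
    "messages": messages,
}
```

The practical upside of the second shape is that the client never has to track per-model template differences — swapping the served model does not require changing the client code.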
This issue has been automatically marked as stale because it has not had activity within 90 days. It will be automatically closed if no further activity occurs within 30 days.
Unstale - some of the research work we're merging in starts moving us over to the chat completions API instead of the generic completions API. So, we will be addressing at least the spirit of this issue.
#461 adds an LLMMessagesBlock that uses the chat completions API, as opposed to the LLMBlock in Pipelines that uses the older completions API. However, adding that block is just the first step towards moving us over - we still need to evaluate how/if we'd move our default pipelines over to the chat completions API instead of the completions API.
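A rough sketch of the difference between the two block styles described above — these are simplified, hypothetical stand-ins, not the actual pipeline classes, which differ in detail:

```python
class LLMBlock:
    """Older style: renders a prompt template client-side and targets
    the /v1/completions API."""

    def __init__(self, model: str, prompt_template: str):
        self.model = model
        # The chat template is effectively baked into prompt_template.
        self.prompt_template = prompt_template

    def build_request(self, **variables) -> dict:
        return {
            "model": self.model,
            "prompt": self.prompt_template.format(**variables),
        }


class LLMMessagesBlock:
    """Newer style: passes structured messages to /v1/chat/completions;
    the server applies the model's own chat template."""

    def __init__(self, model: str):
        self.model = model

    def build_request(self, messages: list[dict]) -> dict:
        return {"model": self.model, "messages": messages}
```

Under this framing, "moving the default pipelines over" amounts to migrating each pipeline's prompt-template files into role-tagged message lists that the messages-based block can send as-is.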
From @gabe-l-hart here