It would be nice in certain situations to be able to pass a list of stop sequences as input:
For example, see: https://platform.openai.com/docs/api-reference/chat/create#chat-create-stop
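As a sketch, a request using OpenAI's `stop` parameter might look like the following; the model name and prompt are illustrative placeholders, and the dict mirrors the request body rather than sending an actual API call:

```python
# Sketch of a chat completion request body using the `stop` parameter
# documented in the OpenAI API reference linked above. Values here are
# placeholders, not a real request.
request = {
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Question: ..."}],
    # Generation halts just before any of these sequences would be emitted.
    "stop": ["Observation:"],
}
print(request["stop"])
```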
When the model would generate one of these sequences, it would instead stop and return the current generation to the user. This enables more advanced prompting techniques like LangChain's original text-based ReAct agent loop, where we want the model to stop before generating an `Observation:` line and instead have it populated by an external tool call.

We can work around this by streaming a generation and cancelling the request when we detect a stop sequence in the accumulated output, but this is a bit less nice.
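The client-side workaround could be sketched roughly like this; `token_stream` stands in for whatever streaming iterator the API exposes, and the truncation mimics typical `stop` parameter semantics (the stop sequence itself is not returned):

```python
def stream_until_stop(token_stream, stop_sequences):
    """Accumulate streamed chunks, cancelling once a stop sequence appears.

    `token_stream` is any iterable of text chunks (a stand-in for a real
    streaming response). Searching the accumulated text, rather than each
    chunk, handles stop sequences split across chunk boundaries.
    """
    accumulated = ""
    for chunk in token_stream:
        accumulated += chunk
        for stop in stop_sequences:
            idx = accumulated.find(stop)
            if idx != -1:
                # Stop sequence detected: drop it and everything after it,
                # and stop consuming the stream (i.e. cancel the request).
                return accumulated[:idx]
    return accumulated

# Note "Observation:" arriving split across two chunks.
fake_stream = iter(["Thought: I should search.\n", "Obser", "vation: ", "tool output"])
print(stream_until_stop(fake_stream, ["Observation:"]))
# prints "Thought: I should search." followed by a newline
```

The downside, as noted above, is that tokens generated after the stop sequence are still produced (and billed) until the cancellation takes effect, which is why server-side stop sequences are preferable.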