400 {"type":"error","error":{"type":"invalid_request_error","message":"messages: text content blocks must be non-empty"}}
Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"messages: text content blocks must be non-empty"}}
at Function.generate (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@anthropic-ai/sdk/src/error.ts:61:14)
at Anthropic.makeStatusError (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@anthropic-ai/sdk/src/core.ts:397:21)
at Anthropic.makeRequest (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@anthropic-ai/sdk/src/core.ts:460:24)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at makeCompletionRequest (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@langchain/anthropic/dist/chat_models.cjs:805:24)
at RetryOperation._fn (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/p-retry/index.js:50:12)
Description
I'm fetching a template from the LangChain Hub that looks something like this...
The problem shows up when I fetch the template locally as a ChatPromptTemplate and manually call formatMessages.
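For reference, the fetch-and-format step is roughly the following (a minimal sketch, not my actual code; the hub handle and the input variable names are placeholders):

import { pull } from "langchain/hub";
import type { ChatPromptTemplate } from "@langchain/core/prompts";

// Pull the prompt from LangChain Hub; the handle is a placeholder.
const prompt = await pull<ChatPromptTemplate>("my-handle/math-eval-prompt");

// Format it into concrete messages; these input variable names are placeholders too.
const messages = await prompt.formatMessages({
  problem: "x - 20 = 100\n\nWhat does x equal?\n\na. 10\nb. 20\nc. 100\nd. 120",
  image_url: "https://example.com/whiteboard.png",
});

console.log(messages);

Logging the formatted messages gives: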
[
  SystemMessage {
    "content": "You are an expert math teacher who evaluates student work….",
    "additional_kwargs": {},
    "response_metadata": {}
  },
  HumanMessage {
    "content": [
      {
        "text": "Here is the math problem:",
        "type": "text"
      },
      {
        "text": "\"x - 20 = 100\n\nWhat does x equal?\n\na. 10\nb. 20\nc. 100\nd. 120\"",
        "type": "text"
      },
      {
        "text": "",
        "type": "text"
      },
      {
        "text": "Here is the student's whiteboard:",
        "type": "text"
      },
      {
        "image_url": {
          "url": "<image url removed here for security purposes>"
        },
        "type": "image_url"
      }
    ],
    "additional_kwargs": {},
    "response_metadata": {}
  }
]
Notice the empty text value:
  {
    "text": "",
    "type": "text"
  },
The ChatPromptTemplate appears to be converting newlines in the prompt template to these empty string items in the formatted messages.
Summary: Claude requires that each item in the message content array have a non-empty value. The way ChatPromptTemplate formats the messages causes Claude to reject them and throw an error.
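For reference, the same 400 should be reproducible with a bare Anthropic SDK call that includes an empty text block (a sketch, not code from this project; the model name is just an example):

import Anthropic from "@anthropic-ai/sdk";

const anthropic = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

// Expected to be rejected with:
// 400 {"type":"error","error":{"type":"invalid_request_error","message":"messages: text content blocks must be non-empty"}}
await anthropic.messages.create({
  model: "claude-3-5-sonnet-20240620",
  max_tokens: 256,
  messages: [
    {
      role: "user",
      content: [
        { type: "text", text: "Here is the math problem:" },
        { type: "text", text: "" }, // empty text block
      ],
    },
  ],
});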
To resolve the issue where the ChatPromptTemplate in LangChain.js is causing empty text content blocks, leading to a "messages: text content blocks must be non-empty" error with Claude 3.5 Sonnet calls, you need to ensure that all message content blocks are non-empty before they are processed.
In your code, you should add checks to ensure that text content is non-empty. Here's an example of how you can modify your code to handle this:
if (typeof castMsg.content === "string" && castMsg.content !== "") {
  assistantMsg.content?.push({
    text: castMsg.content,
  });
} else if (Array.isArray(castMsg.content)) {
  const contentBlocks: ContentBlock[] = castMsg.content.map((block) => {
    if (block.type === "text" && block.text !== "") {
      return {
        text: block.text,
      };
    } else {
      const blockValues = Object.fromEntries(
        Object.entries(block).filter(([key]) => key !== "type")
      );
      throw new Error(
        `Unsupported content block type: ${block.type} with content of ${JSON.stringify(blockValues, null, 2)}`
      );
    }
  });
  assistantMsg.content = [
    ...(assistantMsg.content ? assistantMsg.content : []),
    ...contentBlocks,
  ];
}
Ensure that when you create or modify messages, you check that any text content is not an empty string. If you encounter an empty string, you should either provide a default non-empty value or skip adding that content block to the message. This will prevent the error from occurring [1].
While the above is correct, it doesn't seem like something I should be handling in my code. When I chain a ChatPromptTemplate instance together with ChatAnthropic, this process takes place under the hood from my perspective, so I would imagine ChatPromptTemplate should handle it.
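In the meantime, a possible workaround is to strip the empty text blocks between the prompt and the model. A minimal sketch (the hub handle, input variable names, and model name are the same placeholders as in the sketch above):

import { pull } from "langchain/hub";
import { ChatAnthropic } from "@langchain/anthropic";
import type { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = await pull<ChatPromptTemplate>("my-handle/math-eval-prompt");
const model = new ChatAnthropic({ model: "claude-3-5-sonnet-20240620" });

const chain = prompt
  .pipe((promptValue) => {
    // Drop empty text blocks after formatting, before the messages reach the Anthropic API.
    return promptValue.toChatMessages().map((msg) => {
      if (Array.isArray(msg.content)) {
        msg.content = msg.content.filter(
          (block) => !(block.type === "text" && block.text === "")
        );
      }
      return msg;
    });
  })
  .pipe(model);

const result = await chain.invoke({
  problem: "x - 20 = 100\n\nWhat does x equal?",
  image_url: "https://example.com/whiteboard.png",
});

That unblocks the chain, but it is the kind of cleanup I would expect the prompt formatting (or the Anthropic integration) to do automatically.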
400 {"type":"error","error":{"type":"invalid_request_error","message":"messages: text content blocks must be non-empty"}}
Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"messages: text content blocks must be non-empty"}}
at Function.generate (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@anthropic-ai/sdk/src/error.ts:61:14)
at Anthropic.makeStatusError (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@anthropic-ai/sdk/src/core.ts:397:21)
at Anthropic.makeRequest (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@anthropic-ai/sdk/src/core.ts:460:24)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at makeCompletionRequest (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@langchain/anthropic/dist/chat_models.cjs:805:24)
at RetryOperation._fn (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/p-retry/index.js:50:12)
Description
I'm fetching a template from LangHub that looks something like this...
When I fetch the template locally as a
ChatPromptTemplate
then manually "formatMessage," I get the following:Notice the empty text value:
{
"text": "",
"type": "text"
},
The ChatPromptTemplate appears to be converting newlines in the prompt template to these empty string items in the formatted messages.
Summary: Claude requires that each of the items in the message content array have non-empty values. The way the ChatPromptTemplate is formatting the message causes Claude to reject them and throw an error.
System Info
langchain@0.3.6 | MIT | deps: 12 | versions: 301
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/
keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores
dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.3.6.tgz
.shasum: f4313d202ce168d29bfcf81a551147cd4986779f
.integrity: sha512-erZOIKXzwCOrQHqY9AyjkQmaX62zUap1Sigw1KrwMUOnVoLKkVNRmAyxFlNZDZ9jLs/58MaQcaT9ReJtbj3x6w==
.unpackedSize: 2.9 MB
dependencies:
@langchain/openai: >=0.1.0 <0.4.0
@langchain/textsplitters: >=0.0.0 <0.2.0
js-tiktoken: ^1.0.12
js-yaml: ^4.1.0
jsonpointer: ^5.0.1
langsmith: ^0.2.0
openapi-types: ^12.1.3
p-retry: 4
uuid: ^10.0.0
yaml: ^2.2.1
zod-to-json-schema: ^3.22.3
zod: ^3.22.4
maintainers:
dist-tags:
latest: 0.3.6 next: 0.3.2-rc.0 tag-for-publishing-older-releases: 0.2.20
published 12 hours ago by jacoblee93