
ChatPromptTemplate formatting is breaking Claude 3.5 Sonnet calls #7226

Open
5 tasks done
jeffplourd opened this issue Nov 17, 2024 · 2 comments
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@jeffplourd

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

Here's the code that's throwing the error:

const chain = await this.aiService.createChain({
  modelName,
});

const summary = await chain.invoke(params);

Here's createChain's implementation for reference:

async createChain(options: ChainOptions) {
    const { hubId, modelName, temperature, maxTokens } = options;

    const prompt = await this.getPrompt(hubId); // ChatPromptTemplate
    const model = this.createModel({ modelName, temperature, maxTokens }); // ChatAnthropic
    const parser = new StringOutputParser();

    return prompt.pipe(model).pipe(parser);
}

private createModel(options: ModelOptions) {
    const { modelName } = options;

    return new ChatAnthropic({
      modelName,
    });
}

async getPrompt(hubId: string) {
    return hub.pull<ChatPromptTemplate>(hubId);
}

Error Message and Stack Trace (if applicable)

400 {"type":"error","error":{"type":"invalid_request_error","message":"messages: text content blocks must be non-empty"}}

Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"messages: text content blocks must be non-empty"}}
at Function.generate (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@anthropic-ai/sdk/src/error.ts:61:14)
at Anthropic.makeStatusError (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@anthropic-ai/sdk/src/core.ts:397:21)
at Anthropic.makeRequest (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@anthropic-ai/sdk/src/core.ts:460:24)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at makeCompletionRequest (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/@langchain/anthropic/dist/chat_models.cjs:805:24)
at RetryOperation._fn (/Users/jeffplourd/Desktop/engineering/snorkl-backend/node_modules/p-retry/index.js:50:12)

Description

I'm fetching a template from LangHub that looks something like this...
(Screenshot of the prompt template from LangChain Hub, 2024-11-17.)

When I fetch the template locally as a ChatPromptTemplate and then manually call "formatMessages", I get the following:

[
  SystemMessage {
    "content": "You are an expert math teacher who evaluates student work….",
    "additional_kwargs": {},
    "response_metadata": {}
  },
  HumanMessage {
    "content": [
      {
        "text": "Here is the math problem:",
        "type": "text"
      },
      {
        "text": "\"x - 20 = 100\n\nWhat does x equal?\n\na. 10\nb. 20\nc. 100\nd. 120\"",
        "type": "text"
      },
      {
        "text": "",
        "type": "text"
      },
       {
        "text": "Here is the student's whiteboard:",
        "type": "text"
      },
      {
        "image_url": {
          "url": "<image url removed for security purposes>"
        },
        "type": "image_url"
      }
    ],
    "additional_kwargs": {},
    "response_metadata": {}
  }
]

Notice the empty text value:

{
  "text": "",
  "type": "text"
},

ChatPromptTemplate appears to convert blank lines in the prompt template into these empty-string text items in the formatted messages.

Summary: Claude requires every item in a message's content array to be non-empty. Because ChatPromptTemplate emits empty text blocks, Claude rejects the request with a 400 error.
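Until this is fixed upstream, one possible workaround is to filter empty text blocks out of the formatted content before it reaches the model. This is only a sketch: `stripEmptyTextBlocks` is a hypothetical helper operating on plain content-block arrays, not a LangChain.js API.

```typescript
// Minimal shape of a content block, matching the formatted output above.
type ContentBlock = { type: string; text?: string; [key: string]: unknown };

// Drop "text" blocks whose text is empty or whitespace-only; leave all other
// block types (e.g. "image_url") untouched.
function stripEmptyTextBlocks(blocks: ContentBlock[]): ContentBlock[] {
  return blocks.filter(
    (block) => block.type !== "text" || (block.text ?? "").trim() !== ""
  );
}

const content: ContentBlock[] = [
  { type: "text", text: "Here is the math problem:" },
  { type: "text", text: "" }, // produced from a blank line in the template
  { type: "image_url", image_url: { url: "<image url omitted>" } },
];

const cleaned = stripEmptyTextBlocks(content);
// cleaned keeps only the non-empty text block and the image block.
```

Running this over each formatted message's content (when it is an array) before invoking ChatAnthropic avoids the 400 error without changing the prompt template itself.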

System Info

[email protected] | MIT | deps: 12 | versions: 301
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores

dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.3.6.tgz
.shasum: f4313d202ce168d29bfcf81a551147cd4986779f
.integrity: sha512-erZOIKXzwCOrQHqY9AyjkQmaX62zUap1Sigw1KrwMUOnVoLKkVNRmAyxFlNZDZ9jLs/58MaQcaT9ReJtbj3x6w==
.unpackedSize: 2.9 MB

dependencies:
@langchain/openai: >=0.1.0 <0.4.0
@langchain/textsplitters: >=0.0.0 <0.2.0
js-tiktoken: ^1.0.12
js-yaml: ^4.1.0
jsonpointer: ^5.0.1
langsmith: ^0.2.0
openapi-types: ^12.1.3
p-retry: 4
uuid: ^10.0.0
yaml: ^2.2.1
zod-to-json-schema: ^3.22.3
zod: ^3.22.4

maintainers:

dist-tags:
latest: 0.3.6 next: 0.3.2-rc.0 tag-for-publishing-older-releases: 0.2.20

published 12 hours ago by jacoblee93 [email protected]


dosubot bot commented Nov 17, 2024

To resolve the "messages: text content blocks must be non-empty" error from Claude 3.5 Sonnet, ensure that every text content block is non-empty before the messages are sent to the model.

In your code, you should add checks to ensure that text content is non-empty. Here's an example of how you can modify your code to handle this:

if (typeof castMsg.content === "string" && castMsg.content !== "") {
  assistantMsg.content?.push({
    text: castMsg.content,
  });
} else if (Array.isArray(castMsg.content)) {
  const contentBlocks: ContentBlock[] = castMsg.content.map((block) => {
    if (block.type === "text" && block.text !== "") {
      return {
        text: block.text,
      };
    } else {
      const blockValues = Object.fromEntries(
        // Use Object.entries (not Object.values) to get [key, value] pairs.
        Object.entries(block).filter(([key]) => key !== "type")
      );
      throw new Error(
        `Unsupported content block type: ${
          block.type
        } with content of ${JSON.stringify(blockValues, null, 2)}`
      );
    }
  });
  assistantMsg.content = [
    ...(assistantMsg.content ? assistantMsg.content : []),
    ...contentBlocks,
  ];
}

Ensure that when you create or modify messages, you check that any text content is not an empty string. If you encounter an empty string, you should either provide a default non-empty value or skip adding that content block to the message. This will prevent the error from occurring [1].
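Rather than editing each call site, the same check can be applied to a whole message before invoking the model. A sketch, assuming plain message objects; `dropEmptyBlocks` is a hypothetical helper, not part of LangChain.js:

```typescript
type Block = { type: string; text?: string; [key: string]: unknown };
type Message = { role: string; content: string | Block[] };

// Return a copy of the message with empty text blocks removed from array
// content; plain string content is passed through unchanged.
function dropEmptyBlocks(msg: Message): Message {
  if (typeof msg.content === "string") return msg;
  return {
    ...msg,
    content: msg.content.filter(
      (b) => b.type !== "text" || (b.text ?? "") !== ""
    ),
  };
}
```

In a LangChain.js pipeline, a step like this could run between the prompt and the model (for example wrapped in a RunnableLambda), keeping the cleanup out of application code.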


@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Nov 17, 2024
@jeffplourd
Copy link
Author

While the above is correct, this doesn't seem like something I should be handling in my own code. When I chain a ChatPromptTemplate instance together with ChatAnthropic, this formatting happens under the hood from my perspective, so I would expect ChatPromptTemplate (or the ChatAnthropic integration) to handle it.
