I really like how this project looks; it seems like a great alternative to GraphQL Mesh, especially how gRPC translation is handled with directives.
I am, however, a bit wary of the LLM usage.
Do you have any plans to expand how they are used?
Do you think it would be possible to make it opt-in? I'd be happy to provide a PR for that.

Replies: 1 comment
Hey @maciek134, glad to have you on our discussion forum! Tailcall uses LLMs to auto-generate configurations; you can read about the feature here: https://tailcall.run/docs/tailcall-graphql-cli/#llm. It is completely opt-in: if you specify an AI model that we can connect to, we will use it to auto-generate configurations with more meaningful names for your types. We use LLMs because some of our users have had to onboard hundreds of APIs onto our platform, and writing configurations by hand for that many types becomes very cumbersome. LLMs help infer meta-information about the APIs and produce much more meaningful configurations.
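To make the opt-in behaviour concrete, a generation config with the LLM section enabled looks roughly like the sketch below. Field names such as `llm.model` and `llm.secret` are illustrative here; the linked docs are the source of truth for the exact schema.

```json
{
  "inputs": [
    {
      "curl": {
        "src": "https://jsonplaceholder.typicode.com/posts",
        "fieldName": "posts"
      }
    }
  ],
  "output": {
    "path": "./generated.graphql",
    "format": "graphQL"
  },
  "schema": {
    "query": "Query"
  },
  "llm": {
    "model": "gpt-4o",
    "secret": "{{.env.LLM_API_KEY}}"
  }
}
```

Passing a config like this to the CLI's `gen` command would use the configured model to pick readable type names; with the `llm` block omitted entirely, no model is ever contacted and generation simply proceeds without it.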