Autogen with 2 (or more) models #145
Anto79-ops started this conversation in Ideas (1 comment)
-
It turns out that with the new version of Ollama, you don't need LiteLLM to use Autogen: Ollama now supports the OpenAI v1 chat-completions API. For example, here is a sample Autogen setup that uses two Ollama models, a coder model and Mistral.
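A minimal sketch of what such a setup could look like, assuming Ollama is serving its OpenAI-compatible endpoint on the default port 11434; the model names (`codellama`, `mistral`) and agent names are illustrative assumptions, not from the original post:

```python
# Two Ollama models exposed through Ollama's OpenAI-compatible /v1 endpoint,
# each backing a separate Autogen agent. The model names and port below are
# assumptions; adjust to whatever `ollama list` shows on your machine.

coder_config = {
    "config_list": [{
        "model": "codellama",                      # assumed coder model
        "base_url": "http://localhost:11434/v1",   # Ollama's v1 endpoint
        "api_key": "ollama",  # Ollama ignores the key, but the client requires one
    }]
}

general_config = {
    "config_list": [{
        "model": "mistral",                        # general-purpose model
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama",
    }]
}

def build_agents():
    """Requires `pip install pyautogen` and a running Ollama server."""
    from autogen import AssistantAgent, UserProxyAgent

    coder = AssistantAgent("coder", llm_config=coder_config)
    generalist = AssistantAgent("generalist", llm_config=general_config)
    user = UserProxyAgent(
        "user",
        human_input_mode="NEVER",
        code_execution_config=False,
    )
    return coder, generalist, user

if __name__ == "__main__":
    coder, generalist, user = build_agents()
    user.initiate_chat(coder, message="Write a Python function that reverses a string.")
```

Because each agent carries its own `llm_config`, the two models can participate in the same conversation while being served by a single Ollama instance.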
-
I was recently dabbling with Autogen and Ollama via LiteLLM.
I'm wondering if you could add support for Autogen, where your 3B model can be used for all the service calls and another model can do everything else, and maybe the two models could even work together.
Autogen has good support for Ollama.