LLM Provider Compatibility
Users ask whether the new AI tool is limited to OpenAI's APIs and request support for local LLMs, open-source models, and alternative providers and model families such as Ollama, Anthropic, Llama, and Bedrock.
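Most of these requests reduce to one mechanism: many alternative backends expose OpenAI-compatible endpoints, so a client that lets you override the API base URL is not actually locked to OpenAI. A minimal sketch of that pattern, assuming the official `openai` Python client and a local Ollama server on its default port (the model name `llama3` is illustrative and must already be pulled):

```python
from openai import OpenAI

# Point the OpenAI client at a local Ollama server instead of api.openai.com.
# Ollama exposes an OpenAI-compatible endpoint under /v1; no real key is
# needed locally, but the client requires a non-empty string.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # illustrative: any model already pulled into Ollama
    messages=[{"role": "user", "content": "Hello from a local LLM"}],
)
print(response.choices[0].message.content)
```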
[Charts omitted: Activity Over Time · Top Contributors · Keywords]
Sample Comments
As opposed to simply being locked into OpenAI APIs as the only option?
What benefits does this bring me vs just using OpenAI's official tools?
I've been using openrouter.ai to use "all LLMs". No subscription, and it can be tied to your editor of choice.
openinterpreter has been doing this for a while with a bunch of LLMs; glad to see first-party support for this use case.
Amazing product! How does the connection to LLMs work? Bring your own LLM/API key?
While it's not explicitly mentioned, why is only the OpenAI API supported? Folks with local LLMs feel left out.
Seems to be missing local LLM / offline support and tied to OpenAI.
Does this work with other open-source LLMs like Qwen3, or with other OpenAI-compatible LLM APIs?
Can this be integrated with local LLMs, or does it only support OpenAI?
You can also use any other LLM provider / any open-source model of your choice!
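The openrouter.ai route mentioned in the comments works the same way: one key, one OpenAI-compatible endpoint, many upstream models. A hedged sketch, assuming an `OPENROUTER_API_KEY` environment variable and OpenRouter's documented `/api/v1` endpoint (the model slug is illustrative):

```python
import os
from openai import OpenAI

# OpenRouter exposes an OpenAI-compatible endpoint that routes a single
# API key to many upstream providers (Anthropic, Meta, Mistral, ...).
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var
)

response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",  # illustrative model slug
    messages=[{"role": "user", "content": "Which provider served this?"}],
)
print(response.choices[0].message.content)
```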