LLM Provider Compatibility

Users ask whether a new AI tool is limited to OpenAI's APIs and request support for local LLMs, open-source models, and alternatives such as Ollama, Anthropic, Llama, and Bedrock.

Trend: Stable (1.3x)
Category: AI & Machine Learning
Comments: 6,951
Years Active: 13
Top Authors: 5
Topic ID: #9850

Activity Over Time

Year   Count
2012   1
2015   1
2016   7
2017   5
2018   5
2019   9
2020   49
2021   77
2022   200
2023   1,970
2024   1,802
2025   2,699
2026   132

Keywords

e.g. BYO LLM, CLI, TUI, openai.com, FAQ, UI, AI, JetBrains, openai, llm, models, api, llms, ai, jetbrains, api key, key, apis

Sample Comments

ancientworldnow, Jun 17, 2023

As opposed to simply being locked into OpenAI APIs as the only option?

namanyayg, Oct 9, 2024

What benefits does this bring me vs just using OpenAI's official tools?

metrix, Jul 13, 2025

I've been using openrouter.ai to use "all LLMs". No subscription, and it can be tied to your editor of choice.

fragmede, Oct 22, 2024

openinterpreter has been doing this for a while with a bunch of LLMs; glad to see first-party support for this use case.

mikigraf, Dec 6, 2023

Amazing product! How does the connection to LLMs work? Bring your own LLM/API key?

3abiton, May 7, 2025

While not explicitly mentioned, why is only the OpenAI API supported? Folks with local LLMs feel left out.

smcleod, Jan 19, 2024

Seems to be missing local LLM / offline support, and tied to OpenAI.

ramkumarkb, Aug 3, 2025

Does this work with other open-source LLMs like Qwen3, or other OpenAI-compatible LLM APIs?

jxyxfinite, Jul 27, 2023

Can this be integrated with local LLMs, or does it only support OpenAI?

Weves, Nov 29, 2023

You can also use any other LLM provider / any open source model of your choice!