Local LLM Support

This cluster covers users asking about and discussing support for running Large Language Models (LLMs) locally on personal hardware rather than through cloud-based services such as OpenAI or ChatGPT, most often for privacy, cost, and capability reasons.

Trend: ➡️ Stable (1.8x)
Category: AI & Machine Learning
Comments: 4,677
Years Active: 16
Top Authors: 5
Topic ID: #614

Activity Over Time (comments per year)

2011: 1
2012: 1
2013: 1
2014: 4
2015: 5
2016: 8
2017: 19
2018: 13
2019: 13
2020: 17
2021: 19
2022: 69
2023: 1,133
2024: 1,243
2025: 1,959
2026: 176

Keywords

RAM, NGL, AI, CPU, LLM, ML, soartificial.com, GPT4, Msty.app, SOTA, locally, local, llm, models, llms, use, local, run, model, cloud, gpt4

Sample Comments

akdev1l (Apr 11, 2024)

There are local LLMs, but my understanding is that they are not as good.

smcleod (Dec 12, 2023)

Nice project, any plans to make it work with local LLMs rather than "open"AI?

dockerd (Nov 22, 2024)

Anything that you wouldn't have run on chatgpt/claude and would need a local LLM for? Or are you using a local LLM because you have the option to run it locally?

tony_landis (Jun 24, 2024)

Any plans for local LLMs? Thanks for making this!

fforflo (May 6, 2024)

Not all LLM models are remote. You can do just fine with local ones.

pmarreck (Apr 23, 2025)

Locally running LLMs might be good enough to do a decent job at this point... or soon will be.

bick_nyers (Dec 6, 2023)

I wonder if they have plans for allowing the usage of a locally hosted LLM?

la64710 (Apr 30, 2023)

Great, but personally I am interested in locally runnable LLMs instead of sending data to a cloud service like ChatGPT.

airstrike (Jan 19, 2026)

You can always run models locally? Local models will become cheaper and faster over time. Don't panic just yet

abhyantrika (Apr 25, 2024)

Is there any product that allows me to plug in my own open-source LLM and run it locally?
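Several of the comments above ask how to point an application at a locally hosted model instead of a cloud API. A minimal sketch of what that looks like in practice, assuming a local Ollama server running on its default port (11434) and that a model such as `llama3` has already been pulled (both are assumptions, not something the comments specify):

```python
import json
import urllib.request

# Default endpoint for a local Ollama server's non-streaming generate API.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns one JSON object whose
        # "response" field holds the full completion text.
        return json.loads(resp.read())["response"]


# Example (requires `ollama serve` running and the model pulled):
#   print(generate("llama3", "Why run an LLM locally?"))
```

Because the request never leaves localhost, no prompt data reaches a third-party service, which addresses the privacy concern raised repeatedly in this cluster.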