AI Regulation Skepticism

This cluster collects criticism of AI regulation efforts, which commenters view as regulatory capture: an attempt by companies such as OpenAI to suppress open-source competitors and innovation rather than to address real risks.

📉 Falling 0.5x · AI & Machine Learning

Comments: 3,694
Years Active: 18
Top Authors: 5
Topic ID: #2046

Activity Over Time

2009: 1
2010: 1
2011: 3
2012: 2
2013: 1
2014: 1
2015: 47
2016: 25
2017: 52
2018: 59
2019: 53
2020: 51
2021: 98
2022: 124
2023: 1,586
2024: 764
2025: 762
2026: 66

Keywords

US, LLM, DOGE, NASA, AI, HN, twitter.com, softwarecrisis.dev, GPT4, HAHAHA, ai regulation, openai, regulated, regulate, models, regulatory, companies, open, llm

Sample Comments

binary132 (Apr 27, 2025)

Sounds like rage bait. They want to get AI regulated.

solarpunk (Jan 31, 2024)

what's the premise here? this thing will become iteratively better until it could potentially be capable of bad outcomes? if that's the case, how many resources do you think should be dedicated to regulating it? more or less than currently identified existential risks? which entities should be paying for the regulatory controls? what's the proposal here? it's odd because only this one single company that is hedging its entire existence on "oh boy what if this

malwrar (Jul 11, 2023)

I think it's safe to say that, if a large entity with monopoly of force wants to stop these models from being used, they probably could. We already have a global surveillance machine watching us all, and normalized content takedowns for lesser reasons like "copyright infringement" and "obscene/exploitative material". Actual manufacturing & distribution of compute power is fairly consolidated, and hacker culture seems to have lost its activist edge in demanding l

tayo42 (Nov 6, 2023)

Isn't this the basis of the conspiracy-theory-like ideas around asking for AI regulation from the government?

whelp_24 (Oct 30, 2023)

AI can be dangerous, but that's not what is pushing these laws; it's regulatory capture. OpenAI was supposed to release their models a long time ago; instead they are just charging for access. Since actually open models are catching up, they want to stop it. If the biggest companies in AI are making the rules, we might as well have no rules at all.

yeknoda (Dec 15, 2022)

If regulation is found to be necessary, here are some options:
- the government could treat OpenAI like an electricity utility, with regulated profits
- OpenAI could be forced to come up with compensation schemes for the human source images; the more the weights get used, the higher the payout
- the users of the system could be licensed to ensure proper use and that royalties are paid to the source creators. We issue driving licenses, gun licenses, factory permits etc. Licenses are for potent

nomilk (May 17, 2023)

Won't this just push AI development out of the US?

3np (Aug 16, 2024)

There're serious lobbying efforts by industry incumbents and authoritarians to actually apply arms-like restrictions for access to AI. You'd need a special license to be able to create/run your own powerful models legally. I presume that's what they're getting at.

gitpusher (Mar 13, 2025)

HAHAHA. Remember when Sam was absolutely frothing at the mouth to "regulate AI" two years ago?
> https://www.nytimes.com/2023/05/16/technology/openai-altman-...
> https://edition.cnn.com/2023/06/09/tech/korea-altman-chatgpt-ai-regulation

debacle (Jan 31, 2024)

OpenAI seems to be transitioning from an AI lab to an AI fearmongering regulatory mouthpiece. As someone who lived through the days when encryption technology was highly regulated, I am seeing parallels. The Open Source cows have left the Proprietary barn. Regulation might slow things. It might even create a new generation of script kiddies and hackers. But you aren't getting the cows back in the barn.