AI Regulation Skepticism
The cluster focuses on criticism of AI regulation efforts, viewing them as regulatory capture by companies like OpenAI to suppress open-source competitors and innovation rather than addressing real risks.
Sample Comments
Sounds like rage bait. They want to get AI regulated.
what's the premise here? this thing will become iteratively better until it could potentially be capable of bad outcomes? if that's the case, how many resources do you think should be dedicated to regulating it? more or less than currently identified existential risks? which entities should be paying for the regulatory controls? what's the proposal here? it's odd because only this one single company that is hedging its entire existence on "oh boy what if this
I think it's safe to say that, if a large entity with monopoly of force wants to stop these models from being used, they probably could. We already have a global surveillance machine watching us all, and normalized content takedowns for lesser reasons like "copyright infringement" and "obscene/exploitative material". Actual manufacturing & distribution of compute power is fairly consolidated, and hacker culture seems to have lost its activist edge in demanding l
isn't this the basis of the conspiracy-theory-like ideas around asking for AI regulation from the government
AI can be dangerous, but that's not what is pushing these laws, it's regulatory capture. OpenAI was supposed to release their models a long time ago; instead they are just charging for access. Since actually open models are catching up, they want to stop it. If the biggest companies in AI are making the rules, we might as well have no rules at all.
If regulation is found to be necessary, here are some options:
- government could treat OpenAI like an electricity utility, with regulated profits
- OpenAI could be forced to come up with compensation schemes for the human source images. The more the weights get used, the higher the payout
- the users of the system could be licensed to ensure proper use and that royalties are paid to the source creators. We issue driving licenses, gun licenses, factory permits etc. Licenses are for potent
Won't this just push AI development out of the US?
There are serious lobbying efforts by industry incumbents and authoritarians to actually apply arms-like restrictions on access to AI. You'd need a special license to be able to create/run your own powerful models legally. I presume that's what they're getting at.
HAHAHA. Remember when Sam was absolutely frothing at the mouth to "regulate AI" two years ago?
> https://www.nytimes.com/2023/05/16/technology/openai-altman-...
> https://edition.cnn.com/2023/06/09/tech/korea-altman-chatgpt-ai-regulation
OpenAI seems to be transitioning from an AI lab to an AI-fearmongering regulatory mouthpiece. As someone who lived through the days when encryption technology was highly regulated, I am seeing parallels. The Open Source cows have left the Proprietary barn. Regulation might slow things. It might even create a new generation of script kiddies and hackers. But you aren't getting the cows back in the barn.