AI has a vibrant open source scene and is definitely not owned by a few people.
A lot of the data to train it is owned by only a few people, though. It's record companies and publishing houses winning their lawsuits that will lead to dystopia. It's a shame to see so many people actually cheering them on.
So long as there are big players releasing open weights models, which is true for the foreseeable future, I don’t think this is a big problem. Once those weights are released, they’re free forever, and anyone can fine-tune based on them, or use them to bootstrap new models by distillation or synthetic RL data generation.
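For anyone unfamiliar, "distillation" here just means training a new model to imitate an existing open-weights one. A minimal sketch of what that looks like with PyTorch and Hugging Face Transformers, where the checkpoint names, learning rate, and temperature are placeholders I made up (and both models are assumed to share a tokenizer), not anyone's actual recipe:

    import torch
    import torch.nn.functional as F
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Hypothetical checkpoint names; stand-ins for any open-weights teacher/student pair.
    teacher_name = "open-org/big-teacher"
    student_name = "open-org/small-student"

    tok = AutoTokenizer.from_pretrained(teacher_name)
    if tok.pad_token is None:
        tok.pad_token = tok.eos_token
    teacher = AutoModelForCausalLM.from_pretrained(teacher_name).eval()
    student = AutoModelForCausalLM.from_pretrained(student_name)
    opt = torch.optim.AdamW(student.parameters(), lr=1e-5)

    def distill_step(texts, temperature=2.0):
        batch = tok(texts, return_tensors="pt", padding=True, truncation=True)
        with torch.no_grad():
            teacher_logits = teacher(**batch).logits
        student_logits = student(**batch).logits
        # Soften both distributions and minimise KL(teacher || student).
        loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(teacher_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature**2
        opt.zero_grad()
        loss.backward()
        opt.step()
        return loss.item()

The point is just that once the teacher's weights are public, a loop like this needs nothing from the original company.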
Truer words have never been said.
The AI business is owned by a tiny group of technobros who have no concern for what they have to do to get the results they want ("fuck the copyright, and especially fuck the natural resources"), who want to be personally seen as the saviours of humanity (despite not being the ones who invented and implemented the actual tech), and who, like all big-wig biz boys, want all the money.
I don't have a problem with AI tech in principle, but I hate the current business direction and what the AI business encourages people to do and use the tech for.
Well, I'm on board with "fuck intellectual property." If OpenAI doesn't publish the weights, then all their datacenters get visited by the Killdozer.
I don't really agree that this is the biggest issue; for me, the biggest issue is power consumption.
That is a big issue, but excessive power consumption isn’t intrinsic to AI. You can run a reasonably good AI on your home computer.
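To make that concrete, here's a minimal sketch of local inference with the Hugging Face transformers pipeline; the model name is just one example of a small open-weights checkpoint that fits on consumer hardware, not an endorsement of any particular model:

    from transformers import pipeline

    # Any small open-weights checkpoint works here; this one is only an example.
    generator = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")
    out = generator("Why does local inference matter?", max_new_tokens=100)
    print(out[0]["generated_text"])

No datacenter involved, and nothing in it phones home.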
The AI companies don't seem concerned about the diminishing returns, though, and will happily spend 1000% more power to gain that last 10% of intelligence. In a competitive market, why wouldn't they, when power is so cheap?
Technological development and the future of our civilization is in control of a handful of idiots.
Like most of the money.
That's… just not true? Current frontier AI models are actually surprisingly diverse; there are a dozen companies from America, Europe, and China releasing competitive models, not to mention the countless finetunes created by the community. And many of them you can run entirely on your own hardware, so no one really has control over how they are used. (Not saying that's necessarily a good thing, just pointing out that Eno is wrong.)
I'd say the biggest problem with AI is that it's being treated as a tool to displace workers, but there is no system in place to make sure that the "value" (I'm not convinced commercial AI has created anything valuable) created by AI is redistributed to the workers it has displaced.
The system in place is "open weights" models. These AI companies don't have a huge head start over the publicly available software, and if the value is there for a corporation, most any savvy solo engineer can slap together something similar.
Wrong. It's that it's not intelligent. If it's not intelligent, nothing it says is of value. And it has no thoughts, feelings, or intent, so it can't be artistic. Nothing it "makes" is of value either.
AI excels at some specific tasks. The chatbots they push on us are a gimmick right now.
COO > Return.
AI will become one of the most important inventions in humankind's history. Apply it to healthcare, science, and finance, and the world will become a better place, especially in healthcare. Hey artists, writers: you cannot stop intellectual evolution. AI is here to stay. All we need is a proven way to differentiate real art from AI art: an invisible watermark that can be scanned to reveal its true "raison d'être." Sorry for going off topic, but I agree that AI should be more open to verification of its use of copyrighted material. Don't expect compensation, though.
Like Sam Altman, who invests in Prospera, a private "Start-up City" in Honduras where the board of directors picks and chooses which laws apply to them!
The switch to Techno-Feudalism is progressing far too quickly for my liking.
Why is this message not being drilled into everyone's heads? Sam Altman: go to prison or publish your stolen weights.
The biggest problem with AI is the damage it’s doing to human culture.
And not solving any of the stated goals at the same time.
It’s a diversion. Its purpose is to divert resources and attention from any real progress in computing.