I’m rather curious to see how the EU’s privacy laws are going to handle this.
(Original article is from Fortune, but Yahoo Finance doesn’t have a paywall)
“AI model unlearning” is the equivalent of saying “removing a specific feature from a compiled binary executable”. So, yeah, basically not feasible.
But the solution is painfully easy: you remove the data from your training set (i.e., the source code) and re-train your model (recompile the executable).
Yes, it may cost you a lot of time and money to accomplish this, but such are the consequences of breaking the law. Maybe be extra careful about obeying laws going forward, eh?
Far cheaper to just buy politicians and change the law.
Just ask the AI to do it for you. Much better return on investment.
It takes so much money to retrain models though… like the entire cost all over again… and what if they find something else?
Crazy how murky the legalities are here… just no case law to base anything on, really.
For people who don’t know how machine learning works, at a very high level:
basically, every input the AI is trained on or “sees” changes a set of weights (floating-point numbers). Once the weights are changed, you can’t remove that input and revert the weights to what they were; you can only keep changing them with new input.
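To make that concrete, here’s a toy sketch in plain Python (a made-up single-weight model and made-up numbers, not any real framework). The point is that each training step’s update is computed from the weight’s *current* value, so you can’t just “subtract” an example’s update later and land where you would have been without it:

```python
def sgd_step(w, x, y, lr=0.1):
    # one gradient-descent step on squared error for a 1-parameter model y ≈ w*x;
    # note the gradient (w*x - y)*x depends on the CURRENT value of w
    grad = (w * x - y) * x
    return w - lr * grad

w0 = 0.5                                  # starting weight

# train on two examples in sequence
w1 = sgd_step(w0, x=2.0, y=2.0)           # example A nudges the weight
w2 = sgd_step(w1, x=1.0, y=3.0)           # example B nudges it again

# the weight you'd have if example A had never been seen at all
w_without_a = sgd_step(w0, x=1.0, y=3.0)

# naive "unlearning": re-apply example A with a negated learning rate
w_unlearned = sgd_step(w2, x=2.0, y=2.0, lr=-0.1)

# example A's gradient was computed at a different weight value each time,
# so undoing it later does not recover the "never trained on A" weight
print(w_without_a, w_unlearned)           # two different values
```

Scale that up to billions of weights and billions of inputs and you get the “scrambled eggs” problem people in this thread are describing.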
So we just let them break the law without penalty because it’s hard and costly to redo the work that already broke the law? Nah, they can put time and money towards safeguards to prevent themselves from breaking the law if they want to try to make money off of this stuff.
Much like DLLs exist for compiled binary executables, could we not have modular AI training data? Then only a small chunk would need to be relearned at a time.
Just throwing this into the void here.
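Not entirely into the void: this is roughly the idea behind “SISA” sharded training in the machine-unlearning research literature. Split the data into shards, train an independent sub-model per shard, and combine their outputs; deleting someone’s data then only costs retraining the one shard that held it. A toy sketch (made-up records, and a trivial “mean of labels” stand-in for a real sub-model):

```python
def fit_shard(records):
    # stand-in for "train a sub-model on this shard": just the mean label
    return sum(y for _, y in records) / len(records)

def predict(shard_models):
    # stand-in for ensembling: average the sub-models' outputs
    return sum(shard_models) / len(shard_models)

# three shards of (user_id, label) records
shards = [
    [("alice", 1.0), ("bob", 3.0)],
    [("carol", 2.0), ("dave", 4.0)],
    [("erin", 5.0), ("frank", 1.0)],
]
models = [fit_shard(s) for s in shards]

# "unlearn" bob: only shard 0 is re-fit; shards 1 and 2 are untouched
shards[0] = [r for r in shards[0] if r[0] != "bob"]
models[0] = fit_shard(shards[0])
print(predict(models))
```

The trade-off is that each sub-model sees less data, which can hurt quality, so it isn’t free, but it turns “retrain everything” into “retrain one chunk”, much like relinking one DLL.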
The difference between having or not having something in the training set of a Neural Network shows up as different values for non-integer factors all over the network and, worse, those differences are just as likely to be tiny as they are to be massive.
Or to give you a decent metaphor for it, “it would be like trying to remove a specific egg from a bowl of scrambled eggs”.
fuck laws
Man, fuck these user data protection laws, hate em
The issue is the ownership of the AI; if it were not ownable or instead owned by everyone, there wouldn’t be an issue.
Ah yes, let’s just quickly switch the mode of production in this industry, I’m sure that’s going to happen.
I also don’t want my data to be processed by the fully automated luxury gay space machine learning algorithms either.
Delete the AI and restart the training from the original sources minus the information it should not have learned in the first place.
And if they claim “this is more complicated than that” you know their process is f-ed up.
Or, you know, if it’s impossible to strip out individual data, and it’s too expensive to retrain models with the data removed… why is everyone overlooking “just don’t process private data, and only use public data in model training”?
Yeah. Penalise it heavily so if you need to make a model, make manually vetting the data the most affordable option.
Ultimately, ensuring models are trained on safe, good, legal data, and not just random bullshit scraped off of the internet, will just be a net positive overall.
Along those lines, perhaps you put in a stipulation that you don’t have to toss the model if you instead give the person a significant sum in royalties. After all, if their data isn’t a lynchpin in the model, you didn’t need it in the first place, and if it is crucial, you should pay them accordingly.
Punitive regulations seem to be the best way to make companies grow a sense of ethics.
Removed by mod
Sounds like bullshit.
Have you tried…
format Earth

Then why did they put it in in the first place, no? 👁👄👁
Everyone in the thread is so triggered, lol. Do you hear yourselves?