The people loudly departing the biggest AI companies aren't necessarily looking for fatter paychecks or more stock options; they’re worried that AI businesses are putting profits over sanity and safety.
And it’s not even working. Not one of the AI companies is profitable. So they’re putting the mere hope of future profits over sanity and safety.
Steve Burke (of GN) described the absurdity pretty well, within the context of the currently uncertain Nvidia and OpenAI deal:
Nvidia offered OpenAI $100B in investment, money that it didn’t have, as long as OpenAI gave that money back to Nvidia to lease GPUs that haven’t been made, to then put in data centres that haven’t been constructed, which will be powered by electricity that hasn’t come online, to then rent to users who haven’t subscribed, to provide them features that haven’t come to fruition.
“Fuck me Sam, I don’t have any more ideas on how to turn a profit. We’ve tried everything. How about we just give the AI its own infrastructure and bank account with the instructions ‘make money’ and see what it does? I know that safety guy advised against it before, but he no longer works here. I mean, if it becomes a singularity event, at least it’s our singularity event to control.”
In a way this is what’s most scary: they are desperate. Safety concerns be damned, they are all racing to be the first to make a breakthrough toward AGI.
If we ever get there, this is not the way it should be done. I hope they remember that we need to have a world where they can spend their money.
That’s a tall ask for desperate people to consider.
What about all the user data they sell to third parties? I’d be interested to know how that contributes to this.