Feel like this belongs in !fuck_ai@lemmy.world
Think I should cross-post?
For those who downvoted without reading the link… This whole thing is satire.
The aim of this satirical campaign is to use humour to connect your personal needs for Earth’s resources with the evidence of just how much of these resources are now being claimed for the data centres running generative AI. And to make you smile.
Sorry! I added it now.
prototype_g2@lemmy.ml to Technology@lemmy.world • OpenAI Pleads That It Can’t Make Money Without Using Copyrighted Materials for Free • English • 1 • 11 months ago

I don’t think you understand exactly how these machines work. The machine does not “learn”; it does not extract meaning from the tokens it receives. Here is one way to look at it.
Suppose you have a sequence of symbols: ¹§ŋ¹§ŋ¹§ŋ¹§ŋ You are then given a fragment of the sequence and asked to guess what would be the most likely symbol to follow it: ¹§ Think you could do it? I’m sure you would have no trouble solving this example. But could you build a machine that could reliably accomplish this task, regardless of the sequence of symbols and regardless of the fragment given? Let’s imagine you did manage to create such a marvellous machine.
If given a large sequence of symbols spanning the length of multiple books, would you say this pattern-recognition machine is able to create anything original? No… because it is simply trying to reproduce its original sequence as closely as possible.
Another question: would this machine ever derive meaning from these symbols? No… how could it?
But what if I told you that these symbols weren’t just symbols? Unbeknownst to the machine, each of these symbols actually represents a word. Behold: ChatGPT.
This is basically the general idea behind generative AI as far as I’m aware. Please correct me if I’m wrong. This is obviously oversimplified.
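To make the analogy concrete, here is a toy sketch of such a “guess the next symbol” machine as a simple frequency table. This is only an illustration of the predict-the-next-token framing, not ChatGPT’s actual architecture (real models use neural networks over learned probability distributions, not lookup tables):

```python
from collections import Counter, defaultdict

def train(sequence, context_len=2):
    """Count which symbol follows each run of context_len symbols."""
    counts = defaultdict(Counter)
    for i in range(len(sequence) - context_len):
        context = sequence[i:i + context_len]
        counts[context][sequence[i + context_len]] += 1
    return counts

def predict(counts, fragment, context_len=2):
    """Return the symbol most often seen after the fragment's tail."""
    context = fragment[-context_len:]
    if context not in counts:
        return None  # never saw this context in the training sequence
    return counts[context].most_common(1)[0][0]

# The example from above: train on the repeating sequence,
# then ask what follows the fragment "¹§".
model = train("¹§ŋ¹§ŋ¹§ŋ¹§ŋ")
print(predict(model, "¹§"))  # prints ŋ
```

The machine never knows what the symbols mean; it only knows which symbol tended to follow which context, which is exactly the point of the argument above.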
Done.