It is lazy. It will be sloppy, shoddily made garbage.
The shame is entirely on the one who chose to use the slop machine in the first place.
I laugh at all these desperate “AI good!” articles. Maybe the bubble will pop sooner than I thought.
It's gonna suck. Because of course they're gonna get bailed out. It's gonna be "too big to fail" all over again.
Because “national security” or some such nonsense.
If it’s not shameful, why not disclose it?
Regardless, I see its uses in providing structure for those who have trouble expressing themselves clearly, but not in providing content, and you should always check every source the LLM quotes to make sure it isn't just nonsense. Basically, if someone else (or even you yourself, with a bit more time) could've written it, I guess it's "okay".
If it’s not shameful, why not disclose it?
https://en.wikipedia.org/wiki/Threshold_of_originality
If you don't disclose it, you can claim copyright even if you have no right to. LLM-generated code is either plagiarism (though lawmakers have shown they don't care about enforcing copyright on training data, which has funny implications) or public domain, because machine generation is not creative human work.
Uh, yes. Yes it is.
It's a shame that I assume this article was written by an LLM, prompted and edited by a person, and thus I have little will to even read it.
You could use an AI to summarize it…
I’ll see myself out.
Ha, fuck yeah it is.
Imagine if the AI bots learn to target and prioritize content not generated by AI (if they aren’t already). Labeling your content as organic makes it so much more appetizing for bots.
Lol. Lmao.