Blade Runner director Ridley Scott calls AI a “technical hydrogen bomb” | “we are all completely f**ked”
In an interview with Rolling Stone, Scott, who has directed several movies featuring AI, was asked if the technology worried him. He said he’s always believed the...
I’m sure that a film director is an expert on the technical underpinnings of large language models, which primarily are used to generate blocks of text that have the appearance of being coherent.
Several departments where I work had massive layoffs in favour of implementing customized versions of GPT-4 chatbots (both client-facing services and internal tools). That’s just the LLM end of AI.
That’s not even considering the generative-image side of AI. I fear for my company’s graphics, web design, and UX/UI teams, who will probably be gone by this time next year.
I work freelance and occasionally needed to partner with artists and other specialists. But I now use various “AI” tools and no longer need to pay people to do that work, as the computer can do it well enough.
I’m not some millionaire, I’m just a guy trying to save money to buy a house one day, so it’s not like a large economic impact, but I can’t be the only one.
UX is not about drawing pictures. That work is already automated by UI kits anyway. UX is about thinking through requirements and research.
I know very well what UX is, having studied it as my major in uni. Senior executives do not know what it is, and they have made and are making decisions to “replace” UX people with LLMs and “prompt engineers”. I see it daily at work.
There is a great disconnect: hiring managers and executives see LLMs as a quick win that will cut costs, and they make those cuts without doing any analysis.
Suits are idiots. No argument there.
I use Copilot in my work, and watching the ongoing freakout about LLMs has been simultaneously amusing and exhausting.
They’re not even really AI. They’re a particularly beefed-up autocomplete. Very useful, sure. I use it to generate blocks of code in my applications more quickly than I could by hand. I estimate that when you add up the pros and cons (there are several), Copilot improves my speed by about 25%, which is great. But it has no capacity to replace me. No MBA is going to be able to do what I do using Copilot.
As for prose, I’ve yet to read anything written by something like ChatGPT that isn’t dull and flavorless. It’s not creative. It’s not going to replace story writers any time soon. No one’s buying ebooks with ChatGPT listed as the author.
Saying this is like saying you’re a particularly beefed-up bacterium. In both cases the two operate on the same basic objective (survive and reproduce for you and the bacterium; guess the next word for the LLM and the autocomplete), but the former is vastly more complex in how it achieves that goal.
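To make that “same objective, very different machinery” point concrete, here is a minimal sketch in Python (purely illustrative, not from the thread): a toy bigram autocomplete that just picks the most frequent next word from a tiny corpus. An LLM is trained on the same next-word objective, only with a huge neural network over a vastly larger corpus, which is where the complexity gap comes from. The corpus and the `autocomplete` helper are invented for the example.

```python
# Toy "beefed-up autocomplete": predict the next word from bigram counts.
# An LLM shares the objective (guess the next token) but replaces these raw
# counts with a large neural network trained on an enormous corpus.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which: an empirical estimate of P(next | current).
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def autocomplete(word: str) -> str:
    """Return the most frequently observed next word after `word`."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unk>"

print(autocomplete("the"))  # -> "cat" (follows "the" twice in the corpus)
print(autocomplete("sat"))  # -> "on"
```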
An 85-year-old film director*
Yes, I thought he was talking about the film industry (“we’re fucked”) and how AI is or would be used in movies. In that case he would be competent to talk about it.
But he’s just confusing science-fiction and reality. Maybe all those ideas he’s got will make good movies, but they’re poor predictions.
You don’t need to be an expert to see a demo and understand what you can do with the tech.
You kinda do, as anyone in tech who has ever had to communicate with customers can attest.