Blade Runner director Ridley Scott calls AI a “technical hydrogen bomb” | “we are all completely f**ked”
In an interview with Rolling Stone, Scott, who has directed several movies featuring AI, was asked if the technology worried him. He said he’s always believed the...
I use Copilot in my work, and watching the ongoing freakout about LLMs has been simultaneously amusing and exhausting.
They’re not even really AI. They’re a particularly beefed-up autocomplete. Very useful, sure. I use it to generate blocks of code in my applications more quickly than I could by hand. I estimate that when you add up the pros and cons (there are several), Copilot improves my speed by about 25%, which is great. But it has no capacity to replace me. No MBA is going to be able to do what I do using Copilot.
As for prose, I’ve yet to read anything written by something like ChatGPT that isn’t dull and flavorless. It’s not creative. It’s not going to replace story writers any time soon. No one’s buying ebooks with ChatGPT listed as the author.
Sigh. Can we please stop this shitty argument?
They are. In a very broad sense. They are just not AGI.
I agree with you but this argument is never gonna go away.
It’s never going to go away. AI is like the “god of the gaps” - as computers come to perform more and more tasks as well as or better than humans, what counts as intelligence will keep shrinking until we’re saying, “sure, it can compose a symphony that people prefer to Mozart, write plays that are preferred over Shakespeare, and paint better than van Gogh, but it can’t nail references to the 1991 TV series Dinosaurs, so can we really call it intelligent??”
So much this. Most people under 40 must have grown up with video games. Shouldn’t they have noticed at some point that the enemies and NPCs are AI-controlled? Some games even say that in the settings.
I don’t see the point of the expression “AGI” either. There’s a fundamental difference between the if-else AI of current games and the ANNs behind LLMs. But no fundamental change is needed to make an ANN-based AI that is more general. At what point along that continuum do we start talking about AGI? And why should that even be a goal in itself? I want more useful and energy-efficient software tools. I don’t care whether they meet some arbitrary definition.
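The contrast between scripted game AI and a learned policy can be made concrete with a toy sketch (purely illustrative - the weights are made up by hand, not trained, and no real game engine works exactly like this):

```python
# Toy contrast: rule-based "game AI" vs. a minimal learned-style policy.
# Both map the same game state to an action; only the mechanism differs.

def npc_if_else(distance: float, health: float) -> str:
    # Classic scripted game AI: hand-written branches.
    if health < 0.2:
        return "flee"
    if distance < 5.0:
        return "attack"
    return "patrol"

def npc_tiny_net(distance: float, health: float) -> str:
    # One "neuron" per action: score = w . state + b.
    # In a real ANN these weights would come from training.
    scores = {
        "flee":   -0.1 * distance - 4.0 * health + 1.0,
        "attack": -0.5 * distance + 1.0 * health + 2.0,
        "patrol":  0.3 * distance + 0.5 * health - 1.0,
    }
    return max(scores, key=scores.get)

print(npc_if_else(3.0, 0.9))   # attack
print(npc_tiny_net(3.0, 0.9))  # attack
```

From the outside both are “the AI” controlling an NPC; the second just sits further along the continuum the comment describes.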
Saying this is like saying you’re a particularly beefed-up bacterium. In both cases the two things share the same basic objective - survive and reproduce for you and the bacterium, guess the next word for the LLM and autocomplete - but the former is vastly more complex in how it achieves that goal.
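The shared “guess the next word” objective can be shown with a toy bigram model (a deliberately crude stand-in: real LLMs are transformers predicting tokens over long contexts, but the shape of the objective is the same):

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count bigrams in a tiny corpus, then
# predict the most frequent follower of a given word. Phone-style
# autocomplete and LLMs both optimize "predict the next token";
# they differ enormously in context length and model capacity.
corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(word: str) -> str:
    # Most common word observed after `word` in the corpus.
    return follows[word].most_common(1)[0][0]

print(predict("the"))  # cat
```

The “beefed-up” part is everything this sketch leaves out: instead of counting pairs of adjacent words, an LLM conditions on thousands of preceding tokens through billions of learned parameters.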