'Open the pod bay door, HAL' — here's how AI became a movie villain
View Date: 2024-12-25 12:47:08
This article was written by a human.
That's worth mentioning because it's no longer something you can just assume. Artificial intelligence that can mimic conversation, whether written or spoken, has been in the news a lot this year, delighting some members of the public while worrying educators, politicians, the World Health Organization, and even some of the people developing AI technology.
Misuse of AI is part of what actors and writers are striking about in Hollywood, and the threat of AI is something Hollywood was imagining long before it was real.
In 1968, for instance, the year before humans first set foot on the moon — and a time when astronauts still used pencils and slide rules to calculate re-entry trajectories because their space capsules had less computing power than a digital watch has today — Stanley Kubrick introduced movie audiences to a sentient HAL-9000 computer in 2001: A Space Odyssey.
HAL (for Heuristically Programmed Algorithmic Computer) introduced itself early in the film by saying, "No 9000 computer has ever made a mistake or distorted information. We are all, by any practical definition of the words, foolproof and incapable of error."
'Open the pod bay door, HAL'
So why was HAL acting so strangely? He (it?) was responsible for maintaining all aspects of a months-long space flight, ferrying astronauts to the moons of Jupiter. Programmed to run the mission flawlessly, the computer's behavior had become alarming, and two of the astronauts had decided to shut down some of its functions. Their plan was short-circuited when HAL, lip-reading a conversation they'd managed to keep him from hearing, cast one of them adrift while he was outside the ship repairing an antenna and refused to let the other back on board.
"Open the pod bay door, HAL" became one of the most quoted film lines of the decade when the computer responded, "I'm sorry, Dave, I'm afraid I can't do that. This mission is too important for me to allow you to jeopardize it."
It's hard to articulate what a genuine shock this was for 1960s movie audiences. There'd been films with, say, robots causing havoc, but they were generally robots doing someone else's bidding. Movie robots, at that point, were about brawn, not brain.
And anyway, malevolent robot stories were precisely the sort of B-movie silliness Kubrick was trying to avoid. So his intelligent machine simply observed (with an unblinking red eye) and, when addressed directly, spoke with a calm, modulated voice, not unlike the one that would be adopted four decades later by Siri and Alexa.
Darwin Among the Machines
Earlier literary notions of "artificial" intelligence — and there were not a lot of them at that point — hadn't really caught the public's imagination. Samuel Butler's 1863 article Darwin Among the Machines is generally thought to be the origin of this species of writing, and it mostly just notes that while humankind invented machines to assist us — and remember, a really sophisticated machine in 1863 was the steam locomotive — we were increasingly assisting them: tending, fueling, repairing.
Over tens of thousands of years, Butler wondered, might humans not evolve in much the same way Darwin's study of natural selection had just established the rest of the plant and animal kingdoms do, to the point that we would become dependent on our devices?
But even when he incorporated that idea a decade later into a satirical novel called Erewhon, expounding for several chapters on self-replicating machines, Butler barely touched on the notion that those machines would develop consciousness. And neither did the influential 19th-century science fiction writers who followed him. H.G. Wells and Jules Verne invented plenty of unorthodox devices as they sent characters to the center of the Earth, and into space and the recesses of time, without ever considering that those devices might want to do things on their own.
The term "artificial intelligence" wasn't even coined (by American computer scientist John McCarthy) until about a dozen years before Kubrick made his Space Odyssey. But HAL made an impression on the public where scientists had not. Within just a couple of years, movie computers didn't just want spaceship domination; in Colossus: The Forbin Project (1970), they wanted to take over the world.
Malignant machines gone viral
And then this notion of technology-run-wild, ran wild. A high school student played by Matthew Broderick nearly started World War III in WarGames (1983) when he thought he was dialing into a computer game company's system but accidentally challenged the Pentagon's defense network to a quick game of "global thermonuclear war." The problem, it soon became clear, was that no one told the defense network they were just "playing."
Elsewhere, mechanical men stopped being all-brawn and got a new dispensation to think for themselves, something fiction had granted them before Hollywood got around to it.
In the 1940s, sci-fi writer Isaac Asimov came up with "Three Laws of Robotics" that would theoretically keep "independent" machines in line. When Asimov's I, Robot stories were turned into a film a half-century or so later, those laws should have reassured Will Smith as he stared down thousands of bots. But he had good reason to be skeptical; he was fighting a robot rebellion.
The Terminator movies effectively put all these themes on steroids — cyborgs in the service of a computerized, sentient, civil-defense network called Skynet, designed to function without any human input. A "Nuclear Fire" and three billion human deaths later, what was left of humanity was engaged in a war against the machines that has so far consumed six films, a TV series, a pair of web series, and innumerable games.
And nuclear blasts weren't necessary to make machine intelligence alarming, a fact cyberpunk-noir established definitively in Blade Runner with its "replicants," and in a Matrix series that reduced all of humanity to a mere power source for machines.
Hollywood's still fighting that vision. Who knows what "The Entity" wants in Mission: Impossible - Dead Reckoning Part One (presumably we'll find out in Part Two), but whatever it is, it won't bode well for humanity.
It seems not to have occurred to Tinseltown that AI might do the things it's actually doing — make social media dangerous, or make undergrad writing courses unteachable, or screw up relationships by auto-completing incorrectly. None of those are terribly cinematic, so Hollywood concentrates on exploiting our fears — in the late 20th century, we worried about ceding control to technology. In the 21st century, we worry about losing control of technology.
Bring on the droids
Have there also been friendlier film visions of AI? Sure. George Lucas came up with lovable droids R2-D2 and C-3PO for Star Wars, and Pixar gave us Wall-E, a bot who was pluckily determined to clean up an entire planet we'd despoiled.
Spike Jonze's drama Her imagined a sentient, Siri-like personal assistant as a digital girlfriend. Star Trek's Data was not just a Next Generation android version of Mr. Spock, but also a sort of emotion-challenged Pinocchio.
And another Pinocchio — this one fashioned to stand the test of time — would have been Stanley Kubrick's own answer to the question he'd posed with HAL in 1968.
Kubrick labored for decades to hone the script for A.I. Artificial Intelligence, then, just two years before he died, handed the project off to Steven Spielberg. It tells the story of David, a robot child who has been programmed to love, and who ends up going beyond that programming.
"Until you were born," William Hurt's Professor Hobby told the bionic child he'd modeled on his own son, "robots didn't dream, robots didn't desire unless we told them what to want." The miracle, he went on, was that though David was engineered rather than born, he shared with humans "the ability to chase down our dreams...something no machine has ever done, until you."
That may not have been enough to make David a real boy, but it put a gentle face on what is perhaps our greatest fear about AI – that we are mortal, and it is not.
In the film, David outlives all of humanity, never growing up, never changing. And perhaps because he was played by Haley Joel Osment, or perhaps because Spielberg was calling the shots, or perhaps because the music swelled ... just so — it didn't feel the least bit threatening.