Thursday, May 07, 2015

The great thing about the British tabloids is they never let a strict adherence to the truth get in the way of a headline. Take this one from The Daily Mail: "Stephen Hawking and Elon Musk sign open letter warning of a robot uprising." Pretty ominous, huh?
Artificial intelligence has loomed as a threat to humanity ever since Victor Frankenstein reanimated his first corpse. A.I. dooms us all, whether on Earth ("Terminator" and its sequels) or out in space ("Battlestar Galactica"). If we're lucky, we'll just end up ruled by a despotic supercomputer, as in Joseph Sargent's 1970 thriller "Colossus: The Forbin Project." If we're not lucky — boom.
Two movies now in theaters take different approaches to the A.I. menace. In "Avengers: Age of Ultron," A.I. is the latest threat confronting Marvel's cinematic superheroes. "Ex Machina," meanwhile, confronts us with just how thin the line between intelligence and artificial intelligence really is.
The curious thing about Marvel's movies is they're action movies where the action is typically the least interesting part. The Shaw Brothers, Marvel is not. And "Avengers: Age of Ultron" has a lot of action. It starts with a raid on a Hydra base — Hydra being the bad guys from the Captain America movies and TV's "Agents of S.H.I.E.L.D." — and ends, like the first "Avengers" movie, with a huge battle pitting our heroes against an army of disposable drones. And no, that's not a metaphor for the United States' preferred form of modern warfare. In his second outing for Marvel, Joss Whedon still directs action scenes as if they bore him, and they probably do. They're just this thing he has to do because they're expected, and his boredom is contagious.
The one action set piece that isn't a snooze is the much-promoted heavyweight bout between the Hulk (Mark Ruffalo's CGI stand-in) and Tony Stark (Robert Downey Jr.) wearing his just-for-the-occasion "Hulkbuster" Iron Man suit. One-on-one combat, punctuated by clever quips, beats Michael Bay-style melees every time.
The big bad is an insane robot named Ultron, voiced with malevolent glee by James Spader. Spader needs only his voice to rank with Tom Hiddleston's Loki among Marvel's best villains.
One of Stark's science projects gone awry, Ultron is supposed to protect humanity. Then Ultron decides the best way to do that is by making humanity evolve, and nothing speeds along evolution better than a global extinction event. Oops.
It makes one long for the more modest aims of Gene Hackman's Lex Luthor, plotting to dump California's coastline into the Pacific to further a real estate scheme.
Whedon, who also wrote the script, is on more comfortable turf dealing with characters, and "Age of Ultron" gives him plenty. Joining the previous film's team are "Godzilla's" Aaron Taylor-Johnson and Elizabeth Olsen as Quicksilver and the Scarlet Witch, while Paul Bettany, Stark's helpful A.I. Jarvis in four earlier movies, finally shows up in the flesh as the Vision.
The verbal sparring between Stark, Captain America (Chris Evans) and Thor (Chris Hemsworth), along with the budding romance between Ruffalo's Banner and Scarlett Johansson's Black Widow, keeps lively what otherwise, like "Iron Man 2," feels more like a warm-up than the main act.
"Ex Machina" is definitely a main act, and it's a brilliant one. Screenwriter Alex Garland ("28 Days Later") steps behind the camera for an impressive directorial debut.
Billionaire tech mogul Nathan (Oscar Isaac) invites one of his programmers, Caleb (Domhnall Gleeson), to his secluded home for an experiment. The experiment is to test whether Nathan's android Ava (Alicia Vikander) has achieved artificial intelligence. But when Ava tells Caleb that Nathan cannot be trusted, it kicks off a battle of wits that leaves us to ponder not only what it means to be human, but whether the real danger of artificial intelligence isn't a lack of humanity, but too much of it.
Garland has crafted a first-class sci-fi thriller, with so many twists and red herrings that the ones you see blind you to the ones you don't.
"Ex Machina" may be the smaller film about A.I., but it leaves the bigger impression.
Thursday, March 31, 2011
Culture Shock 03.31.11: Technology dominance is fleeting
In the past week, Amazon.com has quietly released two new products that are likely to significantly escalate the ongoing cold war between the technology world's three major superpowers.
Last week, Amazon opened an app store, offering free and paid applications for devices that run on Google's Android operating system and becoming the first direct challenger to Google's own Android Market.
That also fed rumors that Amazon is working on an Android-powered version of its already popular Kindle e-reader, which might just finally be the device that mounts a serious challenge to Apple's iPad.
Then, Monday night, Amazon rolled out its Amazon Cloud Player, which lets users upload music and videos and then play them from any desktop or notebook computer's Web browser or, using Amazon's Cloud Player app, from any Android-powered smartphone or tablet. Imagine accessing your entire MP3 collection from your work computer or your phone. Now you can do just that, though unless your collection fits in the 5 GB of storage Amazon offers free, or the 20 GB that comes with the purchase of an MP3 album, you'll probably have to buy extra space.
And as the sun rose Tuesday morning, some tech pundits were wondering if Amazon's Cloud was about to rain all over Apple's iTunes.
Both Apple and Google are reportedly working on systems similar to the Amazon Cloud, but now they're playing catch-up. And Apple in particular isn't used to playing from behind.
Notice something interesting yet?
We're talking about the major players in consumer technology in 2011: a book retailer that branched out into digital media, a computer manufacturer that revived its fortunes by selling music and a company that started out with just a website that helped you find other websites. But who's missing from this list?
Microsoft.
Yes, Microsoft — the software giant that dominated personal computing for most of the past 20 years. The company that became so large and so powerful that its competitors accused it of being a monopoly. (How can you be a monopoly if you have competitors?) The Evil Empire targeted by lawsuits and antitrust regulators in both the U.S. and Europe.
That Microsoft. You know, the one that barely seems relevant anymore.
Earlier this month, Microsoft announced it had discontinued its Zune music player. Launched in 2006, Zune was Microsoft's attempt to challenge Apple's iPod, but it came along far too late to make a dent in the market.
Microsoft just recently launched its Windows Phone 7 smartphone system, and from what I've seen it's not bad. But, once again, it's a bit late to Apple and Google's party.
Even though Microsoft still controls the lion's share of desktop operating systems, that's not a safe place to be when the future of computing looks a lot more like something you carry around in a backpack or purse.
In the end, the lawsuits and antitrust cases didn't really hurt Microsoft, but an inability to adapt did, which makes all that hand-wringing about Microsoft's dominance back in the 1990s seem more than a little silly now. Dominance is fleeting.
Just ask Rupert Murdoch, who bought MySpace for $580 million in 2005 and then proceeded to mismanage the website into oblivion, allowing Facebook, which is not without its own annoyances, to become the social networking site of record. Now Murdoch's News Corp. is in talks with the music-video site Vevo for some kind of deal that might salvage something from the investment.
MySpace and Microsoft became complacent, which is why they're no longer of any real consequence.
Meanwhile, Amazon, Google and Apple should consider themselves lucky they have each other to keep them all thinking about the future.
Thursday, August 06, 2009
Culture Shock 08.06.09: I, for one, welcome our android overlords
[Photo caption: Technology that would just as soon kill you.]
Is the day coming when your toaster might try to kill you?
That may depend on how smart the toaster is. One day, it makes your breakfast, and the next it decides it wants to sleep in, or else.
Do toasters dream of electric sheep?
At two conferences held recently in California, researchers in computing and artificial intelligence speculated about a future in which our computers are as smart as we are — or possibly much smarter.
"These are powerful technologies that could be used in good ways or scary ways," said Eric Horvitz, quoted in The Sunday Times of London. Horvitz should know all about scary technologies. He's a principal researcher at Microsoft, which gave us Windows Vista.
We can't say we weren't warned. From HAL 9000 to the replicants of "Blade Runner" to the Cylons to Skynet, science fiction is full of super-intelligent computers, robots and androids who rebel against their human creators.
Then there is the 1970 film "Colossus: The Forbin Project," in which a supercomputer decides to take over the world, for humanity's own good, naturally.
"In time," Colossus says, "you will come to regard me not only with respect and awe, but with love."
With computers becoming faster and more complex, maybe such doomsday fantasies could become real. According to Moore's law, named for Intel co-founder Gordon E. Moore, the number of transistors on a chip, and with it the processing power of our computers, doubles about every two years. Whether that trend will continue is a subject of debate, but some futurists think it will.
Inventor and futurist Ray Kurzweil believes we are about 30 years away from creating a human-level artificial intelligence, a computer just as smart as we are.
If both Kurzweil and Moore are correct, things could get interesting, and soon. In his 2008 book "Future Imperfect: Technology and Freedom in an Uncertain World," David D. Friedman writes, "In forty years, that makes them (computers) something like 100 times as smart as we are. We are now chimpanzees — perhaps gerbils — and had better hope that our new masters like pets."
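Friedman's numbers are just compound doubling. Here's a minimal back-of-the-envelope sketch in Python, assuming capability doubles every two years and reading his 40-year horizon as 10 years, or five doublings, past Kurzweil's human-level mark; that reading is my assumption, not something either author spells out.

```python
# Back-of-the-envelope Moore's law arithmetic. Illustrative only: it assumes
# "smarts" scale with raw processing power and that the doubling never slows.

DOUBLING_PERIOD_YEARS = 2.0

def capability_multiplier(years: float) -> float:
    """Growth factor after `years`, doubling every DOUBLING_PERIOD_YEARS."""
    return 2.0 ** (years / DOUBLING_PERIOD_YEARS)

# Kurzweil's figure: human-level AI in about 30 years.
# Friedman's horizon: 40 years, i.e. 10 years past the human-level mark.
print(capability_multiplier(40 - 30))  # 32.0 -- the ballpark of "100 times"
print(capability_multiplier(40))       # 1048576.0 -- total growth over 40 years
```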
Science-fiction writer Vernor Vinge coined a name for it: the singularity. That's the point at which superhuman intelligences, rather than humans, are driving technological advancement. Each generation of machines creates another that's even smarter.
In a 1993 article, Vinge writes, "Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended."
That's all well and good if the future's super-intelligent robots look like Tricia Helfer, Grace Park and Lucy Lawless — the "Battlestar Galactica" scenario — but there's still the danger they'll decide to nuke us from orbit (because it's the only way to be sure).
Kurzweil is an optimist. He thinks we can beat the robots at their own game. Find a way for human brains to connect to machines, and humanity can take advantage of Moore's law, too.
Yes, mankind's fate may hinge on us becoming cyborgs. But why stop there? We could, as science-fiction author Ken MacLeod has speculated, upload our minds into cyberspace, leaving our flesh to go the way of all flesh, while achieving technological immortality, at least until the universe reaches heat death. Then the lights go out permanently.
Stopping technological advancement isn't an option, but if becoming a cyborg seems too extreme, we could try to build something like Isaac Asimov's Three Laws of Robotics into our super-intelligent machines.
The only problem with that, as anyone who has read Asimov's stories knows, is the Three Laws often cause as many problems as they solve.
Maybe it's best just to hope we end up ruled by androids who look like Lucy Lawless. I, for one, welcome our new Cylon overlords.