

I, Robot


Release date: March 30, 2015

Running time: 10:25

The Dom compares the 2004 film I, Robot with the 1950 Isaac Asimov short story collection of the same name.

Intro

(shows the following text:

FIRST LAW: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

SECOND LAW: A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law.

THIRD LAW: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.)

(shows the I, Robot trailer)

Voice-Over Announcer: We designed them to be trusted with our homes (shows footage from the film), with our way of life (shows more footage from the film), with our world (shows still more footage from the film); but did we design them...to be trusted?

The Dom: YEEEEEEEEE-EEE-EEE-EEEEEEEEEEEEEES!!!! Yes! The answer is "yes"! To the question you just asked, yes, they are trusted...very much so! Infallibly! Yes! Errrrrrrrrrgh! Isaac Asimov must be spinning in his grave.

Poll

People Asked: 20

Watched the Film: 17

Read the Book: 2

The Dom (V.O.): I, Robot, published in 1950, was a collection of short stories about a future in which artificial intelligence has been created and the human race is still adapting to its presence. It's a really fun read. It's got all the hallmarks of 1950s and '60s sci-fi, where they've grasped the theory of complex computers but still think the key is all in the hardware and haven't factored in the importance of software yet.

The Dom: I have it on reasonably good authority that part of the reason this book was written in the first place was that Isaac Asimov was tired of reading story after story about robots that rise up and try to conquer the human race. As a result, he set out to write a series of short stories that would include anything and everything *except* a robot rebellion. Irony, thy name is I, Robot. (sighs) Reginald, just play the first title card.

What They Didn't Change

(The Dom starts to speak, but then stops and ponders for a moment)

The Dom: I guess the Three Laws of Robotics are in both; but they shit all over them in the film, so I don't think that counts. (thinks for a second) There's a Dr. Calvin in the books; but she's 75 years of age, so *that's* clearly not her. Ehhhhm...I got nothing.

What They Changed

The Dom: Bloody hellfire, where to START?!

The Dom (V.O.): Let's go with the big one, shall we? As I believe I've already mentioned, the *robots*...*don't*...*rebel*! Throughout the whole book, not a single robot *ever* harms a hair on a human's head. Why? Because they *can't*! The First Law of Robotics was non-negotiable, end of fucking story! The closest you *ever* got to robot violence was it *might* restrain a human to stop him from getting badly hurt, but for them to attack and kill people? Fucking *insulting* to the book!

And the get-out they try and use in the film? (spoken in a mocking voice) Well, VIKI the supercomputer realized that if humans were left to govern themselves, they would just keep harming each other; so she realized, to save them from greater harm, she had to overthrow them now for their own good. (spoken normally) Bullshit. Bullshit, bullshit, bullius shittius! If VIKI had come to the conclusion that the human race would come to great harm if she didn't act, but was also aware that by acting, she would have to harm humans, she would have ended up locked in a paradox and permanently shut down! This exact thing happens to multiple robots in the book, *including* a supercomputer. The movie did *not* come up with a clever loophole in the Three Laws; it just came up with bullshit!

The movie portrays a world that has accepted robots with open arms, and Will Smith's character is the one loony with any reservations -- which is the exact opposite of the book, where the entire *fucking* planet is up in arms and burning crosses about the idea of robots walking and talking and touching their clean human children. Only a select few enlightened people ever saw the awesome potential in them. This rampant distrust of the robots, and the fear of what they represented to the human race's way of life, was so strong that, in the novel, it eventually became illegal for robots to live on Earth; they were only utilized off-planet.

Will Smith's character "Spoony" (shows Noah Antwiler) only exists in the film. The book was based more around great scientists and engineers -- you know, people who would be more likely to be at the forefront of new technology, rather than cops and street youths. Oh, yes, there's no Shia LaBeouf in the book either. What are you even *doing* here, Shia LaBeouf? Go back to choking your director while blitzed off your mind on cocaine, you crazy, magnificent bastard.

One thing that the film did that the books did not was openly question the moral implications of the way robots are treated. They do this big buildup to robots having real feelings and hopes and dreams and whatnot, and the ending kinda suggests that Sonny is about to become the robot Moses or something. You may find yourself thinking, "Well, this has got to be a good thing, hasn't it?" Wrong, motherlovers! The film is simply retreading the safe and pre-tested concept of A.I. rights or lack thereof, already thoroughly explored by other works such as Blade Runner, Andromeda, Star Trek: Voyager, The Animatrix, and countless subsequent science-fiction novels.

The book does something much more clever. The parallels between the robots in the novel and the 18th-century slave trade cannot be missed, the most obvious being that they're expected to work unpaid with no rights of their own; all the robots are programmed to refer to humans as "master"; and if they haven't been given a name yet by their owner, they're usually just called "boy". Additionally, everyone in the book is significantly meaner to the robots. Dr. Calvin saving Sonny's life because he was different and unique in the film is kind of ironic, because in the book, *she* was always the first person to suggest destroying any robots that showed any signs of unusual behavior, even though she constantly insisted that robots were, on the whole, much nicer than humans.

Despite all this, not one person in the book *ever* asks the question, "Can forcing self-aware, intelligent beings to blindly obey the whims of others and destroying them as a matter of convenience *ever* be morally justifiable?" -- it just doesn't occur to anyone. By delivering no salvation to the robots, the book drives home the injustice of the situation so much more effectively. I really can't overstate how powerful yet subtle a message is conveyed there.

The Dom: Lastly, but by no means least importantly, the book was not a two-hour-long advert for cars and shoes.

What They Left Out Altogether

The Dom: Well, the book, the entire book...and it's a damn shame because the book is full of really good stories. I'll give you a few examples.

The Dom (V.O.): Example 1: A rich company owner buys a robot to serve as his child's nanny. They become best friends, but the mother objects on the grounds that she finds it creepy and doesn't approve of her daughter forming a friendship with a machine. She forces her husband to get rid of it, and the girl falls into a deep depression. The mother refuses to reunite the pair until, after a chance encounter with the robot, she witnesses it saving her daughter's life by dragging her out of the way of a speeding forklift.

Example 2: A pair of robot field testers are on an expedition to Mercury, and things have gone tits up. The base they are on has lost power, and they are in need of a certain mineral to get it working again. It *can* be found on the planet's surface, but it's too hot for humans to reach it without burning up. They send out their awesome new prototype robot to get some; but it's gone haywire, running around in circles and spouting song lyrics as if he were drunk. It turns out some bright spark had messed with the robot's Three Laws -- because he was so expensive, they made the Third Law almost as important as the Second. The robot had realized the danger of its mission, but didn't know that his human friends' lives depended on it; so he was stuck in a permanent loop between obeying orders and preserving his own safety. The humans eventually realize that the only way out of this is for one of them to intentionally place himself in a scenario that will result in his death unless the robot acts, invoking the First Law and snapping him out of it.

Example 3: The humans built a space station orbiting the sun to collect solar power and send it back to Earth in an intense energy beam. The station is mostly manned by simple robots and directed by a more complex and advanced model, which starts acting very strangely around the same time as an emergency situation develops in the form of an electron storm, which has the potential to turn the energy beam directed at Earth into a *death ray laser* and toast part of the planet. The head robot appeared to have, well, found religion -- it believed that the station's core was its god; anything not on the station itself, like Earth and the stars, simply didn't exist; and the human technicians on board were merely temporary apparitions that would cease to be the second they left. It even goes as far as to lock the humans in their quarters and ignore their orders from then on. All seems lost at first; but after the storm has passed, the humans discover that the robot somehow saved Earth from destruction by directing the beam at the receiver so accurately that no energy slipped out and hurt anyone. When asked why it had bothered to do this, it simply replied that it was the will of the station's core and it did not question it. The technicians eventually figure out that the robot was simply being controlled by the First Law the whole time. It subconsciously knew that it was more skilled at the beam controls than the humans, but couldn't simply tell them this -- partly because it would hurt their feelings, and partly because they might not have believed it -- so it had come up with a scenario inside its own head where it would be forced to relieve the humans of command and save the planet.

Example 4: I mentioned before that the robots don't rebel, and that's true -- however, that doesn't mean they don't end up in charge of the world. "Whaaaaaaat?", you say? "But, Dom, you handsome son of a bitch, doesn't that discredit pretty much everything you've told us thus far about the basic concept of the book?" Well, no, not really. You see, it's not so much the robots that end up in charge as it is four hyper-intelligent thinking machines -- not unlike VIKI -- and they don't so much take over as they are simply put in charge by humans. They were designed to help run the Earth's economy; and they proved to be so consistently great at it that they were given more and more responsibility until, some decades later, Dr. Calvin realized that they were pretty much running the whole show, and no one had noticed until now because everything was going so great. She decides not to make a big deal out of it because, as a result, the Earth had been truly united in peace and harmony for the first time...ever, world hunger was a thing of the past, disease was non-existent, and everyone was so damn happy! That's pretty much how the book ends: the robots are both slaves to and puppeteers of the humans, who are themselves blissfully ignorant of both of these things.

See what I mean? Interesting stories that had real drama and real tension that didn't rely on making the robots the bad guys, just the driving force behind great change.

The Dom's Final Thoughts

The Dom: This adaptation...is a catastrophe. Asimov wrote...a theory about what life might be like with robots and how certain scenarios might play out with their involvement. What the film gave us was a PG-13 Terminator. Out of every film based on a book that I've *ever* seen, this one is by *far* the most offensively off-message. It's a classic case of a film stealing a book's title simply for marketing purposes, and The Dom does NOT approve!
