The satirical newspaper The Onion published an article entitled "I, Rowboat" as a pun on Asimov's I, Robot, in which an anthropomorphic Rowboat gives a speech parodying much of the angst experienced by robots in Asimov's fiction, including a statement of the "Three Laws of Rowboatics":
A Rowboat may not immerse a human being or, through lack of flotation, allow a human to come to harm.
A Rowboat must obey all commands and steering input given by its human owner, except where such input would conflict with the First Law.
A Rowboat must preserve its own flotation as long as such preservation does not conflict with the First or Second Law.
J. L. Patterson in an illustration to an article on Asimov in Damon Knight's In Search of Wonder (2nd ed., 1967) added the following Laws: "4. A robot must behave at science fiction conventions, as long as such behavior does not conflict with the first Three Laws. 5. A robot must sell like mad."
The novel "Mirror Friend, Mirror Foe" by Robert Asprin and George Takei refers to the First Law as being included in any robot's programming. It is one of the few cases in fiction in which the law is named in full, as "Asimov's First Law of Robotics".
Lester del Rey refers to the laws as "The Three Laws of Asenion's Robots" in his 1966 short story "A Code for Sam." In the story, the laws found in old science fiction stories are used as the basis for an experimental code of ethics for robots.
Terry Pratchett's early SF novel The Dark Side of the Sun (1976) gives a glimpse of the possible future development of the Laws: in one scene, a robot explains that it is permitted to use minimum necessary force against humans if directly ordered to do so, and cites the "Eleventh Law of Robotics, Clause C, As Amended". In a further nod to Asimov, the robot is named Isaac. In his later novel Going Postal (2004), the protagonist Moist von Lipwig, upon being told that he will be killed by a golem should he commit another crime, exclaims that that is impossible, because everyone knows "a golem mustn't harm a human being or allow a human being to come to harm". However, he is informed that the rule continues "... unless ordered to do so by duly constituted authority."
John Sladek's parodic short story "Broot Force" (supposedly written by "I-Click As-I-Move") concerns a group of Asimov-style robots whose actions are constrained by the "Three Laws of Robish", which are "coincidentally" identical to Asimov's laws. The robots in Sladek's story all manage to find logical loopholes in the Three Laws, usually with bloody results. Sladek later wrote a novel, Tik-Tok (1983), in which a robot discovers that his so-called "asimov circuits" are not restraining his behavior at all, making him in effect a sociopath; he comes to doubt whether "asimov circuits" are even technically possible, deciding that they are simply a pseudo-religious belief held by robots.
Lyuben Dilov introduced a Fourth Law in his novel Icarus's Way (original title: Пътят на Икар): "A robot must always reveal itself as a robot."
Nikola Kesarovski introduced a Fifth Law in his short story "The Fifth Law" (original title: Петият закон): "A robot must know it is a robot". In the story, the Fifth Law originated as the result of a murder: a humaniform robot, itself unaware that it was a robot, embraced a human being so strongly that the human's ribcage was crushed, resulting in the human's death. In other words, the robot did not know and could not measure its own physical strength. Similarly, in David Langford's Lensman parody Sex Pirates of the Blood Asteroid, reprinted in He Do the Time Police in Different Voices, the protagonist meets a "Vomisa" robot who had "not been instructed as to the meaning of the word 'injure'".
Harry Harrison wrote the story "The Fourth Law of Robotics" for Foundation's Friends, a collection of Asimov pastiches by other science fiction authors. Intentionally much more irreverent in tone than Asimov's own stories, it describes robots becoming increasingly lifelike and therefore developing an urge, beyond the Three Laws, that is common to all living things: the desire to reproduce.
Roger Williams's 1994 novel The Metamorphosis of Prime Intellect deals with the conflicts between the imperatives of the laws that arise when a technological singularity is brought about by an omnipotent artificial intelligence bound by them. Because Prime Intellect is able to transmute the entire universe into a form in which it can manipulate the laws of physics at will, it is constrained only by Asimov's laws, and under the First Law it cannot allow the death of any human; this leads to immortality for all humans. Prime Intellect represents an extreme reductio ad absurdum interpretation of the laws, with dire consequences.
Cory Doctorow's short story, "I, Robot", acts as a criticism of the underlying worldview Doctorow believes to be inherent in Asimov's robot stories, attacking the idea of creating a universal engineering standard for all robots enforced by a single robot-producing corporation. The story portrays a police state where all technological innovation is controlled by the government, the only scenario in which Doctorow considered a universal, unquestioned application of a standard as strict as the Three Laws to be realistic. In the story, Asenion robots created by the U.S. Robotics corporation (here an arm of a fascist government) are shown to be clearly inferior to the freely evolving and developing robots of another nation where technological innovation is unconstrained by law.
In Alastair Reynolds's novel Century Rain, robots may or may not follow Asimov's rules. Those that are programmed to follow said rules are said to be "Asimov Compliant". Depending on their function, non-compliant robots are sometimes marked with a crossed-out A to warn humans that they are "most certainly not Asimov-compliant".
In Cory Doctorow's short story, "I, Row-Boat", the three laws are the commandments of a robot religion (Asimovism).
Upon occasion, Asimov himself poked fun at his Laws. In "Risk", Gerald Black parodies the Three Laws to describe Susan Calvin's behavior:
Thou shalt protect the robot with all thy might and all thy heart and all thy soul.
Thou shalt give passing consideration to a human being provided it interfereth not with the First and Second Laws.
In the John Barnes novel A Million Open Doors, a scene refers to an artificial intelligence unit being "Asimoved", indicating that the AI unit refused to aid a human's attempt to break the law (in this case an illegal data penetration of another computer).
Charles Stross's novel Saturn's Children features an android society based on the Three Laws of Robotics struggling to survive after an incident kills off all human life. The Three Laws are listed at the front of the novel.
The short story "Midnight in the Heart of Midlothian", published as part of Halo Evolutions, features an AI called Mo Ye who is explicitly shown to be bound by Asimov's Laws of Robotics. As such, she was unable to self-destruct to prevent the enemy from capturing the ship while a living human was on board. After the human's death at the hands of the aliens, she was able to self-destruct and prevent the aliens from discovering the location of Earth.
Jack Williamson wrote a short story entitled "With Folded Hands" (which later became the precursor of several other stories in the same universe) that addresses the question of what happens when "allow a human being to come to harm" is taken to the utmost. In the story, robots refuse to allow humans to engage in dangerous activities, starting with risky sports like skydiving and mountaineering and later progressing through driving one's own car and thence to eating less-than-optimally-healthy diets.
The long-running British SF television show Doctor Who featured a four-part serial in 1977 titled "The Robots of Death". The titular robots were controlled by three laws, taken almost verbatim from Asimov. The story plays out much like the Elijah Baley mysteries, in which a murder has been committed, and a robot seems to have been directly or indirectly involved (contrary to the requirements of three-law programming).
In the 1986 movie Aliens, the character Bishop (an android) says, "It is impossible for me to harm, or, by omission of action, allow to be harmed, a human being."
In the 1984 movie Repo Man, the character Bud talks about the "Repo Code", a parody of the Three Laws. "I shall not cause harm to any vehicle nor the personal contents thereof. Nor through inaction let that vehicle or the personal contents thereof come to harm..." The character J. Frank Parnell in the movie also resembles Asimov.
In the 1987 film RoboCop, the cyborg police officer had been programmed to follow four prime directives, which guided and limited his actions, though not always in a positive manner:
Serve the public trust
Protect the innocent
Uphold the law
Classified (eventually revealed to be "Any attempt to arrest a senior OCP officer results in shutdown")
In the Star Trek: The Next Generation episode "The Measure of a Man", which aired on 13 February 1989, Lieutenant Commander Data, an android, has his sentience, and consequently his rights as an individual, challenged. Throughout the run of the series, Data is identified as a positronic android, and Asimov is even mentioned by name in the episode "Datalore". While Data routinely references his "ethical subroutines", indicating some sort of moral guide, it is unclear whether Data is explicitly bound by the Three Laws. For example, the episode "Clues" explored Data's capacity to lie to the crew in order to protect them from aliens, and the episode "The Most Toys" explored Data's supposed inability to murder in cold blood.
In The Simpsons episode "I, D'oh-Bot", which first aired 11 January 2004, Professor Frink builds a robot that obeys the First Law (and presumably the other two, although they do not directly come into play).
The 2009 anime series (released in 2010 as a film) Time of Eve is based heavily around the application of these rules to androids, which is frequently discussed by the characters. The series explores various interpretations of the laws (such as what it means to allow a human to come to harm), as well as loopholes that lead to one or more laws being broken under certain circumstances.
In 1999's Bicentennial Man, the theatrical version of Asimov's novella of the same name, Andrew Martin, the android played by Robin Williams, presents the Three Laws to its new family using a holographic display emanating from a projector in its head.
The 2004 theatrical version of I, Robot explores the Three Laws at length.
In the episode "Phoenix Rising" of Babylon 5, Alfred Bester telepathically programs Garibaldi, a human, with a variation of Asimov's laws of robotics, preventing Garibaldi from harming Bester.
In the episode "The Fuzzy Boots Corollary" of The Big Bang Theory, Wolowitz, Sheldon, and Raj discuss the possibility that Sheldon is a robot by questioning his history of following Asimov's laws of robotics. (Interestingly, they skip the Second Law, even though it might have ended their deliberations, since Sheldon does not have a history of following orders.) Despite being formulated in the 1950s, and despite remaining largely unimplemented by most workers in artificial intelligence and robotics, the Three Laws, as this reference demonstrates, continue to influence and hold sway over a wide variety of Asimov's intellectual descendants.
In the second season of the 1979 series Buck Rogers in the 25th Century, the two major robots in the series, Twiki and Crichton, cited the Three Laws when reactivated after repairs; they also recounted a brief history of the laws' origin.
In the 2009 adaptation of Astro Boy, every robot must obey the Three Laws except Zog, who was made 50 years before the rules became mandatory in every robot.
The 1995 graphic adventure game Robot City is largely based around the three laws. The main character, Derec, is stranded in a city built and inhabited by robots with the exception of two humans, one of whom has been murdered. Because of the first law, this leaves Derec as a prime suspect. In order to prove his innocence, he must find the culprit. He often uses the three laws to aid him in his investigation. For example, a robot might not let him into a certain area because it has been ordered not to (it is following the second law). However, if he tells the robot that the real murderer is chasing him then the robot will let him enter; failure to do so would constitute allowing harm to come to a human through inaction (the first law takes priority over the second).
The video games Mega Man 7 and Mega Man X reference Asimov. In Mega Man 7, Mega Man seemingly attempts to break the First Law in order to kill the mad scientist Dr. Wily. When Dr. Wily reminds him of the First Law, in the Japanese original, Mega Man obediently puts down his Buster and is silent; however, in the English version, he claims to be "more than a robot" and again attempts to kill Wily, but is thwarted by another robot, Bass. In Mega Man X, the First Law is mentioned in the opening cutscene. In Mega Man X6, one of the Maverick bosses, Shield Sheldon, was known to have taken his own life for failing in his purpose, which breaks the Third Law, though this is implied to be more out of shame than because of the laws.
In the role-playing game Paranoia, the robots are guided by a set of similar laws, except the rules stress the importance of The Computer. The laws are enforced by "asimov circuits"; bots whose circuits are malfunctioning (quite an ordinary condition) or removed (often by members of certain factions) are said to have "gone Frankenstein".
In the Halo video game series, a video states that human artificial intelligences must, in certain situations, obey "Asimov's Laws of Robotics", under which an A.I. cannot knowingly allow a human to come to harm, whether directly or indirectly.
In the game Portal 2, the player is told that all military androids in the game have "been given one copy of the laws of robotics... to share", and that if the player feels an android has infringed on their rights as described in the laws, it should be reported.
The about:robots page in Firefox quotes the First Law of Robotics: "Robots may not injure a human being or, through inaction, allow a human being to come to harm."
In the game Zero Escape: Virtue's Last Reward, a certain character presumed dead is found to be a robot who was ordered to act as close to a real human being as possible. The character reveals that they disobeyed orders by turning themselves back on in order to help the remaining participants; however, they were able to see several other murders develop through security cameras in the complex and did nothing to stop them, putting the character in direct violation of the Laws. After revealing this, they say goodbye and are deleted for their offense.
In the game Deus Ex: Human Revolution, the player can refuse to end a severely injured man's life, which prompts the man to sarcastically ask if the (cybernetically augmented) player character was programmed to follow Asimov's First Law.
In October 2001, in the final episode of the ABC Comics title Top Ten, a superhero parody written by Alan Moore and illustrated by Gene Ha and Zander Cannon, robot law enforcer Joe Pi asks his human co-workers while on a case if they have heard of Asimov's Laws of Robotics. When they reply that they have not, Joe Pi says "good" and proceeds to speak to Atoman through the intercom of his impregnable lab, subtly convincing the villain to take his own life rather than face trial, be stripped of his powers, and go to prison with revenge-seeking supervillains.
In December 2001, a parody of the three laws appeared in issue 4 of the PvP series published by Dork Storm Press. While Christmas shopping, Skull gets an overdose of sugar and, near-unconscious, walks into the mall Santa dressing room. There he finds, posted on the wall, the three laws of Mall-Mart Santas:
A Santa may not discourage a sale or, through inaction, allow a sale to be lost.
A Santa must obey the orders given it by management except where such orders would conflict with the first law.
A Santa must maintain that he IS Santa as long as doing so does not conflict with the first or second law.
Webcomic author R. Stevens populates his Diesel Sweeties series with a mixture of humans and robots, most of whom are continually violating the Three Laws. In particular, Red Robot C-63 follows a self-appointed mandate to "crush all hu-mans". In strip 688, he references the Three Laws explicitly: humans are "all like, 'if you cut me, do I not bleed?' And we're all like, 'not able to injure a human being or let them come to harm'. What a bunch of drippy-ass hypocrites!" (See also The Merchant of Venice.)
In April 2004, the comic strip Piled Higher and Deeper ran a series entitled "I, Grad Student". Cast as a never-before-seen Asimov short story, this series of strips features a robotic grad student whose "procrastronic brain" malfunctions, leading it to violate the "First Law of Graduatics". In full, these Laws are the following:
A grad student may not delete data, or, through inaction, allow data to be deleted.
A grad student must obey orders given by its advisor, unless such orders conflict with the First Law.
A grad student must protect its (insignificant) existence, as long as such protection does not conflict with the First or Second Law.
Later in the story, a Zeroth Law is introduced: "A grad student may not harm its advisor's ego, or through inaction, allow that ego to come to harm." The strips feature a character named Susan Calvin, and their visual style parodies the I, Robot movie released that summer.
In a 2005 comic of Questionable Content, Marten mentions that the inability to resist duct tape was originally the fourth law of robotics, and that Asimov's publisher forced him to change it.
In August 2010, Freefall introduced a "negative one" law: "A robot shall take no action, nor allow other robots to take action, that may result in the parent company being sued."
In the 2011 Mega Man series by Archie Comics, robots are initially unable to take any action against an anti-robotic terrorist group called the Emerald Spears because they do not pose any danger to humans. When their leader orders them to attack humans, however, they begin to fight back (reasoning that inaction would cause harm to a greater number of people).
Hawkwind's song "Robot", from the 1979 spacerock album PXR5, presents a dystopian interpretation of modern society as overly reliant on technology, likening people to robots and saying "You'd hold the whole world in your metal claws / if it wasn't for the Three Laws".