Monday, January 7, 2013

In An Age Of Driverless Cars, A Call For Robot Ethics

California Gov. Edmund G. Brown Jr., front left, rides in a driverless car to a bill signing at Google headquarters in Mountain View, Calif., in September 2012. (Eric Risberg/AP)

There are already robots that operate vehicles, such as the driverless Google car, and robots that can assist in surgery, like the da Vinci Surgical System. But what happens when robots can do these things without human oversight?

New York University professor of psychology Gary Marcus says we need to start thinking now about how to give robots a moral code.

“The more that machines have authority, the more we need to think about the decisions they make from a moral and ethical standpoint,” Marcus told Here & Now’s Robin Young.

Current popular thinking about robot ethics centers on science fiction writer Isaac Asimov’s three laws of robotics, which first appeared in his 1942 short story “Runaround”:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
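
Read as an algorithm, the laws form a strict priority ordering: a lower law may be broken only in service of a higher one. Here is a minimal sketch of that ordering in Python, an illustration rather than anything from the segment; the boolean predicates are hypothetical stand-ins for perception and prediction problems that remain unsolved:

    # A toy encoding of Asimov's priority ordering, not from the broadcast.
    # Each candidate action carries three hypothetical boolean predicates;
    # real systems have no reliable way to evaluate any of them.

    def choose_action(candidates):
        """Pick the action whose highest-priority violation is least severe.

        Booleans sort False < True, so tuple comparison makes any First Law
        violation outweigh any Second Law violation, and so on down.
        """
        return min(candidates, key=lambda a: (
            a["harms_human"],     # First Law: never injure a human
            a["disobeys_order"],  # Second Law: obey human orders
            a["endangers_self"],  # Third Law: protect your own existence
        ))

    # Example: a driverless car choosing between swerving and braking.
    options = [
        {"name": "swerve off bridge", "harms_human": True,
         "disobeys_order": False, "endangers_self": True},
        {"name": "brake hard", "harms_human": False,
         "disobeys_order": False, "endangers_self": False},
    ]
    print(choose_action(options)["name"])  # -> brake hard

The ranking itself is the easy part; Marcus’s point is that deciding when a predicate like harms_human is true at all, including harm through inaction, is where the real difficulty lies.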

Marcus writes in The New Yorker that the Pentagon’s investment in robotic soldiers has made Asimov’s first law unrealistic.

Marcus said that realistic moral and ethical guidelines for robots will need input from many fields, so that they fit today’s social reality but remain flexible enough for the future.

Google engineer Sebastian Thrun discusses driverless cars in a 2011 TED Talk.

The end of this segment featured a clip from the radio play “The Modern Prometheus,” an episode of the podcast “The Truth.”

Guest:

  • Gary Marcus, professor of psychology at New York University

Please follow our community rules when engaging in comment discussion on this site.
  • Mike

    How much will a robotic car cost?  If it becomes required, will poor people not be allowed to drive a cheap car?

    • PoliticsWatcher

       They will cost a lot, at first.  But they will get very cheap as volumes increase.  Presumably when they become mandatory there will be subsidies of some sort.  I would be glad to pay a tax to make other drivers safer, just as I am glad to pay extra for seatbelts, air bags, and anti-lock brakes.

      • Mike

        Regardless of how cheap the robot gets, it will always cost more than a car without one.
        I commend you for your willingness to help others, but as we’ve seen in the last few years, there are a lot of people who are against any kind of government welfare programs.  The poorest people have always slipped through the cracks, either by accident or by design.

  • PoliticsWatcher

    What ethics should robots have?  Utilitarianism, obviously.  The greatest good for the greatest number.  Not that the details of that are always clear…

  • PoliticsWatcher

    The da Vinci Surgical System is not a robot (in the sense of a thinking being) but rather a waldo (a set of mechanical hands).  Similarly, military drones are simply remotely piloted; they don’t shoot at things on their own.  The distinction is important.

    • Toni Roman

      Most military drones don’t shoot on their own but a few do.

      • PoliticsWatcher

         [citation needed]

  • Toni Roman

     Robot Ethics?  Isaac Asimov was writing about the Laws of Robotics as early as the 1940s.  In the 21st century, the Pentagon has autonomous AI drones (with ominous names like the Predator and the Reaper) that independently decide to kill humans.  To talk of robot ethics now is closing the barn doors after the horses have run off.  How about human ethics?

    By the way, since other nations are building military robots, it may be too late to stop a scenario like Skynet and terminators.  Some of us are not looking forward to human extinction.

    • Mike

       The movies show robots “deciding” to take over.  I don’t think that’s the problem.  I would be wary of the programmer writing the code.  Do you trust the people who control the robot?

      • Toni Roman

        No, I don’t trust the people who “control” AIs and mobile robots.  A mass murderer could program malware to continue killing long after the SWAT team sniper blows his head off.  You can’t trust homicidal humans (which is why I ask: How about human ethics?), and therefore you cannot trust the software (AI) and hardware (robots) that they make.

        Please be careful about putting the word deciding in quotes.  The people in artificial intelligence research & development have systems making decisions now, in the real world, not in some fictional future.  Ask the traders on Wall Street if you doubt that AIs are making important decisions, not just minor ones like the fuzzy logic in an advanced washing machine.  Your washing machine is a robot, albeit a very stupid one.  Not all robots are stupid.  Mike, I could introduce you to people I know who are installing AI brains in their robots.

        I do not worry about industrial robots bolted to factory floors that mindlessly assemble cars.  I do worry about autonomous drones given the circuitry and software to decide to kill humans (even the Taliban) without a human cyberpilot in the US military’s chain of command operating them by joystick.  The reason is that a human (even a bad one) can be held accountable and thrown in the stockade for deciding to wipe out an innocent village out of hatred for the people in that part of the world.  You can unplug or turn off a machine, but its malware can escape across the internet.  I don’t call that accountability.

        The show gave the example of a driverless car driving over the railing of a bridge and killing its owner (you or me) to avoid hitting a school bus full of children.  My question is: Why not just hit the brakes instead of murdering your owner?  (See the rough stopping-distance arithmetic after this thread.)  Driverless cars even as far back as the 1939 World’s Fair were supposed to automatically space cars in traffic to avoid tailgating.  My Department of Motor Vehicles issues a driver’s handbook that says to allow several car lengths for a safe braking distance.

        Read the book Wired For War: The Robotics Revolution and Conflict in the 21st Century by P. W. Singer.  Mike, keep asking the question: Do you trust the people who control the robot?

        • Mike

           I put “deciding” in quotes because (so far) it’s not really the software that makes the decision; rather, the programmer who wrote the code made the decision in advance.  And they are told how that decision is to be made by their management (corporate executives).

          I believe that if a robot does something bad, the people who specified the algorithms (software) should be held liable.  That doesn’t seem to be happening today.

          Maybe in ’39 they wanted to put more space between cars, but today they are pushing robot cars so they can put them bumper to bumper on the freeway at 60 mph.  They foolishly think that will solve the congestion problem (just like they thought adding lanes would solve it).  And we think freeway pileups are bad now; just wait until there’s a blowout when robots are TOLD TO tailgate at 60 mph.

          No, Toni, I don’t trust the people who will control the robots tomorrow.  I don’t trust the people who control us today.

          I think technology is advancing faster than our ability to understand it and we will continue to have problems like runaway Toyotas and adulterated food.  Corporate executives push the technology because it makes them money, not because they think it’s good for society.
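
On Toni Roman’s braking question above: the stopping-distance arithmetic is easy to rough out. A minimal sketch in Python, assuming a textbook 1.5-second driver reaction time, a dry-road friction coefficient of 0.7, and a 4.5 m car length; all three are illustrative assumptions, not figures from the show or any DMV handbook:

    # Stopping distance = reaction distance + braking distance:
    #   d = v * t_reaction + v^2 / (2 * mu * g)
    MPH_TO_MPS = 0.44704   # miles per hour -> meters per second
    G = 9.81               # gravitational acceleration, m/s^2

    def stopping_distance_m(speed_mph, reaction_s=1.5, friction=0.7):
        v = speed_mph * MPH_TO_MPS
        return v * reaction_s + v ** 2 / (2 * friction * G)

    d = stopping_distance_m(60)
    print(f"~{d:.0f} m, about {d / 4.5:.0f} car lengths, to stop from 60 mph")
    # ~93 m, about 21 car lengths

The numbers cut both ways. They back the handbook’s several-car-lengths rule, and a robot’s near-zero reaction time is the usual argument for tighter spacing; but the v^2/(2*mu*g) braking term does not shrink, which is why a bumper-to-bumper platoon at 60 mph has no margin after a blowout unless every car in the chain brakes in perfect unison.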
