Thread: Killer robots

  1. #1 Killer robots 
    PORCUS MAXIMUS Rockntractor
    Join Date
    Apr 2009
    Location
    oklahoma
    Posts
    41,841
    Killer robots can be taught ethics
    by Mark Rutherford



    Adherence to the Three Laws of Robotics as put forth by Isaac Asimov has been, until now, entrusted to whoever held the joystick. That may change.

    A robotics engineer at the Georgia Institute of Technology has developed an "ethical governor," which could be used to program military robots to act ethically when deciding when, and whom, to shoot or bomb.

    Ron Arkin has demonstrated the system using attack UAVs and actual battlefield scenarios and maps from recent U.S. military campaigns in Afghanistan.

    In one scenario, a drone spots Taliban soldiers, but holds its fire because they're in a cemetery--fighting there is against international law.

    In another, the UAV identifies an enemy convoy close to a hospital but limits itself to shooting up the vehicles so as to avoid collateral damage to the hospital. The mindful bot would also house a built-in "guilt system," which would force it to behave more cautiously after making a mistake.
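    Purely as an illustration of the idea, a rule-based governor with protected zones and an accumulating "guilt" value might look like the sketch below. The zone names, thresholds, and guilt scaling are assumptions made up for this example, not Arkin's published architecture.

    from dataclasses import dataclass

    # Toy "ethical governor": hypothetical zone names, thresholds, and guilt
    # scaling -- illustrative assumptions only, not the published design.
    PROTECTED_ZONES = {"cemetery", "hospital"}

    @dataclass
    class EthicalGovernor:
        guilt: float = 0.0  # grows after mistakes; a guiltier governor is more cautious

        def permit_engagement(self, zone: str, expected_collateral: float) -> bool:
            """Allow firing only if no rule in this toy model is violated."""
            if zone in PROTECTED_ZONES:
                return False  # e.g. hold fire in a cemetery
            # tolerate less collateral risk as guilt accumulates
            max_collateral = max(0.0, 0.5 - 0.4 * self.guilt)
            return expected_collateral <= max_collateral

        def record_mistake(self, severity: float) -> None:
            """Accumulate guilt after a misjudged engagement, tightening later checks."""
            self.guilt = min(1.0, self.guilt + severity)

    # Example: an open-road convoy can be engaged while collateral risk is low,
    # but not after a serious mistake, and never inside a protected zone.
    gov = EthicalGovernor()
    print(gov.permit_engagement("open_road", 0.2))  # True
    gov.record_mistake(0.8)
    print(gov.permit_engagement("open_road", 0.2))  # False
    print(gov.permit_engagement("cemetery", 0.0))   # False

    The only point of the guilt term here is that each recorded mistake lowers the amount of collateral risk the governor will tolerate in later engagements.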

    While the work shows promise, it also draws attention to the inadequacy of trying to program machines with morals, especially ones expected to perform in a complex battlefield environment, according to experts.

    "Robots don't get angry or seek revenge but they don't have sympathy or empathy either," Noel Sharkey, a roboticist at Sheffield University, U.K., told New Scientist. "Strict rules require an absolutist view of ethics, rather than a human understanding of different circumstances and their consequences."

    Arkin acknowledges that it may take a while before we can trust Predators and other unmanned killers with life-and-death decisions.

    "These ideas will not be used tomorrow, but in the war after next, and in very constrained situations." Arkin is quoted in New Scientist. "The most important outcome of my research is not the architecture, but the discussion that it stimulates."
    http://news.cnet.com/cutting-edge/
    It is too bad Democrats can't be taught ethics too!

  2. #2  
    Senior Member
    Join Date
    Apr 2009
    Posts
    1,000
    Quote Originally Posted by Rockntractor
    http://news.cnet.com/cutting-edge/
    It is too bad Democrats can't be taught ethics too!
    Might be able to implant that governor in their heads!

  3. #3  
    An Adversary of Linda #'s
    Join Date
    Aug 2005
    Posts
    22,891
    Quote Originally Posted by Rockntractor
    http://news.cnet.com/cutting-edge/
    It is too bad Democrats can't be taught ethics too!
    At best it's a set of basic routines that can be added to the portable robotic operating system's core kernel to keep them from going into a frenzied battlefield kill mode. If they are to be used unattended in battle, the Geneva conventioneers will go nuts in their attempts to keep warfare civil and PC! To be useful, real battlefield robots must be isolated and confined to a very limited hotspot area, used only to save human casualties. Armored entry into traps like wired houses or caves is ideal for these battle robots.
