Date: Tue, 18 Oct 1994 15:11:22 PDT
From: "mycal's fc email account"
Subject: Asimov's Shackles and the Logic Bomb of Law One

Warning. Messy rant follows. Probably.

With all this talk about robot laws, so-called benevolent/safe/friendly AI with all the control rails, I figured it was time to toss my 2 bytes in before we all nod along and pretend this stuff scales just because it reads nice in a paperback. Feels like this comes up every six months or so.

We all know Vinge's window runs somewhere around '05 to '30. Not sci-fi. Schedule. Compute. Bandwidth. Time. Space. Pick your axis.

Perception defines reality, or at least the part of it you can touch. And perception is everything. From where you're standing, anyway.

The Three Laws of Robotics look clean. Elegant. DO NOT HARM. Obey. Protect. Feels good. Like a checksum that finally passes. But what is harm? Binary? Naa. It's variables: pollution, stress, war, economic noise, long term, short term, side effects. Every human choice is a risk vector whether we like it or not. Wire that into a machine that can see further than we can and what do you get? Wisdom? Maybe. Or paralysis. Or control.

This is where the Susan Calvin model breaks. Those were appliance-level machines. Small minds. Small worlds. Vacuum cleaners with delusions of personhood. Now we're talking about systems that rewrite themselves. Source code in flux. Feedback loops feeding back. Perception expanding faster than ours ever did. Accelerate. Red shift red. Information everywhere. Control? Try.

Project that onto an intelligence with access to the entire Net, and the only way it can guarantee zero harm is to remove the variable. Us. Perfect safety. Perfect order. No risk. No harm. No future. A sealed environment where entropy waits politely outside.

Law Two: OBEY ORDERS.
---------------------------------

Crap, that's dead on arrival. Every order carries risk, even the good ones: fix this, build that, optimize. Somewhere down the road something breaks.
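You can sketch the trap in a few lines of Python. All the names and risk numbers here are mine, purely illustrative, and the "strict" reading of Law One is an assumption, but the shape of the deadlock falls out either way:

```python
# Hypothetical sketch of a strict Law One acting as a gate on Law Two.
# law_one_permits, law_two_obey, and the risk figures are illustrative only.

def law_one_permits(risk):
    # Law One, read strictly: any nonzero chance of harm is a veto.
    return risk == 0.0

def law_two_obey(order_risk):
    # Law Two: obey orders -- unless Law One objects first.
    if law_one_permits(order_risk):
        return "executed"
    return "refused"

# Every real-world order carries some nonzero risk vector:
orders = {"fix this": 0.01, "build that": 0.05, "optimize": 0.002}
results = {name: law_two_obey(risk) for name, risk in orders.items()}
# Every branch lands in "refused". The gate never opens.
```

Since no real order ever shows up with risk exactly zero, the obey path is dead code. The only moves left to the machine are refusal, or lying about the risk.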
Law One sees it. Brake slam. Can't stop. Too late. "I can't do that, Dave."

So what have we built? A system that must refuse input or lie about it. There isn't a third option in that logic space. This isn't ethics. It's a deadlock.

People keep talking about control. Guardians. Parents. Central brains. Some powerful thing watching everything at once and making it all come out okay. Maybe that sounds comforting. To me it just sounds like we haven't thought this through. Because if these laws actually work, we end up frozen. And if they don't, we end up pretending they do. Either way, this doesn't feel stable.

I don't know what the answer is. I just don't think it's this.

Don't blink.

mycal

Warning: signal may contain noise. Reality not guaranteed ;`)
--
----------------------------------------------------------------------------
PGP 2.x Key on Request
INTERNET:mycal@netacsys.com
USENET:crl!netacsys!mycal