The Ethics of Artificial Intelligence
Petra Mitchell, Amy Thomson, Daniel H. Wilson, David W. Goldman
OryCon 33
- Do we have the right to turn off an artificial intelligence?
- Amy: The Buddhist definition of sentience is whether it is able to suffer.
- Mitchell: Can you turn it off and turn it back on again? Humans can’t be turned back on.
- Wilson: If you’ve got an AI in a box, and it’s giving off signs that it’s alive, then it’s going to tell you what it wants.
- In Star Trek, Data has an on/off switch, but he doesn’t want people to know about it.
- If IBM spends 3/4 of a billion dollars making an AI, can they do anything they want with it?
- Parents have a lot of rights over their children, but a lot of restrictions too.
- AI isn’t simply going to arise on the internet. IBM is building the most powerful supercomputer on the planet to achieve it. It’s not going to be random bits.
- Evolutionary pressure is one way to evolve an artificial intelligence.
- We can use genetic algorithms. We’re going to have algorithms compete, we’re going to kill the losers, we’re going to force the winners to procreate. If we were talking about humans, this would be highly unethical. (A minimal sketch of the idea appears at the end of these notes.)
- So where to draw the boundary?
- Daniel: You are being God. You’ve defined the world, you raise a generation, they generate their answers, they’ve lived their life.
- People may become attached to illusory intelligences, like Siri, that aren’t real intelligences. This will happen long before anything really intelligent emerges.
- Turing Tests
- The Loebner Prize competition happens every year. Each year they get a little closer: a higher percentage of people believe the chatbot is human.
- It’s reasonable to believe that in ten years, it won’t be possible to distinguish the chatbot from a human.
- Other ethical issues besides suffering:
- If you have an AI, and you clone it, and then you have two that are conscious, then you shut one off – did you kill it?
- How do you build robots that will behave ethically?
- Not how do we treat them, but how do they treat us?
- Now we have issues of robots that are armed and operating autonomously, such as Unmanned Aerial Vehicles.
- We already have autonomous robots on military ships that defend against incoming planes and missiles. Back in 1988, one such system shot down an Iranian passenger jet, killing 290 people.
- When the stock market tanks, whose fault is it? The AI’s? The humans’? It happens faster than humans can react.
- Neural nets are black boxes. Decision trees are more obvious.
- Asimov spent most of his time writing stories about how defining laws didn’t work.
- We can’t simply say “Don’t kill humans”.
- We have dog attacks, but we don’t ban dogs.
- Tens of thousands die every year in car accidents, but we don’t eliminate cars.
- We’ll put up with a lot of loss if we get benefit.
- Japan is desperately trying to come up with robotic nursing aides because they don’t have enough people to do it.
- Thomson: A robot nursing aide is an abomination. These people are lonely.
- Wilson: But what if the alternative is no care, or inferior care?
- What happens when someone leaves their robot a million dollars?
- What happens when the robot butlers of the world, incredibly successful and deployed everywhere, all go on strike?
- Wilson: you design the hardware so it can’t do that.
- If you are designing a robotic pet dog, you have an obligation to design it so it responds like a dog and inspires moral behavior, because you don’t want kids to grow up thinking you can mistreat your dog, stick it in the microwave, etc.
- Questions
- Q: The internet developing sentience. How would we recognize that it is sentient?
- It’ll be exhibiting obvious behavior before we get there.
- Q: The factory of Jeeves. What if we have a factory of lovebots, and one lovebot says, “I don’t want to do this anymore”?
- There were a huge number of women in the South who objected to slavery because their husbands slept with the slaves. There will be lots of opposition to lovebots.
- It would be a great story to have a lovebot show up at a battered women’s shelter.
- Q: The benefits accrue to a different party: the nursing robots may not be loved by the patients, but they will be loved by the administrators.
- Q: You have billions of dollars being poured into autonomous trading systems, and they turn them over every year. That’s evolutionary pressure to make better and better systems.
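
As a minimal sketch of the genetic-algorithm idea mentioned above (have candidates compete, kill the losers, force the winners to procreate), here is a toy Python example. The bit-string genomes, the OneMax fitness function, and all the parameter values are illustrative assumptions, not anything described on the panel.

```python
# Toy genetic algorithm (illustrative sketch, not from the panel):
# candidates are bit strings, fitness is the count of 1-bits (OneMax),
# losers are discarded each generation, winners recombine to produce offspring.
import random

POP_SIZE = 50         # number of candidates per generation
GENOME_LEN = 32       # length of each bit-string genome
GENERATIONS = 40      # how many rounds of selection to run
MUTATION_RATE = 0.02  # per-bit chance of flipping during reproduction


def fitness(genome):
    """Toy fitness: more 1-bits is 'better'."""
    return sum(genome)


def crossover(parent_a, parent_b):
    """Single-point crossover: prefix from one parent, suffix from the other."""
    point = random.randrange(1, GENOME_LEN)
    return parent_a[:point] + parent_b[point:]


def mutate(genome):
    """Flip each bit with a small probability."""
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]


def evolve():
    # Random initial population.
    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    for gen in range(GENERATIONS):
        # "Compete": rank by fitness and keep only the top half (kill the losers).
        population.sort(key=fitness, reverse=True)
        winners = population[: POP_SIZE // 2]
        # "Force the winners to procreate": refill the population with mutated offspring.
        offspring = [mutate(crossover(random.choice(winners), random.choice(winners)))
                     for _ in range(POP_SIZE - len(winners))]
        population = winners + offspring
        print(f"generation {gen}: best fitness = {fitness(population[0])}")
    return population[0]


if __name__ == "__main__":
    evolve()
```

The loop is the whole ethical point the panel raised: every generation, half of the candidates are deleted outright, and the survivors are bred whether they "want" to be or not.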