Joanna Bryson isn't a fan of companies that can't be adults and hold themselves accountable. Too many tech companies, she argues, think that they're above the law, that they should create whatever they want, no matter who it hurts, and that society can pick up the pieces later. This libertarian attitude might be fine in theory, or if you're a college philosophy major. But if you're a major company releasing something like unmanned flying machine guns upon the world, perhaps there should be some oversight. Tech companies, she argues, could potentially create something catastrophic that they can't take back, which is why regulation of these tech behemoths is needed now more than ever.

Read more at BigThink.com: http://bigthink.com/videos/joanna-bryson-creation-without-consequence-how-silicon-valley-made-a-hot-mess-of-progress

Follow Big Think here:
YouTube: http://goo.gl/CPTsV5
Facebook: https://www.facebook.com/BigThinkdotcom
Twitter: https://twitter.com/bigthink

If we're coding AI and we understand that there are moral consequences, does that mean the programmer has to understand them? It isn't only the programmer, although we do think we need to train programmers to watch out for these kinds of situations, to know how to blow the whistle and when to blow the whistle. There is a problem of people being over-reactive, and that costs companies, and I understand that. But we also have sort of a Nuremberg situation: we need everybody to be responsible. Ultimately, though, it isn't just about the programmers. The programmers work within the context of a company, and the companies work within the context of regulation, so it is about the law, it's about society.

One of the papers that came out in 2017, by Professor Alan Winfield, was about how, if legislatures can't be expected to keep up with the pace of technological change, what they can keep up with is which professional societies they trust. They already do this in various disciplines; it's just new for AI. You say you have to meet the moral standards of at least one professional organization, and that organization gives its rules about what's okay. That allows you a kind of loose coupling, because it's wrong for professional organizations to enforce the law, to go after people, to sue them, whatever. That's not what professional organizations are for. What professional organizations are for is to keep up with their own field and to set things like codes of conduct. So that's why you want to bring those two things together, the executive government and the professional organizations, and you can have the legislature join the two.

This is what I'm working hard to keep in the regulations: that it's always people in organizations who are accountable, so that they will be motivated to make sure they can demonstrate they followed due process, both the people who are operating the AI and the people who developed it. It's like a car: when there's a car accident, normally the driver is at fault; sometimes the person they hit is at fault because they did something completely unpredictable. But sometimes the manufacturer did something wrong with the brakes, and that's a real problem. So we need to be able to show that the manufacturer followed good practice and it really is the fault of the driver. Or sometimes there really isn't a fact of the matter, it was an unforeseeable thing at the time, but of course now that it's happened, in the future we'll be more careful.
That just happened recently in Europe. There was a case where somebody was driving, and it wasn't a totally driverless car, but I guess it had cruise control or something, some extra AI, and unfortunately the driver had a stroke. Now, what happens a lot, and what automobile manufacturers have to look out for, is falling asleep at the wheel, but this guy had a stroke, which is different from falling asleep. So he was still sort of holding on, semi in control, but he couldn't see anything, and he hit a family and killed two of the three of them. The survivor was the father, and he said he wasn't happy only to get money from the insurance or the liability; he wanted to know that whoever had caused this accident was being held accountable.
