Transcript
00:00 I'd like to ask you then, Dr. Yampolskiy: what reason do you think a sentient AI would have to wipe out the human race? Why would it want to?
00:13 Right, so that's a wonderful question. And right now what you're doing is asking me what I can come up with as a human with a certain level of IQ. You can ask me, how would it do it? How would it kill everyone when it doesn't even have hands? And I can give you all the standard answers about synthetic biology, nanobots, and hiring humans on the internet with Bitcoin. But the interesting thing is that you cannot predict what a smarter agent will do. I have no idea what game-theoretic reasons a superintelligence would have to do this.
00:42 Maybe it doesn't want us to create a competing superintelligence. It was suggested that we create multiple AIs and have them compete, the good AIs against the bad AIs; maybe we end up as casualties, collateral damage, in that AI war. Maybe it's beneficial to do it to obtain certain resources. Maybe it's really into climate change and wants to make sure we don't pollute. I have no idea what the actual reason would be or how it's going to do it.
01:11 What I'm saying is that no one can guarantee what a smarter-than-you system will do. We don't have good ways of explaining how these systems arrive at certain decisions, and we cannot predict specific decisions ahead of time.
