Is Superintelligent AI an Existential Risk? Nick Bostrom on ASI
Artificial superintelligence, or superintelligence in general, is a hypothetical agent that possesses intelligence far surpassing that of the brightest and most gifted human minds. According to the most popular version of the singularity hypothesis, called the intelligence explosion, an upgradable intelligent agent will eventually enter a runaway reaction of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an explosion in intelligence and resulting in a powerful superintelligence that qualitatively far surpasses all human intelligence. I. J. Good's intelligence explosion model predicts that a future superintelligence will trigger a singularity.

Four polls of AI researchers, conducted in 2012 and 2013 by Nick Bostrom and Vincent C. Müller, suggested a median probability estimate of 50% that artificial general intelligence (AGI) would be developed by 2040 to 2050. Public figures such as Stephen Hawking and Elon Musk have expressed concern that full artificial intelligence could result in human extinction.