Wednesday, January 22, 2014

Re: Singularity Destruction of Mankind or Blessing to Humanity?

Micah wrote: "Sinjin wrote: '...such an entity will rapidly evolve a super-intelligence and, as part of its utility function, determine that we pose a threat, or are irrelevant and therefore useless to its needs.'

There's no way to predict the outcome unless you know what such an entity's needs actually are. Are its needs self-determined? Or are they a by-product of how it was created, intentionally or not? I could see such an entity just not giving a damn about us, then gobbling up all the processing power, networking, and communications infrastructure of the world for its own use, thus depriving us of those resources and driving us back into a pre-computer age. Kind of the 'humans gobble up the entire planet's ecosystem' scenario, to the detriment of all other flora and fauna. But then you may have limiting factors, such as whether this essentially software-based entity can continue existing without the physical means of maintaining, repairing, and expanding its hardware side. That is, does the AI possess any kind of physical agency that could circumvent the need for humans altogether? If not, you may end up with a symbiosis of sorts, where humans bargain to retain some processing, communication, and networking power in return for servicing the AI's physical resource needs. There are just too many imponderables without spelling out the initial assumptions about the AI's nature."

That seems to be the view shared by many AI researchers.
