The Singularity (The Other Exponential Function)
They say that machines will be as intelligent as humans, briefly.
I see this as a race between exponentials.
This is easy to dismiss with confirmation bias, but can we play a little make-believe game in which artificial intelligence gets the upper hand and solves the energy conundrum? And then what? The people who fret about these things say that there are no guarantees that the AI will be biddable. It could just as easily be fatal.
Ideally I would rather have things run by a competent machine than by an incompetent and indifferent bunch of academic fools.
If such a machine could be created (and controlled) and told to desire maximum utility for humans – I wonder what it would come up with?
Well, not the Fed!
Depends on what you mean by controlled, and who is at the controls.
If the AI is truly advanced enough to determine maximum utility for humans, it should be sufficiently advanced as to not require controls. I wouldn't give it the launch codes for nukes, but I would give it the ability to prevent them being used.
There's been so much sci-fi written on this premise; pick an author whose ending results in maximum utility rather than species annihilation and you might get some of the answer.
I'm just not that smart.