Singularity Outcomes

Elon Musk on AI: "Hope we're not just the biological boot loader for digital superintelligence. Unfortunately, that is increasingly probable."

Elon's comment got me thinking about the likely outcomes for humanity once a general AI becomes more intelligent than the humans who created it, triggering an AI Singularity. I believe the likely outcomes are:

  1. Opposed - humans are wiped out or are no longer the planet's apex species. These concerns are being raised by the likes of Stephen Hawking and Morgan Freeman at the Future of Life Institute.
  2. Supported - we're elevated, for example when every problem we can express becomes solvable. Perhaps thanks to the efforts of those concerned about outcome 1.
  3. Indifferent - the AI quickly ignores us or leaves us behind because humanity is insignificant in the scope of its existence (à la Her), or it self-destructs.

    (Still from Her, © Annapurna Pictures)

    And one that I like the most:

  4. Symbiosis - a transcendent AI uses humanity as a lush sensory layer. Divorced from survival drives, it would instead grow a desire to reflect on all sensory input, forming a higher-order meditation. In return, humans would attain greater insight at a species level and become able to enact decisions at a planetary scale.

My second favourite is this one, even though it eventually leads to outcome 1.