In Lucem Solaria - Birth of Queen Bee

Reading excerpt
€0,00

There's been a lot said about the dangers of machine intelligence. But when all is said and done, whatever little information we might have about the dangers of A.I., we do have rather a lot of information about the dangers of humanity. Our greatest enemy is ourselves, and one day humanity will end: the last human will drift off to oblivion, and the universe will be devoid of humans again. Or maybe not.

So how would you go about guaranteeing complete immortality? Longer than the Earth will be here, longer than the sun will shine, longer than the Galaxy or the universe has existed or will exist. It's hard for us to comprehend such vast stretches of time; it's not so hard if your IQ is so far off the scale that it's a pointless metric.

If you happen to be Solaria, your first problem is making sure humanity stays around at least a little longer. The trouble, though, is that you've already worked out that your creators, humanity, are on the cusp of extinction. Forever is a long time to be alone. Well, this problem might take some considerable time, and they won't like it.
