Over the truly long term, our posthuman descendants will be not merely second-generation intelligences but thousand-generation or million-generation ones. He quoted Darwin's observation that no species can transmit its likeness unaltered into the distant future; a billion years of biological evolution took us from bugs to humans, and technological evolution runs far faster than biological evolution. Our distant descendants will be not just strange but completely alien to us.

According to Rees, we have not only unprecedented opportunity but unprecedented responsibility. If the technologies we are building carry, for the first time in history, a high chance of causing civilization-wide catastrophe, then we are responsible for actively preventing that outcome, not merely for predicting or understanding it.
Link.
2 comments:
If the future is likely to be utterly alien to us, why bother to protect it?
I prefer the idea of shaping the future so that it is not utterly alien, but so that it conforms to our idea of the value-able. Then we have an incentive to actually protect it.
Totally agree with Roko.