
The Singularity is Near

by Ray Kurzweil

Rating: 8/10

Summary

We are on a path of exponential growth in technology. At some point this growth will be so fast that there will be extreme advances every day, and eventually every hour. That point coincides with the invention of general AI: once computers can do inventive work as well as humans can, it's just a matter of scaling up the computers, which can be done extremely fast.

Ray Kurzweil names this point in time "The Singularity". And he thinks it will happen soon – 2045 or even earlier.

Main Ideas

There will be a point in the future where humans create a machine that undergoes an exponential takeoff in intelligence. After that point the world will change so rapidly that even a few days or months after the invention, we might not recognize anything anymore.

Once a superintelligent general AI exists, the future becomes essentially unpredictable. Just as we can't predict what happens at the center of a black hole (which physics calls a singularity), we can't predict what the world will look like after the invention of that AI. Hence that point in time is just like a singularity in physics: it defies our characterization, and we can't understand what it's going to be like once we are there. Yet we are hurtling towards it at ever accelerating speed.

We are on an exponential trajectory of technological development. AI is just the latest step in a progression that started long ago: agriculture, cities, states, science, the industrial revolution, computers, rockets, nanotechnology, robotics and finally the advent of true AI. All of them plot nicely on an exponential curve. Everything is accelerating, faster and faster, running up the slope of that curve. We are at the "knee" of the curve, where progress seems to explode.
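To make the "knee of the curve" intuition concrete, here is a minimal sketch (my own illustration, not from the book) of a quantity that doubles every period. The absolute gain in the final period alone exceeds all previous gains combined, which is why exponential progress feels like an explosion to anyone standing near the end of the plotted range.

```python
# Illustration only (not from the book): a quantity that doubles every
# period grows slowly in absolute terms at first, then explodes.
def exponential_growth(start=1.0, doubling_periods=20):
    return [start * 2**n for n in range(doubling_periods + 1)]

values = exponential_growth()
for n, v in enumerate(values):
    gain = v - values[n - 1] if n > 0 else 0
    print(f"period {n:2d}: value={v:12.0f}  gain since last period={gain:12.0f}")

# The gain in the last period (2^19 here) is larger than the sum of all
# earlier gains (2^19 - 1), so on a linear plot the curve looks flat for a
# long time and then "explodes" -- the "knee" is wherever the observer
# happens to be standing.
```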

There is an ethical idea in the book as well – namely that this technological progress is good, one could even say epic. It's the greatest and most impactful thing humans will ever do.

We should therefore accelerate it as best we can, while minimizing the risk of bad outcomes. The more powerful a technology is, the more effectively it can be used to wipe out everything. Technology always has this flip side of being potentially catastrophic.

It's just like Spider-Man said:

With great power comes great responsibility.

Even with only nuclear weapons, we could easily wipe most life from planet Earth. However, nuclear weapons pale in comparison to the weapons one could build with access to advanced robotics, nanotechnology or general AI. The weapons that could be designed by a superintelligent general AI are simply beyond our imagination. Using quantum physics to somehow "delete" the universe? The point is that the Singularity is unpredictable; we know next to nothing about how physics really works. An AI would find out, and could use that knowledge to build all kinds of truly weird stuff. Maybe there are glitches in physics? Who knows?

But in general there is a hierarchy between these three: first comes robotics, then nanotechnology, which is finally superseded by true general AI. Each succeeds the previous one in power, and solving one problem domain also solves the ones below it by default. Disasters at each level can only be contained by the level above.

All in all, this book is about how the future has the potential to be infinitely better than what we have right now: extreme material abundance, changing the definition of what it means to be human, discovering all or most of the secrets of the universe. Technology will help us get there, but we have to be careful not to wipe ourselves out along the way. So let's make the best of it; we only get one shot.
