How will AI destroy the world?

May 08, 2023

Seriously, how? Nobody seems able or willing to dig deeper into how the development of these programs could actually bring about the apocalypse. Here is a list of things that humans have catastrophically failed to predict about the onset of AI:

  • Math, physics, and other formal logic systems would be its main domain of expertise
  • Programming would be one of the last domains it would touch (ironic, considering point number 1)
  • Creative endeavors such as writing, art, and music would be flat-out impossible

Some people look at this in fear that we won’t be able to predict what AIs can do until it’s too late. Their solution? Proceed to make outlandish predictions in ever greater quantity. It’s ridiculous to think that anything useful will come out of this mental masturbation around apocalypse porn. You will not stop progress, you will not stop invention, and you will not stop the creative and curious minds of the world from building the future.

Let’s look at what AI is best at today: creative endeavors. They were among the first things AI was taught to do (note: taught, not learned; it only learned what humans told it to learn). This worked because if a couple of pixels are off in a 1920x1080 image, humans will not notice. There is room for error, and the consequences of a mistake are marginal. Those low stakes gave image generation the freedom of quick learning cycles.
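To put the “couple pixels off” claim in perspective, here is a quick back-of-envelope calculation (illustrative arithmetic only, not anything from a real model):

```python
# How large a fraction of a 1920x1080 image is "a couple pixels"?
width, height = 1920, 1080
total_pixels = width * height        # 2,073,600 pixels
wrong_pixels = 2                     # "a couple pixels off"
fraction = wrong_pixels / total_pixels

print(f"{fraction:.8f}")             # roughly one millionth of the image
```

At that scale, an error rate that would be disastrous in a safety-critical system is literally imperceptible to a human viewer.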

Now look at autonomous vehicles. A “wrong pixel” equivalent here could mean the difference between a life saved and a life lost. It takes much longer to get this right than an image generator, because we can’t just deploy it and see what happens. High consequences for error mean slow cycle times, and slow cycle times mean slow improvement. Anything AI can do that has significant consequences for humans moves slowly, and anywhere it is deployed, a human stays in the loop to verify that it doesn’t cause harm.

Anything big enough to threaten the destruction of civilization will move so slowly that it will be gobsmackingly obvious to any bystander that we need to guard against it (as with nuclear weapons), and there will be many opportunities to fix the problem before we hit rock bottom. The impression I get from public discourse nowadays is that this is a clash of cultures rather than an extinction-level event. Rome wasn’t built in a day, nor was it destroyed in a day.