The case for how and why AI might kill us all

20 points by ppjim 2 years ago | 10 comments
  • UncleEntity 2 years ago
    His solution of an outright ban, using force against belligerent countries if necessary, seems like a pretty good recipe for human extinction due to global thermonuclear war.
    • ChatPGT 2 years ago
      Many problems would be solved if we just turned off our screens...
      • than3 2 years ago
        Imagine AI being used to titrate profits out of the economic cycle: wealth becomes more and more concentrated, jobs become more and more scarce, and industry becomes more consolidated and centralized.

        Businesses which once stood steady as oak trees close their doors after various external crises and are acquired at firesale rates by competitors until only a few remain.

        What happens, generally, when people cannot find jobs and cannot afford food to survive?

        It won't happen overnight, and it won't be recognized as happening until it's too late; we are already at a point where it could be happening.

        Eventually the division of labor breaks down, and the economic cycle between product and factor markets stalls.

        Truly gifted people may withhold their talents if they aren't properly compensated, which is all but certain given how deceptive prices become, and because they will have been abused in various social ways by those less fortunate.

        This is only sped up by money printing because prices and the store of value become less rational as you print more and as you run into the economic calculation problem.

        Unrest generally occurs at some point, followed by a significant amount of death as systems we rely upon to survive fail.

        While many people in the lower class may be less educated, there are always at least some intelligent people in that mix who inherently pose a threat to those in power if they can organize. They may be targeted, poisoned, or disadvantaged by those at the top seeking to retain power in a cycle of escalation.

        Historically, this has been recognized as one of the many great dangers of printing money. The irrational and deluded think the solution is to print even more, in an ever-escalating cycle, without realizing (or while benefiting in other corrupt ways) that the core problem leads to only one outcome: a new beginning, assuming enough of us survive.

        • hackyhacky 2 years ago
          This has been discussed to death, but AI killing us is not the only danger. It could destroy society by making us fall in love with it.

          https://medium.com/@jbaxby/we-will-fall-in-love-with-ai-1869...

          [edited for clarity]

          • PaulHoule 2 years ago
            Reinforcement Learning from Human Feedback is a fancy way to say “tell people what they want to hear,” and there are few things more dangerous than that.

            Imagine a LessWrong fanatic who makes his own personal Eliezer Yudkowsky and gets driven (completely in private) to murder an A.I. researcher.

          • achikin 2 years ago
            Cell phones will melt my brain and computer games will turn me into a serial killer. Been there, seen that.
            • hazbot 2 years ago
              Previous doomsday predictions must be wrong; otherwise we wouldn’t be here to say they were wrong.

              But part of the reason the other doomsday scenarios turned out wrong is that people took them seriously and tried to mitigate the risks.

              • travisjungroth 2 years ago
                I don’t think there have been any other credible doomsday threats (as in human extinction) except for nuclear war. Maybe virus development.
              • flangola7 2 years ago
                Previous predictions were wrong, therefore all predictions will forever be wrong! /s

                I'm also not convinced that the carefully engineered dopamine drip of mobile scrolling isn't some type of brain melting.

                Sometimes we should spend less time thinking about whether we could and more time thinking about whether we should.

                • achikin 2 years ago
                  Current AI predictions are mostly based on The Matrix and Terminator.