A.I. Is Starting to Wear Down Democracy

13 points by mitchbob 4 days ago | 19 comments
  • Frieren 3 days ago
    Inequality is wearing down democracy. Money votes, and money is being concentrated in just a few hands.

    AI is just one means of lying to citizens, but long before AI existed, whenever power and money concentrated in a few hands, democracy and representation were at risk.

    Redistribute wealth, and corruption and manipulation will shrink at the same pace as inequality. It has happened before, and it will happen again.

    • javascriptpy 4 days ago
      I would counter with, “When any political faction (lib / cons) treats its moral convictions as self-evident global truths, and sympathetic media outlets and so-called journalists amplify that certainty, our democratic discourse erodes at a rate far greater than anything AI could do.”
      • JumpCrisscross 4 days ago
        That’s a criticism of direct democracy, not republics; it’s been a known failure mode (and base case) for millennia.

        The novel elements are social media and AI. I am increasingly convinced that ad-funded social media should be banned and/or tightly regulated like utilities are.

        • user32489318 3 days ago
          Ah yes, let’s replace all ad-funded social media with state/gvn’t funded social media. That will be much better.
          • JumpCrisscross 3 days ago
            > state/gvn’t funded social media

            Not how things that are "banned and/or tightly regulated like utilities are" are funded.

            • phorkyas82 3 days ago
              Well, I wouldn't entrust critical infrastructure to the private sector, which blindly optimizes shareholder value and sells your data anywhere. Germany's train privatization also comes to mind, or US healthcare: worse service for more money.
              • pickledoyster 3 days ago
                We pay for utilities directly. And even if it were funded from state budget(s), there are known and working controls that keep "publicly funded" from meaning "state controlled".
          • bayesianbot 3 days ago
            I thought democracy was doomed not long after I first learned about it, since I had already seen how 'well' people check that their beliefs match reality. Later I convinced myself I was wrong (partly to keep my sanity), that there must be some system keeping it stable. Nowadays it is clear there is no such system: the clown show of current global democracy is nothing like what I imagined. I expected smart, greedy leaders with complex lies that are hard to check, but instead people are being robbed, and the processes keeping them alive and fed are being destroyed, all in the open, with mostly no complex lies or smart leaders involved.

            It's hard to blame AI for something that was obvious three decades ago.

            • mitchbob 4 days ago
              • Yizahi 3 days ago
                Neural networks are basically leveling the political/media playing field downward, to the level of the worst available candidate. Say there is a regular, middle-ground politician and a morally corrupt one financed by a neighboring fascist regime. Normally the regular politician would campaign on being the better human and the better professional. Now, with the advent of generated fake content, his opponent (or the opponent's sponsors and fans) can flood the network with lies and misinformation, dragging the regular politician, in the minds of the voters, down to the corrupt populist's level. And the regular politician can't really respond, because a) he can't engage in spreading obvious lies, since that is contrary to his image, and b) creating positive lies would never work.

                Basically, since only negative lies work on the general population, and positive lies are impossible, negative lies will be generated about everyone above the currently shittiest politician in a race. And neural nets are a 100x power and speed amplifier for this.

                • exodust 3 days ago
                  Is it possible to develop a camera that takes pictures in a special format that records a cryptographic signature, some kind of ledger technology for images?

                  The idea is that the authenticity of images could be challenged. If in doubt, the original photographer or source can provide verification, automated or otherwise, that only superficial edits such as lighting or contrast changes have been made.

                  Copies of the image would contain this ledger or a link to it. In the future, maybe photos without this technology are never trusted as a source, or are disallowed on certain platforms.

                  So it's inverted: from "we must detect fakes" to "we must verify authenticity", the latter being easier to control.

                  I mean, nice things are under threat. There's no escaping it. If technology got us into this mess, technology can dig us out.
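                  The sign-at-capture idea could look roughly like this sketch (all names are hypothetical; real schemes such as C2PA use asymmetric signatures from a secure element rather than a shared secret, so the standard-library HMAC here only illustrates the shape of the scheme, not a deployable design):

```python
# Hypothetical sketch: a camera binds the pixels and its sensor
# metadata (e.g. a depth map) into one signature at capture time.
# HMAC with a "hardware" secret stands in for a real asymmetric
# signature; it is NOT how C2PA or any shipping camera works.
import hashlib
import hmac

CAMERA_SECRET = b"key-burned-into-camera-hardware"  # hypothetical

def sign_capture(image_bytes: bytes, sensor_metadata: bytes) -> str:
    """Bind image and sensor data together at capture time."""
    payload = hashlib.sha256(image_bytes + sensor_metadata).digest()
    return hmac.new(CAMERA_SECRET, payload, hashlib.sha256).hexdigest()

def verify_capture(image_bytes: bytes, sensor_metadata: bytes,
                   signature: str) -> bool:
    """Any later edit to the pixels or metadata breaks the signature."""
    expected = sign_capture(image_bytes, sensor_metadata)
    return hmac.compare_digest(expected, signature)

photo = b"\x89RAW-PIXELS..."            # stand-in for sensor output
depth = b"depth-map-of-a-3d-scene"      # stand-in for scene metadata
sig = sign_capture(photo, depth)

print(verify_capture(photo, depth, sig))         # True: untouched original
print(verify_capture(photo + b"!", depth, sig))  # False: pixels were edited
```

                  The interesting design question is exactly the one raised below: the signature proves the bytes came from that camera unmodified, not that the scene in front of the lens was real.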

                  • d3vnull 3 days ago
                    What's keeping anyone from using a trustworthy camera to take a trustworthy picture of a screen displaying an AI generated image?

                    And that's the low-tech version. Taking a trustworthy camera and feeding it through the CMOS sensor connector from a generative model that emits RAW sensor data isn't that hard if you want to make extra sure you're not detected.

                    Such a system might even cause more harm than good by giving people the impression that real/AI images can actually be certified.

                    • exodust 3 days ago
                      Because it will be a picture of a screen, not reality? The hypothetical camera knows whether the scene it's capturing is a flat surface, and puts that detail into the signature.
                    • Yizahi 3 days ago
                      I think it's in the works. Hard to say if it will make any impact globally:

                      https://asia.nikkei.com/Business/Technology/Nikon-Sony-and-C...

                      • exodust 3 days ago
                        Interesting, thanks for the link. I was thinking more of a way for the camera to analyse the scene independently of the lens, in order to create an unbreakable signature binding the image to the sensor data, so that later manipulation of the image breaks that bond. Or something.
                      • DEGoodmanWilson 3 days ago
                        This is, as others have pointed out, trivial to defeat. We should not be seeking engineering solutions to social problems.
                        • exodust 3 days ago
                          Is it really a social problem if AI is generating the images and spreading them around with bots? If no humans are involved except to knock the first dominoes over, it's a stretch to call the chain reaction a "social problem".
                        • pvdebbe 3 days ago
                          Already developed, and slowly taking effect. There's no need for special formats, just signing of the JPEGs and raw files: https://contentauthenticity.org/
                        • 123yawaworht456 3 days ago
                          current thing bad