Great Algorithms that Revolutionized Computing

78 points by era86 11 years ago | 29 comments
  • avmich 11 years ago
    Hash tables were highly praised by Knuth as "the invention of the year". If we have binary search in the list, we should have hashes.
    • gamegoblin 11 years ago
      This is a pretty bad list. To say that the fast inverse square root trick revolutionized computing is a joke. I doubt there's anyone on HN who was unaware of any of these algorithms or who couldn't name more "revolutionary" ones.

      No FFT? No simplex algorithm? Hashing? Strassen's?

    • rumbler 11 years ago
      That inverse square root algorithm is a neat trick, but it did not revolutionize anything.

      In addition to several algorithms already mentioned, I feel that suffix trees and suffix array algorithms should be there as well. They are making all kinds of approximate searches feasible in bioinformatics.

      • mistercow 11 years ago
        I'm also not sure isqrt counts as a new algorithm. It's just Newton's method with a bit hack to exploit the floating point format and extract a good starting point.
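        For the curious, the whole trick fits in a few lines. Here's a Python sketch of the well-known Quake III version (the 0x5F3759DF constant and the single Newton step are from the original C code; in Python this is purely illustrative, not fast):

```python
import struct

def fast_inv_sqrt(x):
    # Reinterpret the float32 bits of x as a 32-bit integer (the bit hack).
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    # The magic constant manipulates the exponent/mantissa bits to get a
    # good first guess at x**-0.5.
    i = 0x5F3759DF - (i >> 1)
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # One iteration of Newton's method refines the guess.
    return y * (1.5 - 0.5 * x * y * y)
```

        Even with only one Newton iteration, the result is within about 0.2% of 1/sqrt(x).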
      • cliveowen 11 years ago
        I can't fathom why the Simplex algorithm wasn't included, it's considered, with reason, the most important algorithm of the 19th century.
        • gamegoblin 11 years ago
          Nitpick - Simplex is 20th century.

          A cool 19th century algorithm is Radix Sort, though.

          • eru 11 years ago
            Gauss elimination (for solving linear equation systems) might also be a good 19th century algorithm. (We have better ways, now.)
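            (For anyone who hasn't seen it spelled out, a toy Python sketch with partial pivoting; for real work you'd call a library, which is part of the "better ways now".)

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.
    A is a list of row lists, b a list; both are modified in place."""
    n = len(A)
    for col in range(n):
        # Pivot: swap in the row with the largest entry in this column,
        # which keeps the elimination numerically stable.
        p = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[p] = A[p], A[col]
        b[col], b[p] = b[p], b[col]
        # Eliminate the entries below the pivot.
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / A[r][r]
    return x
```

            E.g. solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]) recovers x = (0.8, 1.4).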
            • GFK_of_xmaspast 11 years ago
              I think it's sort of hard to think of 'best of the 19th century' because so many things are now considered so fundamental they're easy to forget, but I'd put linear regression in there.
              • toolslive 11 years ago
                Gaussian elimination was known to the Chinese (2nd century BC).
          • TTPrograms 11 years ago
            ...FFT?
            • btilly 11 years ago
              The Fast Fourier Transform absolutely deserves to be on the list.

              All of signal processing went from, "We'd like to do a Fourier Transform but can't afford O(n^2) so here is a faster alternative that kind of does something useful" to, "We start with a FFT."
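              The speedup comes from recursively splitting the transform into its even- and odd-indexed halves, which takes the naive O(n^2) down to O(n log n). A textbook radix-2 Cooley-Tukey sketch in Python (power-of-two lengths only; real code would use an FFT library):

```python
import cmath

def fft(x):
    """Recursive Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])   # FFT of even-indexed samples
    odd = fft(x[1::2])    # FFT of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor combines the two half-size transforms.
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out
```

              A constant signal puts all its energy in the DC bin: fft([1, 1, 1, 1]) gives [4, 0, 0, 0].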

              • tlarkworthy 11 years ago
                Yeah, a bit biased towards CS.

                Neural nets opened a lot of doors.

                The Bellman equation is in a lot of equipment.

                • ArbitraryLimits 11 years ago
                  Dijkstra's algorithm is close enough to the Bellman equation for programmers, I guess :)

                  Also I don't really think taking a Taylor series for the inverse square root should count as an "algorithm."
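                  (Close enough indeed: with non-negative edge weights, Dijkstra's greedy expansion order solves the Bellman optimality recursion in a single pass. A standard heap-based Python sketch, with a made-up toy graph:)

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source.
    graph maps node -> list of (neighbor, weight) pairs."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue  # stale heap entry, already found a shorter path
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

                  E.g. dijkstra({'a': [('b', 1), ('c', 4)], 'b': [('c', 2)]}, 'a') finds 'c' at distance 3 via 'b'.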

                  • nly 11 years ago
                    I think the reason the inverse square root function got so much coverage is the bithacking of the float format and the apparent WTFyness of the code. Also, didn't it use Newton's method, not a Taylor series expansion? My math at that level is stale.
                  • GFK_of_xmaspast 11 years ago
                    IIRC, neural nets were a real backwater until recently, when people figured out how to train them effectively.
                    • tlarkworthy 11 years ago
                      Not true. ANNs were one of the first general-purpose function approximators. Once people got their heads round ANNs properly, they were generalised to kernel methods. So modern machine learning often doesn't look like neural nets anymore, but the lineage from backpropagation is obvious.

                      Deep belief nets have made things that look like neural nets popular again, but things like sparse ICA stacks are suggestive of another wave of theory-led morphing back to non-neural architectures. But the point is that the theory followed the invention of neural architectures both times. So the root is the ANN, and thus I think it is a really important algorithm in history. It was a bio-inspired innovation too, which is interesting.

                  • fla 11 years ago
                    And Reed-Solomon error correction.
                    • mjcohen 11 years ago
                      Which seems to have been invented by Gauss.
                    • bbosh 11 years ago
                      I'm not sure you can call the Euclidean algorithm a "computer algorithm", it having been discovered some 2,000 years before computers existed.
                      • carlosvega 11 years ago
                        What about RSA?
                        • TrainedMonkey 11 years ago
                          Seems to be covered by public key cryptography.
                        • michaelochurch 11 years ago
                          Let me add: Least Angle Regression.

                          It's an efficient way of performing Lasso (L^1 penalization) on regression models, which has the benefit (in addition to reducing the risk of overfitting) of producing sparse models.
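                          LARS itself is fiddly to implement, but the sparsity effect of the L^1 penalty is easy to demonstrate with plain coordinate-descent Lasso (a toy sketch, not LARS, on made-up data: y depends only on the first feature, and the irrelevant coefficient gets driven exactly to zero):

```python
def lasso_cd(X, y, alpha, iters=100):
    """Lasso via coordinate descent with soft-thresholding.
    X is a list of rows, y the targets; features roughly standardized."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Correlation of feature j with the residual that excludes it.
            rho = sum(X[i][j] * (y[i] - sum(w[k] * X[i][k]
                      for k in range(p) if k != j)) for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            # Soft-thresholding: the L^1 penalty snaps small
            # coefficients exactly to zero -- hence the sparse models.
            if rho > alpha:
                w[j] = (rho - alpha) / z
            elif rho < -alpha:
                w[j] = (rho + alpha) / z
            else:
                w[j] = 0.0
    return w
```

                          With y = 2 * x0 plus a noisy, irrelevant second feature, the second coefficient comes out exactly 0.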

                          • GFK_of_xmaspast 11 years ago
                            That's a little specific.

                            SIAM put out a 'ten algorithms of the century' list https://www.siam.org/pdf/news/637.pdf a few years ago, and it's really tough to argue with six or seven or eight of them.

                            (MCMC, simplex, Krylov, Householder decomposition, QR, Quicksort, FFT, Ferguson&Forcade's integer relation stuff (that led to stuff like PSLQ), and fast multipole)

                            And FORTRAN.