Quantum Algorithms for Lattice Problems

233 points by trotro 1 year ago | 124 comments
  • j2kun 1 year ago
    I work on homomorphic encryption, and there are some rumors circulating that, if this checks out, it will break some of the leading FHE schemes like BFV, where the moduli used are quite large (in the hundreds of bits or even over a thousand bits).
    • ilya_m 1 year ago
      … only if scalable quantum computers exist.
      • warkdarrior 1 year ago
        If scalable quantum computers do not exist, we do not need PQC.
        • sgt101 1 year ago
          We need PQC about 20 years before practical, scalable gate quantum computers appear (if they can do all the right gates).

          I think that this will be signaled when someone factors a 32 bit integer on one. At that point I guess it'll be about 20 years before someone can factor a 2048 bit integer, and I'll get twitchy about what I am sending over the wire with PKI. My feeling is that all my secrets from 20 years ago are irrelevant to life now so I feel 20 years of warning is quite sufficient.

          • foota 1 year ago
            Homomorphic encryption is not the same thing as post quantum crypto?
            • Ar-Curunir 1 year ago
              FHE is still only known from lattices, and has nothing inherently to do with quantum computers.
            • odyssey7 1 year ago
              I wouldn't bet against the existence of a modern Bletchley Park analogue.
          • troq13 1 year ago
            Just a bit more improvement and they might be able to use a computer that doesn't exist to break an encryption scheme nobody uses. Alarming.
            • j2kun 1 year ago
              Major systems and big companies like Google are already mid-transition to PQC. So it is alarming.
              • AnthonyMouse 1 year ago
                More to the point, the purpose of the encryption scheme nobody uses is to have something to use if anybody ever makes the computer that doesn't exist. If that happens, then what?
                • tempaway4785751 1 year ago
                  We really need to get people to take really complicated risks that might never come to pass much more seriously. Perhaps someone smart can explain the really complicated risks that might never come to pass to the government that doesn't really look beyond the three year time horizon and get them to allocate some of their money that doesn't really exist to help.
                • anonymousDan 1 year ago
                  Furthermore this could have implications for fully homomorphic encryption schemes based on lattices. But nonetheless I laughed :)
                  • rgmerk 1 year ago
                    So a thing which is currently useless because it runs at a speed that makes the Harvard Mark I look fast might be rendered useless if a thing that doesn't physically exist despite decades of effort is constructed? :P
                  • troq13 1 year ago
                    Google has dozens of Chrome extensions in their app store that anyone can check in 2 minutes are plain malware, and they do nothing about it. If they cared about security, that's what they would be working on; these guys just want to publish papers.
                    • j2kun 1 year ago
                      I'm sure they have thought more about how to prioritize security threats than an anonymous internet commenter.
                    • adastra22 1 year ago
                      Their deployment is additive. You would need to break both the PQC and classical schemes, so they'd be unaffected here.
                      • less_less 1 year ago
                        They wouldn't be immediately hacked, especially as this is a quantum algorithm anyway. But if it turns out that the current PQC schemes are not quantum-resistant, then that work will need to be redone (unless progress in quantum computing stalls out, I guess). The current result does not break Kyber / Dilithium / NTRU variants / Falcon / FrodoKEM even assuming it's correct, but obviously there's some concern that a follow-up result might improve on it.

                        The NIST process has been running for 7 years, though they do have a few "non-lattice" schemes waiting for a 4th round of standardization: the code-based schemes Classic McEliece, BIKE and HQC. We could switch over to those, and the work to add crypto-agility to protocols would not be wasted, but the work on lattice software and hardware would be largely wasted.

                        Error-correcting codes are also solving short-vector problems in a lattice! But since the lattice has a different shape, maybe it would be fine? After codes the list gets pretty thin... like there's CSIDH, but it's very slow, has partial quantum attacks, and it isn't very trusted after SIKE got broken in half.

                  • Beldin 1 year ago
                    Headline should be "polynomial time quantum algorithms for solving lattices" or somesuch. The polynomial time aspect is the main contribution here - and also why this is attracting attention.
                    • tromp 1 year ago
                      How does this affect these statements on Wikipedia [1]

                      > some lattice-based constructions appear to be resistant to attack by both classical and quantum computers. Furthermore, many lattice-based constructions are considered to be secure under the assumption that certain well-studied computational lattice problems cannot be solved efficiently.

                      and [2] ?

                      > One class of quantum resistant cryptographic algorithms is based on a concept called "learning with errors" introduced by Oded Regev in 2005.

                      [1] https://en.wikipedia.org/wiki/Lattice-based_cryptography

                      [2] https://en.wikipedia.org/wiki/Ring_learning_with_errors_key_...

                      • doomrobo 1 year ago
                        The idea of "appear to be resistant to attack" is an empirical one. When someone says that, they are saying that we simply have not found a good attack against this problem. That can change any day, in principle. Unfortunately, "we don't know of an attack" is about as strong a statement as you can make in cryptography when talking about a fundamental hardness assumption. More verbosely, you'd say "the best known attacks take 2^whatever operations on a computer (classical or quantum), and that's expensive, so we're probably fine unless someone makes a significant leap tomorrow".
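
To make "2^whatever operations ... that's expensive" concrete, here is a rough back-of-the-envelope sketch. The throughput and machine counts are illustrative assumptions, not benchmarks:

```python
# Rough cost of performing 2^b operations in total.
# ops_per_second and machines are illustrative assumptions only.
SECONDS_PER_YEAR = 365 * 24 * 3600

def attack_years(security_bits, ops_per_second=10**9, machines=10**6):
    """Years needed to perform 2^security_bits operations in total."""
    return 2**security_bits / (ops_per_second * machines * SECONDS_PER_YEAR)

print(f"2^80 operations: ~{attack_years(80):.0f} years")
print(f"2^128 operations: ~{attack_years(128):.1e} years")
```

Even with a million machines each doing a billion operations per second, 2^80 work takes decades, and 2^128 work takes on the order of 10^16 years, which is why 128-bit security is considered comfortably out of reach for brute force.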
                        • adgjlsfhk1 1 year ago
                          imo, this isn't quite true. there are a lot of areas where we can say "this looks sufficiently secure for now, but given the rate of advancement in this area in the last decade, we expect it will probably lose a few bits of security in the next decade"
                        • westurner 1 year ago
                          CRYSTALS-Kyber, NTRU, SABER, CRYSTALS-Dilithium, and FALCON are lattice-based method finalists in NIST PQC Round 3.

                          [1] NIST Post-Quantum Cryptography Standardization: https://en.wikipedia.org/wiki/NIST_Post-Quantum_Cryptography...

                          The NTRU article mentions PQ resistance to Shor's algorithm only, among other evaluations, and notes that IEEE Std 1363.1 (2008) and the X9 financial industry spec already specify NTRU, which is a Round 3 finalist lattice-based method.

                          In [1], under "Selected Algorithms 2022", the article lists "Lattice: CRYSTALS-Kyber, CRYSTALS-Dilithium, FALCON; Hash-based: SPHINCS+".

                          Round 4 includes Code-based and Supersingular elliptic curve isogeny algos.

                          FWIU there's not yet a TLS 1.4/2.0 that specifies which [lattice-based] PQ algos webservers would need to implement to support a new PQ TLS spec.

                          • wbl 1 year ago
                            Do you know how little we know? We don't even know P isn't PSPACE!
                          • da-bacon 1 year ago
                            People seem to be focusing on the fact that this wouldn't break the leading NIST PQC public-key cryptosystems, but I think that misses the point. This takes a problem at the core of their security, which previously only had algorithms for an exponential approximation factor, and achieves a polynomial approximation factor. Sure, that polynomial, O(n^4.5), is too high to break the leading proposed systems, but are you really feeling safe when an exponential just changed to a polynomial?

                            An analogy would be something like this. Factoring is hard. We base RSA on the hardness of this problem, and there we use numbers that are the product of two primes. Someone just found an algorithm that can't factor a product of two primes, but can take a product of four primes and split it into two products of two primes. Do you feel safe with RSA?

                            Anyway, the paper could be wrong or it could be right; it will take a while for those in the field to dig through it. As a cautionary tale, there have been a few very good quantum people who have proposed quantum attacks on lattice problems that were later shown to have bugs.
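
A toy illustration of why "exponential to polynomial" matters even when the exponent looks large: n^4.5 is eventually dwarfed by 2^n, so the asymptotic change dominates once parameters grow. The crossover below is just arithmetic, not a statement about any concrete scheme:

```python
import math

# Compare an exponential-style cost 2^n with a polynomial n^4.5,
# expressed as bit sizes so the numbers stay readable.
for n in (16, 32, 64, 128, 256):
    poly_bits = 4.5 * math.log2(n)   # n^4.5 == 2^(4.5 * log2(n))
    print(f"n={n:4d}: n^4.5 ~= 2^{poly_bits:5.1f}  vs  2^{n}")
```

For tiny n the polynomial can even exceed the exponential, but by n = 256 it is 2^36 against 2^256, a gap no amount of constant-factor engineering closes.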

                            • Ar-Curunir 1 year ago
                              The running time of attacks hasn't suddenly become O(n^4.5). The latter figure describes the noise ratio for which the LWE assumption becomes broken in quantum polynomial time.

                              The proposed post-quantum encryption schemes use a much smaller noise ratio which (at the moment) is not affected by these attacks.

                              • da-bacon 1 year ago
                                I didn't say the runtime, did I? The approximation ratio went from an exponential to a polynomial noise ratio. This just went from 2^n to n^4.5, and everyone seems to say "oh, this is fine".
                                • Ar-Curunir 1 year ago
                                  The attackable noise ratio did not go from exponential to polynomial either. It went from classically subexponential to quantumly polynomial.
                              • deknos 1 year ago
                                Are the lattice instances used by OpenSSH, or the ones from DJB, affected by this problem?
                                • axblount 1 year ago
                                  Does this result apply to all LWE problems? Does this approach care about LWE vs Ring-LWE at all?

                                  If so, it's a big blow to systems like FrodoKEM that banked on unstructured lattices providing higher security.

                                  • tux3 1 year ago
                                    Not a lattice expert, so add salt to taste, but it looks like LWE in general (including RLWE).

                                    But the current attack essentially wants q > n^2, so even if it is confirmed, not all LWE schemes are dead. There will certainly be people who tweak the params in response and carry on.

                                    However, attacks only get better. And for people in FHE who are squeezed between performance problems and dangerously thin security parameters, it is a bad day if confirmed. There's no credible practical alternative to LWE for FHE...
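
The q > n^2 rule of thumb can be sanity-checked numerically. Kyber-768's q = 3329 and n = 768 are public parameters; the FHE-style set below is an illustrative assumption (FHE moduli are often hundreds of bits), not any particular scheme's actual parameters:

```python
# Rough check of the attack's applicability condition q > n^2.
# The "FHE-style" parameters are illustrative assumptions only.
schemes = {
    "Kyber-768": {"n": 768, "q": 3329},
    "FHE-style (illustrative)": {"n": 4096, "q": 2**200},
}

for name, p in schemes.items():
    in_range = p["q"] > p["n"] ** 2
    print(f"{name}: q > n^2? {in_range}")
```

Kyber's modulus sits far below n^2, while FHE-scale moduli land well inside the attacked regime, which matches the worry about FHE expressed upthread.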

                                    • j2kun 1 year ago
                                      RingLWE security reduces to LWE via a relatively simple reduction (see https://www.jeremykun.com/2022/12/28/estimating-the-security...).
                                    • hellobye 1 year ago
                                      Hello everyone. I am a college student and new to this field. If possible, can someone explain in simple terms what real future impact this paper could have?
                                      • swells34 1 year ago
                                        It would be silly not to first ask for your interpretation, given that you are a college student.

                                        Since this is about quantum computing, real world effects are very likely to be none except an exorbitant amount of grant money.

                                      • tschumacher 1 year ago
                                        Some post-quantum signatures like CRYSTALS-Dilithium are based on lattices. Makes me think that quantum key distribution (what I've been working on for the past 6 months) has a chance to actually become useful instead of being only of interest to academics and to a few companies that sell overpriced solutions to paranoids.
                                        • hannob 1 year ago
                                          QKD does not solve the problem that quantum computers create, and cannot replace public-key cryptography. That's a common misconception that the marketing departments of QKD research try to keep alive.

                                          Even under ideal conditions (whether these can exist is debatable), the best QKD gives you is a securely encrypted channel, and only when you already have a securely authenticated channel. That requirement is extremely important, makes the whole thing mostly useless, and is often omitted by QKD advocates.

                                          • HappyPanacea 1 year ago
                                            If you don't have an authenticated channel, you are susceptible to a MITM attack which makes any asymmetric crypto useless. Thus I think there is an implicit assumption in any asymmetric crypto that you already have an authenticated channel. Or did I miss something?
                                            • ilya_m 1 year ago
                                              Grossly simplifying, Alice and Bob may establish an authenticated channel either by physical means (a wire) or by some combination of certificates/passwords and out-of-band authentication. Most of the time, QKD implicitly assumes the former - a line-of-sight connection or a fiber-optics cable. In these circumstances the parties might as well exchange flash drives with one-time pads, similarly to how the Kremlin-White House hotline was protected.
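
For readers unfamiliar with the flash-drive scenario: a one-time pad is just XOR with a pre-shared random key as long as the message. A minimal sketch (the message and names here are made up for illustration):

```python
import secrets

def otp_encrypt(message: bytes, pad: bytes) -> bytes:
    """XOR the message with a pre-shared random pad (never reuse the pad)."""
    assert len(pad) >= len(message), "pad must cover the whole message"
    return bytes(m ^ k for m, k in zip(message, pad))

otp_decrypt = otp_encrypt  # XOR is its own inverse

msg = b"hotline test message"
pad = secrets.token_bytes(len(msg))  # the "flash drive" contents
ciphertext = otp_encrypt(msg, pad)
assert otp_decrypt(ciphertext, pad) == msg
```

This is information-theoretically secure, but it inherits exactly the distribution problem described above: the pad has to reach the other side securely first.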
                                          • Vecr 1 year ago
                                            Code-based systems are still in, and Classic McEliece could be extended to ~50 MiB for a keypair and still be way more practical than QKD. Just run the largest current Classic McEliece spec as a hybrid post-quantum scheme with X448.
                                            • sgt101 1 year ago
                                              NSA is that you?
                                              • karma_pharmer 1 year ago
                                                please explain?

                                                OP recommended McEliece, not DUAL_EC_DRBG. Is there something I should know about the former?

                                          • ColinWright 1 year ago
                                            There is an update:

                                            https://news.ycombinator.com/item?id=40086515

                                            "Update on April 18: Step 9 of the algorithm contains a bug, which I don’t know how to fix."

                                            ...

                                            "Now the claim of showing a polynomial time quantum algorithm for solving LWE with polynomial modulus-noise ratios does not hold."

                                            • ghostway 1 year ago
                                              From the paper:

                                              > Let us remark that the modulus-noise ratio achieved by our quantum algorithm is still too large to break the public-key encryption schemes based on (Ring)LWE used in practice. In particular, we have not broken the NIST PQC standardization candidates. For example, for CRYSTALS-Kyber [BDK+18], the error term is chosen from a small constant range, the modulus is q = 3329, the dimension is n = 256 · k where k ∈ {3, 4, 5}, so we can think of q as being almost linear in n. For our algorithm, if we set αq ∈ O(1), then our algorithm applies when q ∈ Ω^~(n^2), so we are not able to break CRYSTALS-Kyber yet. We leave the task of improving the approximation factor of our quantum algorithm to future work.

                                              • ghostway 1 year ago
                                                (of course, this doesn't mean we are in the clear -- a polynomial-time algorithm is alarming)
                                                • rhaps0dy 1 year ago
                                                  I don't understand your comment in the context of the previous comment you posted. AIUI, the excerpt says "our algorithm only applies when the modulus q is larger than n^2", where n is 256·3 or 256·5 (I guess?). So the excerpt would be saying that the algorithm does not apply in this case, because 3000 << (256*3)^2. Right?
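
That reading of the excerpt checks out arithmetically for all three Kyber parameter sets (k in {3, 4, 5}, q = 3329, both from the quoted passage):

```python
# Kyber parameters from the quoted excerpt: q = 3329, n = 256 * k.
# The attack needs roughly q >= n^2 (up to log factors).
q = 3329
for k in (3, 4, 5):
    n = 256 * k
    print(f"k={k}: n^2 = {n**2}, q = {q}, q >= n^2? {q >= n**2}")
```

Even the smallest set has n^2 = 589824, far above q = 3329, so the stated condition is nowhere near met.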
                                                  • abdullahkhalids 1 year ago
                                                    If the history of cryptography is any guide, even though this result doesn't break LWE crypto-protocols, it's much more likely now that someone will come up with an improvement that does break them. First constructions of algorithms are rarely optimal.

                                                    The opposite is possible as well, now that a concrete algorithm has been made: someone could very well prove that LWE crypto-protocols are secure against some class of algorithms this algorithm belongs to.

                                                    Of course, right now, we should just wait for the experts to read the paper and check if there are any problems.

                                                  • Ar-Curunir 1 year ago
                                                    The algorithm is only quantum-polynomial time for a parameter regime not applicable to the PQC candidates.
                                                  • pclmulqdq 1 year ago
                                                    Factorization and discrete log are also polynomial on a quantum computer, and we are very good at just increasing bit widths. If CRYSTALS is also polynomial in BQP, there is very little reason to invest so much into it.

                                                    I am still of the (very controversial) opinion that the only PQC algorithm worth investing in at the expense of classical algorithms is Classic McEliece. This is a code that has stood up to classical and quantum cracking attempts for a very long time - cracking these codes is equivalent to creating a very valuable algorithm in error correcting codes.

                                                    NIST also is dead set on people using only PQC or classical crypto, not a wrapper with both. That is stupid, IMO.

                                                    • less_less 1 year ago
                                                      It's NSA who wants only PQC and not hybrid. NIST is fine with hybrid. They don't plan to standardize hybrids as entire units, but they said they plan to standardize the KDF modes you'd need to build them.
                                                      • cryptonik 1 year ago
                                                        Thanks for your comment, very interesting. About your last paragraph: do you know why NIST refuses hybridization when European agencies impose it? What are the politics behind it?
                                                        • pclmulqdq 1 year ago
                                                          The charitable interpretation I would give the NIST - and a very real concern - is that they are not sure that one form of cryptography doesn't weaken the other, without proofs. Since these cryptosystems also tend to work in different number fields, it's very hard to prove anything about their interactions at all.

                                                          We all know the uncharitable interpretation, that the PQC algorithms may be backdoored.

                                                          • kamilner 1 year ago
                                                            NIST does not refuse hybridization, they will be publishing guidance on hybrid schemes in the draft of SP 800-227 at the same time as the final standards. They don't impose it though, because at a large scale it's more efficient to run just (fast) ML-KEM instead of (fast) ML-KEM + (slower) ECDH, which more than doubles your computation time for what they see as no benefit.
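
For reference, the hybrid construction being debated is simple in outline: run both key exchanges and feed both shared secrets through a KDF, so an attacker must break both primitives. A hedged sketch; the secrets below are random placeholders rather than real X25519/ML-KEM outputs, and real combiners bind in more context than plain concatenation:

```python
import hashlib
import secrets

# Placeholder shared secrets; in a real protocol these would come from
# an ECDH exchange and an ML-KEM encapsulation respectively.
ss_classical = secrets.token_bytes(32)
ss_pq = secrets.token_bytes(32)

# Combine both secrets so compromising either primitive alone is not enough.
session_key = hashlib.sha3_256(ss_classical + ss_pq).digest()
print(len(session_key))  # 32-byte session key
```

The extra cost is one additional key exchange plus one hash per handshake, which is the overhead the "no benefit" argument weighs against the hedge.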
                                                          • pseudo0 1 year ago
                                                            > The NIST also is dead set on people using only PQC or classical crypto, not a wrapper with both. That is stupid IMO.

                                                            Yeah, this is rather baffling. After SIKE got broken, you'd think they would have realized the importance of combining these new cutting-edge candidates with something reliable.

                                                            • Ar-Curunir 1 year ago
                                                              The remark clearly states that CRYSTALS is not affected by this attack.
                                                            • JoachimS 1 year ago
                                                              There was an update of the paper 2024-04-18:

                                                              "Note: Update on April 18: Step 9 of the algorithm contains a bug, which I don’t know how to fix. See Section 3.5.9 (Page 37) for details. I sincerely thank Hongxun Wu and (independently) Thomas Vidick for finding the bug today. Now the claim of showing a polynomial time quantum algorithm for solving LWE with polynomial modulus-noise ratios does not hold. I leave the rest of the paper as it is (added a clarification of an operation in Step 8) as a hope that ideas like Complex Gaussian and windowed QFT may find other applications in quantum computation, or tackle LWE in other ways."

                                                            • dogeprotocol 1 year ago
                                                              Will be interesting to see how this pans out.
                                                              • deyiao 1 year ago
                                                                If the findings of this paper hold up, I believe it could pretty much undo a decade of NIST's efforts in post-quantum cryptography. A seismic shift in the world of cryptography.
                                                                • kyoji 1 year ago
                                                                  Not entirely true; there are other PKE and DSA algorithms that were/are part of the competition and that use problems not related to lattices. However, the lattice-based options were often among the fastest and smallest.
                                                                  • tux3 1 year ago
                                                                    Isogenies vindicated? :)
                                                                    • tptacek 1 year ago
                                                                      I know you're kidding, but for the benefit of the class: isogeny schemes were pulled when their best candidate design turned out to be breakable with a Python script, owing to obscure non-cryptographic mathematical research from the 1990s.

                                                                      I'd expect we're not getting isogenies back. :)

                                                                  • tptacek 1 year ago
                                                                    No? One of the side effects of running an open competition is that it focused attention on a variety of competing options for this, all of which were formalized, recorded, and publicly evaluated by the world's academic cryptography experts. We're strictly better off as a result, and much of NIST's own work would still be valuable even in a hypothetical scenario in which none of LWE was quantum-safe.
                                                                    • bawolff 1 year ago
                                                                      This is the reason why NIST did the decade of work - to focus effort on figuring out which options are secure. Finding out an option is not secure is a good thing. It's why we are putting effort into PQC now, before quantum computers are a real threat.