Lattice-based cryptography












How viable is lattice-based cryptography in a "practical" setting?



Lattice-based cryptography has been described as a candidate for "post-quantum" cryptography, but is it feasible to implement in practice?







asked Aug 19 at 1:05 by Steven Sagona · edited Aug 19 at 3:08 by forest


          1 Answer
          Yes, it is feasible. In fact, the NIST post-quantum submissions include a number of lattice-based key-exchange and signature schemes. As a summary of the submissions by type shows, lattice-based algorithms dominate. They include NTRU and its variants, NewHope (based on Ring-LWE), and FALCON (designed in part by one of the regulars here!). Lattice-based cryptography itself is fairly well understood: the presumed hardness of lattice problems such as the shortest vector problem underpins its trapdoor functions, and many new algorithms build on it in one way or another.
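To make the LWE idea concrete, here is a toy, deliberately insecure Python sketch of Regev-style single-bit encryption (not from the original answer; the parameters n = 8, m = 20, q = 97 are illustrative only, chosen so the accumulated error stays below q/4 and decryption always succeeds):

```python
import random

def keygen(n=8, m=20, q=97):
    """Toy LWE key generation: public key is (A, b = A*s + e mod q)."""
    s = [random.randrange(q) for _ in range(n)]                    # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]              # small errors
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
    return (A, b, q), s

def encrypt(pk, bit):
    """Encrypt one bit by summing a random subset of LWE samples."""
    A, b, q = pk
    rows = [i for i in range(len(A)) if random.random() < 0.5]
    u = [sum(A[i][j] for i in rows) % q for j in range(len(A[0]))]
    v = (sum(b[i] for i in rows) + bit * (q // 2)) % q             # bit hides near q/2
    return u, v

def decrypt(sk, pk, ct):
    """Recover the bit: v - <u, s> is ~0 for bit 0, ~q/2 for bit 1."""
    A, b, q = pk
    u, v = ct
    d = (v - sum(u[j] * sk[j] for j in range(len(u)))) % q
    return 1 if min(d, q - d) > q // 4 else 0
```

With |e_i| ≤ 1 and at most m = 20 samples summed, the total error is at most 20 < q/4 = 24, so the bit is always recovered. Real schemes use much larger dimensions, discrete Gaussian errors, and ring structure for efficiency.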



          Lattice-based cryptography is one of only a few popular approaches to post-quantum cryptography. There are others, such as code-based cryptography, multivariate-polynomial cryptography, and hash-based signatures. Of those, code-based algorithms are the only class that can realistically compete with lattice-based ones, since they are also well understood thanks to the McEliece cryptosystem. Multivariate-polynomial cryptography is less popular, and many of the algorithms proposed with it have been broken. Hash-based cryptography, while quite secure, is useful only for digital signatures, not key exchange. This explains why so many of the proposed algorithms are lattice-based.



          Lattice-based cryptography is also very fast. NTRU, for example, performs private-key operations even faster than RSA: RSA's private-key time grows with the cube of the key size, whereas NTRU's grows only quadratically. The viability of lattice-based cryptography is not in dispute. What remains is to iron out the kinks and standardize a particular implementation.
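The quadratic scaling comes from NTRU's core operation, multiplication in the ring Z_q[x]/(x^N − 1), which is a cyclic convolution of coefficient vectors. A schoolbook sketch (illustrative only; production implementations exploit the small, sparse coefficients of NTRU keys):

```python
def conv(f, g, q):
    """Cyclic convolution of coefficient lists f and g in Z_q[x]/(x^N - 1).

    Schoolbook method: O(N^2) coefficient multiplications, which is the
    quadratic cost referred to above. Index arithmetic is mod N because
    x^N = 1 in this ring.
    """
    N = len(f)
    h = [0] * N
    for i in range(N):
        for j in range(N):
            h[(i + j) % N] = (h[(i + j) % N] + f[i] * g[j]) % q
    return h
```

For example, with N = 3 and q = 7, multiplying x^2 by x wraps around to give 1, i.e. `conv([0, 0, 1], [0, 1, 0], 7)` yields `[1, 0, 0]`.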






          answered Aug 19 at 1:44 by forest · edited Aug 19 at 2:27
          • Could you elaborate a little bit on what you mean by "iron out the kinks" and standardize a particular implementation? Thank you!
            – William Hird
            Aug 19 at 3:00






          • @WilliamHird It just means that the different algorithms will be tested: people will try to break them and will debate their merits and drawbacks, the creators will argue why their algorithm should be chosen, and so on. This is all part of the NIST standardization process, a "contest" to see which algorithm should be made the official standard.
            – forest
            Aug 19 at 3:03










          • My understanding is that NTRU was invented by mathematicians at Brown University, which also has some of the most brilliant computer scientists on the planet. Why didn't Brown do the "whole package": the math, the algorithmic implementation, the protocols, etc.? Why would you need a contest when you have all those geniuses in one place working on cryptography?
            – William Hird
            Aug 20 at 9:20






          • The answer to the last question is known as Schneier's law: "Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can't break. It's not even hard. What is hard is creating an algorithm that no one else can break, even after years of analysis. And the only way to prove that is to subject the algorithm to years of analysis by the best cryptographers around."
            – Frédéric Grosshans
            Aug 20 at 16:25










          • @FrédéricGrosshans: OK, makes sense, the team peer review concept, just don't let anyone from the NSA be part of the team unless you like back doors in your software. :-)
            – William Hird
            Aug 20 at 17:46









