Lattice-based cryptography
How viable is lattice-based cryptography in a "practical" setting?
It has been said that lattice-based cryptography would be a "post-quantum" cryptography scheme, but is it feasibly implementable?
post-quantum-cryptography lattice-crypto
asked Aug 19 at 1:05 by Steven Sagona, edited Aug 19 at 3:08 by forest
1 Answer
Yes, it is feasible. In fact, the NIST post-quantum submissions include a number of lattice-based key-exchange and signature schemes. As a summary of the different types of algorithms shows, lattice-based designs dominate the submissions. These include NTRU and its variants, NewHope (based on Ring-LWE), and FALCON (designed in part by one of the regulars here!). Lattice-based cryptography itself is fairly well understood: its security rests on the presumed hardness of lattice problems such as the shortest vector problem (SVP), and many new algorithms build on it in one way or another.
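The hardness assumption mentioned above can be made concrete with a toy sketch (not part of any NIST submission; the basis values and the `shortest_vector` helper are invented for illustration). In two dimensions the shortest vector can simply be found by brute force; lattice schemes work in hundreds of dimensions, where no efficient classical or quantum algorithm for this search is known.

```python
# Toy illustration of the shortest vector problem (SVP) in 2D.
# Brute-force enumeration is only feasible because the example is tiny;
# real lattice cryptography relies on this being infeasible in high dimension.
from itertools import product

def shortest_vector(b1, b2, bound=10):
    """Enumerate integer combinations x*b1 + y*b2 with |x|,|y| <= bound
    and return (squared length, vector) of the shortest nonzero one found."""
    best = None
    for x, y in product(range(-bound, bound + 1), repeat=2):
        if x == 0 and y == 0:
            continue
        v = (x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])
        norm2 = v[0] ** 2 + v[1] ** 2
        if best is None or norm2 < best[0]:
            best = (norm2, v)
    return best

# A deliberately skewed ("bad") basis for the lattice generated by (1, 2)
# and (3, -1); the shortest nonzero vector is ±(1, 2), squared length 5.
print(shortest_vector((11, 8), (16, 11)))
```

The point of the skewed basis is that the short vector is not visible in the basis itself; recovering it is exactly the kind of problem an attacker would face, and basis quality is what separates public from private information in several lattice schemes.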
Lattice-based cryptography is one of only a few popular approaches to post-quantum cryptography. There are others, such as code-based cryptography, multivariate polynomial cryptography, and hash-based signatures. Of those, code-based algorithms are the only class that could realistically compete with lattice-based ones, since they are also well studied thanks to the McEliece cryptosystem. Multivariate polynomial cryptography is less popular, and many of the proposed algorithms using it have been broken. Lastly, hash-based cryptography, while quite secure, is only useful for digital signatures, not key exchange. This explains why so many proposed algorithms are lattice-based.
Lattice-based cryptography is also very fast. For example, NTRU performs private-key operations even faster than RSA: the cost of a private-key operation grows roughly with the cube of the key size for RSA, but only quadratically for NTRU. The viability of lattice-based cryptography is undisputed. All we need to do now is iron out the kinks and standardize a particular implementation.
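The quadratic scaling comes from NTRU's core operation, multiplication in the ring Z_q[x]/(x^N − 1), i.e. cyclic convolution of coefficient vectors. A minimal sketch of that operation follows; the `ring_mul` helper and the tiny parameters are purely illustrative (real NTRU parameter sets use N in the hundreds, and optimized implementations), not a secure or complete implementation.

```python
# Toy sketch of NTRU's core operation: multiplying polynomials in
# Z_q[x]/(x^N - 1), which is a cyclic convolution of coefficient lists.
# The schoolbook double loop below is the O(N^2) cost referred to above.
def ring_mul(a, b, N, q):
    """Cyclic convolution of coefficient lists a and b (degree < N),
    with coefficients reduced mod q."""
    c = [0] * N
    for i in range(N):
        for j in range(N):
            # x^(i+j) wraps around because x^N is identified with 1
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

# Example with N=4, q=11: (1 + 2x) * (3 + x^3) mod (x^4 - 1)
# = 3 + 6x + x^3 + 2x^4  ->  5 + 6x + x^3
print(ring_mul([1, 2, 0, 0], [3, 0, 0, 1], 4, 11))  # → [5, 6, 0, 1]
```

By contrast, an RSA private-key operation is a modular exponentiation whose cost grows roughly cubically in the modulus size, which is the comparison the answer is drawing.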
Could you elaborate a little bit on what you mean by "iron out the kinks" and standardize a particular implementation? Thank you!
– William Hird, Aug 19 at 3:00
@WilliamHird It just means that the different algorithms will be tested, people will try to break them and will debate their merits and drawbacks. The creators will argue why their algorithm should be chosen, etc. This is all part of the NIST standardization process, a "contest" to see which algorithm should be made the official standard.
– forest, Aug 19 at 3:03
My understanding is that NTRU was invented by mathematicians at Brown University, which also has some of the most brilliant computer scientists on the planet. Why didn't Brown do the "whole package": the math, the algorithmic implementation, the protocols, etc.? Why would you need a contest when you have all those geniuses in one place working on cryptography?
– William Hird, Aug 20 at 9:20
The answer to the last question is known as Schneier's law: "Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can't break. It's not even hard. What is hard is creating an algorithm that no one else can break, even after years of analysis. And the only way to prove that is to subject the algorithm to years of analysis by the best cryptographers around."
– Frédéric Grosshans, Aug 20 at 16:25
@FrédéricGrosshans: OK, makes sense, the team peer review concept, just don't let anyone from the NSA be part of the team unless you like back doors in your software. :-)
– William Hird, Aug 20 at 17:46
answered Aug 19 at 1:44 by forest, edited Aug 19 at 2:27