Optimal bit-change-probability of the avalanche effect for hashes
Avalanche effect:
The strict avalanche criterion (SAC) is a formalization of the avalanche effect. It is satisfied if, whenever a single input bit is complemented, each of the output bits changes with a 50% probability.
My questions are:
Is a 50% bit-change-probability optimal for any hash, or is it just the minimal value to satisfy the strict avalanche criterion?
What if a hash-algorithm had a 100% bit-change-probability?
Tags: hash, avalanche
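(For concreteness, here is a minimal sketch, not part of the original question, that empirically measures the bit-change probability of a real hash, SHA-256, when a single input bit is flipped; a good hash should print a value near 0.5.)

```python
import hashlib
import os

def bit_change_fraction(data: bytes, bit_index: int) -> float:
    """Flip one input bit and return the fraction of output bits that change."""
    flipped = bytearray(data)
    flipped[bit_index // 8] ^= 1 << (bit_index % 8)
    h1 = hashlib.sha256(data).digest()
    h2 = hashlib.sha256(bytes(flipped)).digest()
    diff = int.from_bytes(h1, "big") ^ int.from_bytes(h2, "big")
    return bin(diff).count("1") / (8 * len(h1))

# Average over many random 16-byte messages, flipping bit 0 each time.
trials = 1000
avg = sum(bit_change_fraction(os.urandom(16), 0) for _ in range(trials)) / trials
print(f"average fraction of output bits changed: {avg:.4f}")  # ~0.5
```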
asked 2 hours ago, edited 2 hours ago by Aleksander Rassasse
2 Answers
Consider a function $H:\{0,1\}^m\to\{0,1\}^h$ (that is, from the set of $m$-bit bitstrings to the set of $h$-bit bitstrings). Define $D_n=0^n\mathbin\|1\mathbin\|0^{m-n-1}$ (that is, the $m$-bit "disturbance" bitstring with only bit $n$ set).
By a possible definition, $H$ satisfies the Strict Avalanche Criterion when for each of the $m\cdot h$ pairs $(n,i)$ with $0\le n<m$ and $0\le i<h$, the arithmetic sum of bit $i$ of $H(M\oplus D_n)\oplus H(M)$ computed for all $M$ in $\{0,1\}^m$ is $2^{m-1}$ (that is, exactly 50% of the number of terms summed).
That can be extended to functions with variable-size input (such as usual hashes) by stating that for each input size $m\ge2$ supported, the function satisfies the SAC (note: when $m<2$ and $h\ne0$, no function satisfies the SAC).
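This definition can be checked exhaustively when $m$ and $h$ are small. A minimal sketch (my illustration; the `satisfies_sac` helper and the toy 2-input OR function are assumptions, not from the answer):

```python
def satisfies_sac(H, m: int, h: int) -> bool:
    """Exhaustive SAC test over all 2^m inputs (feasible only for small m)."""
    half = 1 << (m - 1)
    for n in range(m):                       # disturbance D_n: only bit n set
        for i in range(h):                   # each output bit i
            flips = sum(((H(M ^ (1 << n)) ^ H(M)) >> i) & 1
                        for M in range(1 << m))
            if flips != half:                # sum must be exactly 2^(m-1)
                return False
    return True

# The 2-input OR happens to satisfy the SAC: flipping either input bit
# changes the output for exactly 2 of the 4 possible inputs.
print(satisfies_sac(lambda M: (M | (M >> 1)) & 1, m=2, h=1))  # True
```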
Is a 50% bit-change-probability optimal for any hash or is it just the minimal value to satisfy the strict avalanche criterion?
Neither.
The optimum for a hash is to behave like a random function, and a random function is extremely unlikely to satisfy the SAC (for $m=2$ that has probability $2^{-h}$, and that goes down very fast as $m$ grows). If a practical cryptographic hash ($h\ge128$) satisfies the SAC for some $m$ with $2\le m\le64$, that's easily detectable and a serious weakness. For large $m$ and practical hashes, it is computationally impossible to test whether $H$ satisfies the SAC.
50% is not the minimal value to satisfy the strict avalanche criterion. For the SAC to hold, the probability that a 1-bit change in input changes any particular bit of output must be exactly 50%, where that probability is computed over the $2^m$ inputs.
Mildly interesting problem: up to what $m$ is it possible to get experimental evidence that a practical hash does not meet the SAC?
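As a quick illustration of the $2^{-h}$ figure above (my sketch, reusing the `satisfies_sac` checker from earlier): sampling uniformly random functions for $m=2$, $h=4$ should make roughly $2^{-4}=1/16$ of them pass.

```python
import random

# Monte Carlo estimate of Pr[random function satisfies SAC] for m = 2.
m, h = 2, 4
trials = 20_000
hits = 0
for _ in range(trials):
    table = [random.randrange(1 << h) for _ in range(1 << m)]  # random function
    if satisfies_sac(lambda M: table[M], m, h):
        hits += 1
print(f"observed {hits / trials:.4f}, predicted {2**-h:.4f}")  # both ~0.0625
```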
What if a hash-algorithm had a 100% bit-change-probability?
Such a function must produce the XOR of its $m$ input bit(s), repeated $h$ times to form an $h$-bit vector, then XORed with an arbitrary $h$-bit value dependent only on $m$. It is not a secure cryptographic hash by any measure. Independently, it does not meet the SAC.
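A sanity-check sketch of that characterisation (my construction, not fgrieu's code): parity of the input replicated $h$ times, XORed with a fixed constant, does indeed flip every output bit on every 1-bit input change.

```python
def parity_hash(M: int, m: int, h: int, c: int = 0b1010) -> int:
    """XOR of the m input bits, replicated across h output bits,
    then XORed with a fixed h-bit constant c."""
    p = bin(M).count("1") & 1            # parity = XOR of all input bits
    return ((1 << h) - 1) * p ^ c        # replicate p to 00...0 or 11...1

m, h = 8, 4
all_ones = (1 << h) - 1
for M in range(1 << m):
    for n in range(m):
        # every single-bit input flip must complement all h output bits
        assert parity_hash(M ^ (1 << n), m, h) ^ parity_hash(M, m, h) == all_ones
print("100% bit-change probability holds for every input and bit position")
```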
answered 40 mins ago, edited 35 mins ago by fgrieu
What if a hash-algorithm had a 100% bit-change-probability?
Well, it wouldn't be a hash. You might call it an oscillator or something, but certainly not a pseudo-random function. One of the principal characteristics of a cryptographic hash is that the output appears pseudo-random, given any input.
You're suggesting behaviour like $f(b) = f'(a)$, where $f'$ denotes the bitwise complement: you'd have to maintain a memory of all of the previous output bits. There would be no randomness and no independence between the outputs; you'd simply oscillate between the two outputs. It would be useless as a hash and no better than a quasi NOT operator.
The 50% bit-change-probability is not optimal. It's one of the fundamental requirements. But you'll also find that this will occur naturally with the construction of any good hash. A natural relationship between an output collision rate of $1/e$ and a 50% overall bit change emerges given enough internal bit mangling.
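(To illustrate that $1/e$ figure; this is my reading of the remark, sketched with a truncated SHA-256 standing in for a random function: hashing $N$ random messages into $N$ buckets leaves about $1/e \approx 0.368$ of the buckets empty, the expected behaviour of a random function.)

```python
import hashlib
import math
import os

N = 1 << 16                       # N buckets, N random messages
hit = [False] * N
for _ in range(N):
    digest = hashlib.sha256(os.urandom(16)).digest()
    hit[int.from_bytes(digest[:4], "big") % N] = True
empty = hit.count(False) / N
print(f"fraction of outputs never hit: {empty:.4f}, 1/e = {1 / math.e:.4f}")
```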
Be careful though: the avalanche effect and the strict avalanche criterion are not the same, yet folks tend to use them interchangeably. One is not simply the formalisation of the other, so your wiki quote isn't quite correct. There's a succinct answer at https://crypto.stackexchange.com/a/42471/23115 explaining this.
answered 1 hour ago by Paul Uszak