Key Entropy meaning Help
Can anyone help me understand key entropy: the mathematics behind it and what it is used for? Thank you.
P.S. Please don't ignore this; I need it to prepare for my cryptography exam in two days. I just want someone to explain this to me, since I have ASD. I understand most of it, but I want to understand what the $\log 2$ does, e.g. when I compute $\log 128 / \log 2 = 7$.
The professor has really confused me about what the $\log 2$ means in this formula:
$\text{Entropy} = \log(\text{Phrases})/\log 2$
The professor also gave this rule: how many bits can represent $X$ phrases? Just take $\log X$ and divide by $\log 2$.
keys entropy
Could you please edit your question to be more specific about what you don't understand about "key entropy"?
– SEJPM♦
1 hour ago
@SEJPM Done editing the question.
– Weaponized Autism
1 hour ago
Or written as $\text{Entropy} = \log_2(\text{Phrases})$.
– kelalaka
1 hour ago
asked 1 hour ago by Weaponized Autism (new contributor)
edited 6 mins ago by kelalaka
1 Answer
The professor has really confused me about what the $\log 2$ means in this formula: $\text{Entropy} = \log(\text{Phrases})/\log 2$
If I understand the problem correctly, you are asking what the $\log 2$ is doing there in the denominator. It is essentially there to ensure that the base of the logarithm (whether it's $10$ or $e$ or $2$) doesn't matter and that you always get the result as a base-2 logarithm.
As a quick reminder: the base of a logarithm is the $b$ for which you are looking for the $x$ such that $b^x = a$, i.e. $x = \log_b a$.
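For concreteness (this worked instance is added here and is not part of the original answer), the change-of-base identity applied to the $\log 128 / \log 2$ example from the question reads:
$$\log_b a = \frac{\log a}{\log b}, \qquad \text{so} \qquad \frac{\log 128}{\log 2} = \log_2 128 = 7, \quad \text{since } 2^7 = 128.$$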
How many bits [are needed to] represent $x$ [phrases]?
So you have $x$ values and you want to know how many bits you need in order to identify all of them. First note that 1 bit can address 2 values, 2 bits can address 4 values, 3 bits can address 8 values, and in general $n$ bits can address $2^n$ values. Thus we are looking for the smallest $n$ such that $x \leq 2^n$. Taking a logarithm, to an arbitrary base, on both sides of the inequality yields $\log x \leq \log(2^n) = n \cdot \log 2 \Leftrightarrow \log x / \log 2 \leq n$, thus $\log x / \log 2$ bits (rounded up to the next integer) are sufficient.
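As a quick sanity check, here is a minimal sketch in Python (added here, not part of the original answer; the helper name bits_needed is my own) that carries out the same computation:

```python
# Bits needed to distinguish x equally likely values: smallest n with x <= 2**n.
import math

def bits_needed(x: int) -> int:
    """Exact integer computation of ceil(log2(x)) for x >= 1."""
    return (x - 1).bit_length()

# The professor's formula log(x)/log(2): math.log with one argument is the
# natural logarithm, and dividing by math.log(2) converts it to base 2.
# For exact powers of two, floating point can land a hair off an integer,
# which is why the integer version above is the safer check.
print(math.log(128) / math.log(2))  # approximately 7.0
print(bits_needed(128))             # 7, since 2**7 = 128
print(bits_needed(1000))            # 10, since 2**9 = 512 < 1000 <= 2**10 = 1024
```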
Thanks for helping. The professor gave this formula: how many bits can represent $X$ phrases? Just take $\log(X)$ and divide by $\log(2)$.
– Weaponized Autism
44 mins ago
@kelalaka sure.
– Weaponized Autism
7 mins ago
answered 1 hour ago by SEJPM♦
edited 29 mins ago