Key entropy: meaning and uses

Can anyone help me understand key entropy, the mathematics behind it, and its uses? Thank you.

P.S. Please don't ignore this; I need it to prepare for my cryptography exam in two days. I just want someone to explain it to me, since I have ASD. I understand most of it, but I don't understand what $\log 2$ does, e.g. why $\log 128 / \log 2 = 7$.

  • The professor has really confused me on what the $\log 2$ means in this formula:

    $\text{Entropy} = \log(\text{Phrases})/\log 2$

  • The professor also gave this rule: how many bits can represent $X$ phrases? Just take $\log X$ and divide by $\log 2$.










  • Could you please edit your question to be more specific about what you don't understand about "key entropy"?
    – SEJPM♦
    1 hour ago










  • @SEJPM Done editing the question.
    – Weaponized Autism
    1 hour ago










  • Or, written another way: $\text{Entropy} = \log_2(\text{Phrases})$.
    – kelalaka
    1 hour ago















Tags: keys, entropy






asked 1 hour ago by Weaponized Autism; edited 6 mins ago by kelalaka
1 Answer
The professor has really confused me on what the $\log 2$ means in this formula: $\text{Entropy} = \log(\text{Phrases})/\log 2$




If I understand the problem correctly, you are asking what the $\log 2$ is doing in the denominator. It is there to ensure that the base of the logarithm (whether it's $10$ or $e$ or $2$) doesn't matter: the result always comes out as a base-2 logarithm.



As a quick reminder: the base of a logarithm is the $b$ for which you are looking to find the $x$ such that $b^x = a$, written $x = \log_b a$.
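As a quick sketch of why dividing by $\log 2$ makes the base irrelevant, here is a check in Python using only the standard `math` module (the variable names are illustrative, not from the original post):

```python
import math

phrases = 128  # number of equally likely phrases (keys)

# Dividing by log(2) converts a logarithm of ANY base into a base-2
# logarithm, because log_b(x) / log_b(2) == log2(x) for every base b.
entropy_ln = math.log(phrases) / math.log(2)       # via natural log
entropy_l10 = math.log10(phrases) / math.log10(2)  # via base-10 log
entropy_l2 = math.log2(phrases)                    # base-2 log directly

# All three agree (up to floating-point rounding): 128 = 2**7, so 7 bits.
print(entropy_ln, entropy_l10, entropy_l2)
```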





How many bits [are needed to] represent $x$ [phrases]?




So you have $x$ values and you want to know how many bits you need in order to identify all of them. First note that 1 bit can address 2 values, 2 bits can address 4 values, 3 bits can address 8 values, and in general $n$ bits can address $2^n$ values. Thus we are looking for the smallest $n$ such that $x \leq 2^n$. Taking a logarithm, to an arbitrary base, on both sides of the inequality yields $\log x \leq \log(2^n) = n \cdot \log 2 \Leftrightarrow \log x/\log 2 \leq n$, so $\lceil \log x/\log 2 \rceil$ bits are sufficient.
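The smallest sufficient $n$ can be computed directly; a minimal sketch in Python (the helper name `bits_needed` is mine, not from the original post):

```python
import math

def bits_needed(x: int) -> int:
    """Smallest n with 2**n >= x, i.e. enough bits to label x distinct values.

    Integer arithmetic avoids floating-point rounding at exact powers of two.
    """
    return (x - 1).bit_length()

# The log formula gives the same n after rounding up:
# 1000 values -> log(1000)/log(2) is about 9.97, so 10 bits.
print(bits_needed(128))                         # 7
print(bits_needed(1000))                        # 10
print(math.ceil(math.log(1000) / math.log(2)))  # 10
```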






  • Thanks for helping. The professor gave this formula: how many bits can represent $X$ phrases? Just take $\log X$ and divide by $\log 2$.
    – Weaponized Autism
    44 mins ago










  • @kelalaka sure.
    – Weaponized Autism
    7 mins ago











answered 1 hour ago by SEJPM♦; edited 29 mins ago










