Calculating Fisher Information for Bernoulli rv
Let $X_1,...,X_n$ be Bernoulli distributed with unknown parameter $p$.
My objective is to calculate the information contained in the first observation of the sample.
I know that the pmf of $X$ is given by $$f(x\mid p)=p^x(1-p)^{1-x},$$ and my book defines the Fisher information about $p$ as
$$I_X(p)=E_p\left[\left(\frac{d}{dp}\log\left(p^x(1-p)^{1-x}\right)\right)^2\right].$$
After some calculations (the score is $\frac{d}{dp}\log f(x\mid p)=\frac{x}{p}-\frac{1-x}{1-p}$, which I then square), I arrive at
$$I_X(p)=E_p\left[\frac{x^2}{p^2}\right]-2E_p\left[\frac{x(1-x)}{p(1-p)}\right]+E_p\left[\frac{(1-x)^2}{(1-p)^2}\right].$$
I know that the Fisher information about $p$ of a Bernoulli RV is $\frac{1}{p(1-p)}$, but I don't know how to get rid of the $X$-values, since I'm calculating an expectation with respect to $p$, not $X$. Any clues?
statistics probability-distributions expected-value
$+1$ for showing your work to derive the correct $I_X(p)$ – Ahmad Bazzi, 4 hours ago
2 Answers
Accepted answer – Ahmad Bazzi
\begin{equation}
I_X(p)=E_p\left[\frac{x^2}{p^2}\right]-2E_p\left[\frac{x-x^2}{p(1-p)}\right]+E_p\left[\frac{x^2-2x+1}{(1-p)^2}\right]\tag{1}
\end{equation}
For a Bernoulli RV, we know
\begin{align}
E(x) &= 0\cdot\Pr(X = 0) + 1\cdot\Pr(X = 1) = p,\\
E(x^2) &= 0^2\cdot\Pr(X = 0) + 1^2\cdot\Pr(X = 1) = p.
\end{align}
Now, substituting into $(1)$, we get
\begin{equation}
I_X(p)=\frac{p}{p^2}-2\,\frac{p-p}{p(1-p)}+\frac{p-2p+1}{(1-p)^2}
=\frac{1}{p}+\frac{1-p}{(1-p)^2}
=\frac{1}{p}+\frac{1}{1-p}
=\frac{1}{p(1-p)}.
\end{equation}
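If you want to double-check the algebra numerically, here is a minimal Monte Carlo sketch (not part of the original answer; it assumes `numpy` is available and the names are illustrative) that estimates the expected squared score and compares it with $\frac{1}{p(1-p)}$:

```python
# Sanity check: estimate E_p[(d/dp log f(X|p))^2] by Monte Carlo
# and compare with the closed form 1/(p(1-p)).
import numpy as np

rng = np.random.default_rng(0)
p = 0.3
x = rng.binomial(1, p, size=1_000_000)  # Bernoulli(p) samples

score = x / p - (1 - x) / (1 - p)       # d/dp log f(x|p)
print(np.mean(score**2))                # Monte Carlo estimate, approx. 4.76
print(1 / (p * (1 - p)))                # closed form: 4.7619...
```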
Thanks @MichaelHardy – Ahmad Bazzi, 2 hours ago
I always thought that when an expectation is written as $E_p$, for example, it means we treat the other variables (not $p$) as constants. Apparently not? – DavidS, 2 hours ago
Yes, I understand what you mean. I usually do not write $E_p(\cdot)$; I write $E_{X\mid p}(\cdot)$, where the expectation is taken with respect to the samples given the parameters you want to estimate. – Ahmad Bazzi, 2 hours ago
The expectation is taken with respect to the random variable (whether it is $X$, $Y$, or whatever other r.v.). $E_p$ only intends to remark the fact that your model for the distribution of $X$ is not fully specified, but is uncertain up to a parameter $p$, and thus the corresponding expectations may depend on that value $p$. – Alejandro Nasif Salum, 2 hours ago
Also, if you put $I_X$, then it is the information of the random variable $X$ which has to be calculated, and so the formula should be in terms of $X$, not $x$. Otherwise, you're finding $I_x$. – Alejandro Nasif Salum, 2 hours ago
Answer – Alejandro Nasif Salum
Actually, the Fisher information of $X$ about $p$ is
$$I_X(p)=E_p\left[\left(\frac{d}{dp}\log f(X\mid p)\right)^2\right],$$
that is,
$$I_X(p)=E_p\left[\left(\frac{d}{dp}\log\left(p^X(1-p)^{1-X}\right)\right)^2\right].$$
I've only replaced every $x$ with $X$, which may seem like a subtlety, but then you get
$$I_X(p)=E_p\left(\frac{X^2}{p^2}\right)-2E_p\left(\frac{X(1-X)}{p(1-p)}\right)+E_p\left(\frac{(1-X)^2}{(1-p)^2}\right).$$
The expectation is there because $X$ is a random variable. So, for instance,
$$E_p\left(\frac{X^2}{p^2}\right)=\frac{E_p\left(X^2\right)}{p^2}=\frac{p}{p^2}=\frac{1}{p}.$$
Here I used the fact that $E_p(X^2)=p$, which can easily be seen as
$$E_p(X^2)=0^2\cdot p_X(0)+1^2\cdot p_X(1)=0^2(1-p)+1^2p=p,$$
or from the observation that $X\sim \mathrm{Be}(p)\implies X^n\sim \mathrm{Be}(p)$ as well. Then you can go on with the remaining terms.
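Explicitly, using the same moments $E_p(X)=E_p(X^2)=p$, the remaining terms are
$$E_p\left(\frac{X(1-X)}{p(1-p)}\right)=\frac{E_p(X)-E_p(X^2)}{p(1-p)}=\frac{p-p}{p(1-p)}=0,\qquad
E_p\left(\frac{(1-X)^2}{(1-p)^2}\right)=\frac{1-2p+p}{(1-p)^2}=\frac{1}{1-p},$$
so
$$I_X(p)=\frac{1}{p}-0+\frac{1}{1-p}=\frac{1}{p(1-p)}.$$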
Additionally, an equivalent formula can be proved for $I_X(p)$, provided the second derivative of $\log f$ is well defined:
$$I_X(p)=-E_p\left(\frac{d^2}{dp^2}\log f(X\mid p)\right),$$
and it often yields simpler expressions. In this case, for instance, you get
$$I_X(p)=-E_p\left(\frac{d^2}{dp^2}\log\left(p^X(1-p)^{1-X}\right)\right)
=-E_p\left(-\frac{X}{p^2}-\frac{1-X}{(1-p)^2}\right)
=\frac{E_p(X)}{p^2}+\frac{E_p(1-X)}{(1-p)^2}
=\frac{p}{p^2}+\frac{1-p}{(1-p)^2}
=\frac{1}{p}+\frac{1}{1-p}
=\frac{1}{p(1-p)},$$
as desired.
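Both characterizations are easy to verify symbolically as well; here is a minimal sketch (assuming `sympy` is available) that takes the expectation over $x\in\{0,1\}$ directly:

```python
# Symbolic check that both definitions of the Fisher information
# give 1/(p(1-p)) for a Bernoulli(p) random variable.
import sympy as sp

p, x = sp.symbols('p x', positive=True)
logf = x * sp.log(p) + (1 - x) * sp.log(1 - p)   # log f(x|p)

def E(expr):
    # expectation over x in {0, 1} with P(X = 1) = p
    return sp.simplify(expr.subs(x, 1) * p + expr.subs(x, 0) * (1 - p))

I_score = E(sp.diff(logf, p)**2)     # E[(d/dp log f)^2]
I_curv = -E(sp.diff(logf, p, 2))     # -E[d^2/dp^2 log f]
print(I_score, I_curv)               # both equal 1/(p*(1-p)), up to form
```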