Uncorrelatedness + Joint Normality = Independence. Why? Intuition and mechanics
Two variables that are uncorrelated are not necessarily independent, as is simply exemplified by the fact that $X$ and $X^2$ are uncorrelated but not independent (for $X$ symmetric about zero, e.g. standard normal). However, two variables that are uncorrelated AND jointly normally distributed are guaranteed to be independent. Can someone explain intuitively why this is true? What exactly does joint normality of two variables add to the knowledge of zero correlation that lets us conclude these two variables MUST be independent?
Tags: correlation, normal-distribution, independence, joint-distribution
asked 3 hours ago by ColorStatistics, edited 2 hours ago by Michael Hardy
2 Answers
Accepted answer (score 4), answered 2 hours ago by a_statistician, edited 1 hour ago by Michael Hardy
The joint probability density function (pdf) of the bivariate normal distribution is
$$f(x_1,x_2)=\frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\exp\left[-\frac{z}{2(1-\rho^2)}\right],$$
where
$$z=\frac{(x_1-\mu_1)^2}{\sigma_1^2}-\frac{2\rho(x_1-\mu_1)(x_2-\mu_2)}{\sigma_1\sigma_2}+\frac{(x_2-\mu_2)^2}{\sigma_2^2}.$$
When $\rho = 0$,
$$\begin{align}f(x_1,x_2) &=\frac{1}{2\pi\sigma_1\sigma_2}\exp\left[-\frac12\left\{\frac{(x_1-\mu_1)^2}{\sigma_1^2}+\frac{(x_2-\mu_2)^2}{\sigma_2^2}\right\}\right]\\
&= \frac{1}{\sqrt{2\pi}\,\sigma_1}\exp\left[-\frac12\frac{(x_1-\mu_1)^2}{\sigma_1^2}\right]\cdot\frac{1}{\sqrt{2\pi}\,\sigma_2}\exp\left[-\frac12\frac{(x_2-\mu_2)^2}{\sigma_2^2}\right]\\
&= f(x_1)\,f(x_2).\end{align}$$
The joint density factors into the product of the marginal densities, so the variables are independent.
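The factorization can be checked numerically. This sketch codes the pdf formula above directly (the parameter values are arbitrary choices for illustration) and confirms that with $\rho = 0$ the joint density equals the product of the two marginal normal densities:

```python
import math

def bvn_pdf(x1, x2, mu1, mu2, s1, s2, rho):
    # Bivariate normal density, taken directly from the formula in the answer
    z = ((x1 - mu1)**2 / s1**2
         - 2 * rho * (x1 - mu1) * (x2 - mu2) / (s1 * s2)
         + (x2 - mu2)**2 / s2**2)
    return math.exp(-z / (2 * (1 - rho**2))) / (2 * math.pi * s1 * s2 * math.sqrt(1 - rho**2))

def norm_pdf(x, mu, s):
    # Univariate normal density N(mu, s^2)
    return math.exp(-0.5 * ((x - mu) / s)**2) / (math.sqrt(2 * math.pi) * s)

# With rho = 0 the joint density factors into the two marginals
lhs = bvn_pdf(0.7, -1.3, 1.0, -2.0, 0.5, 3.0, rho=0.0)
rhs = norm_pdf(0.7, 1.0, 0.5) * norm_pdf(-1.3, -2.0, 3.0)
print(abs(lhs - rhs) < 1e-12)
```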
I was two lines slower than you! (+1)
– jbowman
2 hours ago
Thank you all. Elegant proof. It's clear now. It seems to me that given the flow of the proof, I should have asked what knowledge of zero correlation adds to knowledge of joint normality and not the other way around.
– ColorStatistics
1 hour ago
Answer (score 0), answered 1 hour ago by Michael Hardy
Joint normality of two random variables $X, Y$ can be characterized in either of two simple ways:
For every pair $a, b$ of (non-random) real numbers, $aX + bY$ has a univariate normal distribution.
There are random variables $Z_1, Z_2 \sim \text{i.i.d. } N(0,1)$ and real numbers $a, b, c, d$ such that $$\begin{align} X &= aZ_1 + bZ_2 \\ \text{and } Y &= cZ_1 + dZ_2. \end{align}$$
That the first of these follows from the second is easy to show. That the second follows from the first takes more work, and maybe I'll post on it soon . . .
If the second one is true, then $\operatorname{cov}(X,Y) = ac + bd.$
If this covariance is $0,$ then the vectors $(a,b)$ and $(c,d)$ are orthogonal to each other. Then $X$ is a scalar multiple of the orthogonal projection of $(Z_1,Z_2)$ onto $(a,b)$, and $Y$ is a scalar multiple of the orthogonal projection of $(Z_1,Z_2)$ onto $(c,d).$
Now conjoin the fact of orthogonality with the circular symmetry of the joint density of $(Z_1,Z_2)$ to see that the distribution of $(X,Y)$ is the same as the distribution of two random variables, one of which is a scalar multiple of the orthogonal projection of $(Z_1,Z_2)$ onto the $x$-axis, i.e. a scalar multiple of $Z_1,$ and the other similarly a scalar multiple of $Z_2.$ Those two are independent, so $X$ and $Y$ are independent.
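The construction above can be simulated. In this sketch the coefficient vectors $(a,b)$ and $(c,d)$ are hypothetical values chosen to be orthogonal, so $\operatorname{cov}(X,Y) = ac + bd = 0$; the simulation also checks independence informally by comparing $P(Y>0)$ with $P(Y>0 \mid X>0)$:

```python
import random

random.seed(0)
n = 200_000

# Orthogonal coefficient vectors: (a, b) . (c, d) = 0, so cov(X, Y) = ac + bd = 0
a, b = 3.0, 4.0
c, d = -4.0, 3.0

xy = []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xy.append((a * z1 + b * z2, c * z1 + d * z2))

cov = sum(x * y for x, y in xy) / n           # E[X] = E[Y] = 0, so this estimates cov(X, Y)
p_y = sum(y > 0 for _, y in xy) / n           # P(Y > 0)
p_y_given_x = (sum(x > 0 and y > 0 for x, y in xy)
               / sum(x > 0 for x, _ in xy))   # P(Y > 0 | X > 0)

# Covariance matches ac + bd = 0, and conditioning on X does not change Y's distribution
print(abs(cov) < 0.5, abs(p_y_given_x - p_y) < 0.01)
```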