Why is random noise assumed to be normally distributed?
From residuals in linear regression to noise in signal processing, why are these quantities assumed to be normally distributed? By modelling them as normally distributed we seem to be imposing a pattern on the noise, yet noise is supposed to be random. This seems contradictory to me: on one side it is random, on the other its distribution is taken to be normal. Shouldn't the noise distribution just be "random"?
I believe there is some gap in my understanding of the concept of a statistical distribution that has led me to this confusion, or I am looking at it the wrong way.
One more example: when one augments data by adding Gaussian noise, it is not expected to change the overall distribution of the data. Why?
Tags: noise, gaussian
Some questions first: are you familiar with the central limit theorem? It helps to understand why many processes in our natural environment are Gaussian distributed. To answer your second question: the distributions convolve, so depending on the distribution of the data, adding noise does change the overall distribution. However, in this context we often consider the data to be the "signal", and we are usually interested in how the noise compares to the signal. In that case the noise is each sample's deviation from where the signal should be, which is just the original noise, so it has the same distribution.
– Dan Boschen
9 hours ago
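As a minimal numerical sketch of the "distributions convolve" point above (my own illustration, not part of the original comment; assumes NumPy is available): adding independent Gaussian noise to bimodal data visibly smears the original density, which is exactly convolution with the noise density.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bimodal "data": two tight clusters at -1 and +1.
data = np.concatenate([rng.normal(-1.0, 0.1, 5000),
                       rng.normal(+1.0, 0.1, 5000)])

# Augment by adding independent zero-mean Gaussian noise.
augmented = data + rng.normal(0.0, 0.5, size=data.size)

# The augmented density is the original density convolved with the
# noise density: the peaks become lower and wider.
hist_orig, _ = np.histogram(data, bins=60, range=(-3, 3), density=True)
hist_aug, _ = np.histogram(augmented, bins=60, range=(-3, 3), density=True)
print("peak of original density :", hist_orig.max())
print("peak of augmented density:", hist_aug.max())
```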
Possible duplicate of Why is Gaussian noise called so?
– MBaz
8 hours ago
I was going to leave an answer along the lines of the physical phenomena, but the question @MBaz linked covers that. I think that, the way this question is posed, it is better to look at "reality" first and then look at the mathematics used to describe it. Check out, for example, the Gaussian as a solution to the diffusion equation; this can help you, conceptually, to see why it applies to so many things in nature.
– A_A
18 mins ago
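For reference (my addition, stating the standard result the comment points to), the Gaussian is the fundamental solution of the diffusion (heat) equation:

$$\frac{\partial u}{\partial t} = D\,\frac{\partial^2 u}{\partial x^2},
\qquad
u(x,t) = \frac{1}{\sqrt{4\pi D t}}\exp\!\left(-\frac{x^2}{4 D t}\right),$$

so an initial point concentration spreads into a Gaussian whose variance grows as $2Dt$.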
asked 9 hours ago by zeal
3 Answers
Answer by Stanley Pawlukiewicz (answered 9 hours ago; edited 6 hours ago by robert bristow-johnson):
The place to look is the weak and strong laws of large numbers, which underlie the central limit theorem. The theorem states that if you add a large number of independent random variables, with some mild conditions on their variances, the sum becomes indistinguishable from a Normal distribution.
A Normal distribution also has the maximum entropy of all distributions with bounded variance.
The Normal distribution is key in linear estimation, but it should be noted that it isn't the only distribution considered in signal processing, even if it may seem so to a newcomer.
The Normal is often a good model. Many physical noise mechanisms are Normally distributed, and it tends to admit closed-form solutions.
One also encounters situations where the Normal assumption works despite not being a fully accurate assumption.
I don't understand your last statement. The data has its own distribution, and adding Normal noise doesn't change that underlying distribution; the distribution of signal plus noise reflects both.
There are also "refinements" or corrections to Normal distributions, such as the Gram-Charlier series.
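A minimal numerical sketch of the central limit theorem at work (my own addition, not part of the original answer; assumes NumPy and SciPy): standardized sums of independent uniform variables quickly become hard to distinguish from a Normal distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Sum n independent Uniform(-0.5, 0.5) variables, many trials each.
for n in (1, 2, 30):
    sums = rng.uniform(-0.5, 0.5, size=(100_000, n)).sum(axis=1)
    # Standardize and measure distance from N(0, 1); larger n -> closer to Normal.
    z = (sums - sums.mean()) / sums.std()
    ks = stats.kstest(z, "norm").statistic
    print(f"n = {n:2d}: KS distance from Normal = {ks:.3f}")
```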
I think his last statement is about the classical binary modulation distribution: the distribution is of course changed, but becomes two Gaussian curves, one centered at a mean of $+\sqrt{E}$ and the other at $-\sqrt{E}$, with the same distribution around each mean.
– Dan Boschen
9 hours ago
Answer by Olli Niemitalo (answered 32 mins ago; edited 13 mins ago):
I'll try to clear up one possible source of confusion. If picking each sample value from a single distribution feels "not random enough", then let's try to make things "more random" by adding another layer of randomness. This will turn out to be futile.
Imagine that for each sample the noise is random in the sense that it comes from a distribution that is randomly selected for that sample from a list of possible distributions, each with their own probability of occurrence and a list of probabilities for the possible sample values. Keeping it simple with just three distributions and four possible sample values:
$$\begin{array}{lcccc}
& \rlap{\text{Sample value and its prob-}}\\
\text{Probability} & \rlap{\text{ability in the distribution}}\\
\text{of distribution} & -2 & -1 & 0 & 1\\
\hline
\color{blue}{0.3} & 0.4 & 0.2 & 0.3 & 0.1\\
\color{blue}{0.2} & 0.5 & 0.1 & 0.2 & 0.2\\
\color{blue}{0.5} & 0.1 & 0.4 & 0.4 & 0.1
\end{array}$$
Here we actually have a distribution of distributions. But there is still a single distribution that says everything about the probabilities of the values for that sample:
$$\begin{array}{cccc}
\rlap{\text{Sample value and}}\\
\rlap{\text{its total probability}}\\
-2 & -1 & 0 & 1\\
\hline
0.27 & 0.28 & 0.33 & 0.12
\end{array}$$
The total probabilities were obtained by summing, over the possible distributions, the conditional probabilities of the sample values weighted by the probabilities of the distributions:
$$0.4\times\color{blue}{0.3} + 0.5\times\color{blue}{0.2} + 0.1\times\color{blue}{0.5} = 0.27\\
0.2\times\color{blue}{0.3} + 0.1\times\color{blue}{0.2} + 0.4\times\color{blue}{0.5} = 0.28\\
0.3\times\color{blue}{0.3} + 0.2\times\color{blue}{0.2} + 0.4\times\color{blue}{0.5} = 0.33\\
0.1\times\color{blue}{0.3} + 0.2\times\color{blue}{0.2} + 0.1\times\color{blue}{0.5} = 0.12$$
The laws of probability that were applied:
$$P(A_i\cap B_j) = P(A_i\mid B_j)\,\color{blue}{P(B_j)}\quad\text{(conditional probability)}$$
$$P(A_i) = \sum_j P(A_i\cap B_j)\quad\text{(total probability)}$$
where $A_i$ are the events of the $i\text{th}$ sample value occurring, and $B_j$ are mutually exclusive and exhaustive events of choosing the $j\text{th}$ distribution.
With continuous distributions, similar things would take place, because those can be modeled as discrete distributions in the limit that the number of possible events approaches infinity.
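A small sketch of the same bookkeeping (my own addition, assuming NumPy is available): sampling with the extra layer of randomness (first pick a distribution, then pick a value from it) reproduces the single marginal distribution computed by the law of total probability.

```python
import numpy as np

rng = np.random.default_rng(0)

values = np.array([-2, -1, 0, 1])
p_dist = np.array([0.3, 0.2, 0.5])                  # P(B_j): which distribution is chosen
p_val_given_dist = np.array([[0.4, 0.2, 0.3, 0.1],  # P(A_i | B_j), one row per distribution
                             [0.5, 0.1, 0.2, 0.2],
                             [0.1, 0.4, 0.4, 0.1]])

# Law of total probability: P(A_i) = sum_j P(A_i | B_j) P(B_j)
p_total = p_dist @ p_val_given_dist
print("total probabilities :", p_total)             # [0.27 0.28 0.33 0.12]

# Two-layer sampling: pick a distribution for every sample, then a value from it.
n = 200_000
dist_idx = rng.choice(3, size=n, p=p_dist)
samples = np.empty(n)
for j in range(3):
    mask = dist_idx == j
    samples[mask] = rng.choice(values, size=mask.sum(), p=p_val_given_dist[j])

freqs = np.array([(samples == v).mean() for v in values])
print("empirical frequencies:", np.round(freqs, 3))  # close to the totals above
```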
Answer by robert bristow-johnson (answered 6 hours ago):
A normal distribution (I like to call it "Gaussian") remains normal under addition of normally distributed numbers, so if a Gaussian process goes into an LTI filter, a Gaussian process comes out. But because of the central limit theorem, even if a uniform-p.d.f. random process goes into an LTI filter with a long and dense impulse response, what comes out tends to be normally distributed. So an LTI system really only changes some parameters of the signal, such as its power spectrum or autocorrelation; an LTI filter can turn a uniform-p.d.f. white random process into Gaussian-p.d.f. pink noise.
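A quick sketch of that claim (my own illustration, not from the original answer; assumes NumPy and SciPy, and a lowpass FIR is used here only as an example of a long, dense impulse response): uniform white noise pushed through such a filter comes out with nearly zero excess kurtosis, i.e. close to Gaussian, even though the input is far from Gaussian.

```python
import numpy as np
from scipy import signal, stats

rng = np.random.default_rng(0)

# Uniform white noise: excess kurtosis -1.2, clearly non-Gaussian.
x = rng.uniform(-1.0, 1.0, 1_000_000)

# A long, dense FIR impulse response (513-tap lowpass as an example).
h = signal.firwin(513, cutoff=0.05)
y = signal.lfilter(h, 1.0, x)

# Excess kurtosis: -1.2 for uniform, 0 for Gaussian.
print("input excess kurtosis :", round(stats.kurtosis(x), 3))
print("output excess kurtosis:", round(stats.kurtosis(y), 3))
```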