Are opinion polls always representative of the extreme opinions?
Opinion polls involve a choice: the people who are asked to take the poll can say yes or no.
I suspect that this leads to so much self-selection bias that I am starting to think most opinion polls are by and large misleading, representing the views of the extremes rather than the average. Am I correct in this position?
Here's what I mean. If you call a person and ask for five minutes of their time to do a poll, most likely they'll say no thank you. Most people just aren't interested in doing polls, right?
So if a person actually says yes, that suggests this person is unusual in that they actually want to do the poll. Why would that be? There are many possible reasons (lonely elderly, perhaps?), but certainly one possible reason is that this person happens to hold an extreme opinion on the topic they are about to be polled on, and therefore has an increased interest in making that opinion known. The average person, holding the average opinion, will therefore be underrepresented in such polls, while the extreme person will be eager to express their extreme opinion.
For example, if I received a call today from a pollster asking a few questions on "LGBT", I'd pass. I have no strong opinions on the topic: I am not LGBT myself, nor do I hate LGBT people. It's just not relevant to my life, so I have little to say. However, if I happened to be LGBT myself, or if I happened to strongly dislike the LGBT community, I might very well be interested in taking such a poll. Hence the poll becomes skewed toward more extreme opinions.
polling
23
This is an interesting question. But the actual problem gets much deeper. There are a lot more problems than just sampling bias which can make political opinion polls unreliable. Leading questions, leading answer options, context. This clip from the UK TV show Yes Prime Minister is satire, but very close to reality.
– Philipp♦
Aug 23 at 11:06
16
I've heard more about the opposite problem. Some extremist views may be socially unacceptable, so they will be underrepresented in polls. For instance, if a poll asked whether all the Jews should be sent to gas chambers, the number of people wanting to do so would be underrepresented, because there is (rightfully) a social stigma attached to holding such views, so they may decline to answer or say they are undecided instead.
– liftarn
Aug 23 at 11:38
3
You could even generalize this for all political actions. If I have no strong opinion on LGBT rights, I probably won't agitate for or against them. The discourse is then only driven by the extreme poles, decision-making is done by the opinion in power...
– Cliff
Aug 23 at 13:34
1
@AzorAhai - though I suspect your question was of a more rhetorical nature... no, most opinion polls are not primarily binary in their inquiries.
– PoloHoleSet
Aug 23 at 20:18
1
@PoloHoleSet I'm not a pollster or politics geek so I didn't know what was standard, but Nemsia seemed to make a pretty definitive statement
– Azor Ahai
Aug 23 at 20:18
asked Aug 23 at 10:40
Nemsia
4 Answers
That survey/poll participants are not all equally likely to answer is a well-known problem called participation bias, also known as non-response bias. A typical example:
A study of nonrespondents from the National AIDS Behavioral Survey (NABS) was conducted in 1990 to attempt to identify potential differences in participants and non-participants that may influence estimates of sexual risk behavior. [...] Results indicate that refusers are older, attend church more often, are less likely to believe in the confidentiality of surveys, and have lower sexual self disclosure.
The effect probably depends on the topic of the poll. One (highly cited) meta-analysis has among its conclusions:
Large nonresponse biases can happen in surveys. High response rates can reduce the risks of bias. They do this less when the causes of participation are highly correlated with the survey variables. Indeed, in the studies we assembled, some surveys with low nonresponse rates have estimates with high relative nonresponse bias.
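The mechanism behind participation bias can be sketched with a toy simulation (all numbers here are hypothetical, not taken from the cited studies): if willingness to respond increases with how strongly someone feels about the topic, the polled mean drifts away from the population mean.

```python
import random

random.seed(42)

# Hypothetical population: opinion scores from roughly -1 (strongly
# against) to +1 (strongly in favour), centred slightly above neutral.
population = [random.gauss(0.1, 0.5) for _ in range(100_000)]

def respond_probability(opinion):
    # Assumption for illustration: the stronger the opinion, the more
    # willing the person is to take the poll.
    return min(1.0, 0.05 + 0.4 * abs(opinion))

# Self-selected sample: each person decides whether to respond.
sample = [x for x in population if random.random() < respond_probability(x)]

true_mean = sum(population) / len(population)
poll_mean = sum(sample) / len(sample)
print(f"population mean: {true_mean:+.3f}")
print(f"polled mean:     {poll_mean:+.3f}  (from {len(sample)} respondents)")
```

Because responding is tied to |opinion|, both tails are over-represented; since this toy population leans slightly positive, the positive tail is the larger one and the polled mean comes out noticeably above the population mean.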
The interviewer effect is related: some people may refuse to talk to particular interviewers (e.g. depending on the race or age of either party), or may simply dislike talking to people in general, creating a specific non-response bias. E.g.:
An analysis of the 2004 and 2008 [US election] phone surveys and exit polls reveals differing patterns of item non-response across the two interview modes.
So, to answer your title question "Are opinion polls always representative of the extreme opinions?" (my emphasis): I think the answer is clearly no. Non-response bias isn't necessarily equally likely to favor both extremes (and to exclude the "middle"). Depending on the study design (topic, participants, mode of survey), it can favor one extreme, both, or the "middle". In fact, detecting the pattern of the non-response bias for a given study is a non-trivial problem. Quoting again from the meta-analysis:
We cannot rely on full or partial canceling of nonresponse biases when we subtract one subclass mean from another. The bias of the difference is a function of differences of response rates and covariances between response propensities of the subgroups and the survey variable.
References
* Heather A. Turner (1999). Participation bias in AIDS-related telephone surveys: Results from the National AIDS Behavioral Survey (NABS) non-response study. The Journal of Sex Research, 36:1, 52-58. DOI: 10.1080/00224499909551967
* Robert M. Groves, Emilia Peytcheva (2008). The Impact of Nonresponse Rates on Nonresponse Bias: A Meta-Analysis. Public Opinion Quarterly, Volume 72, Issue 2, Pages 167–189. DOI: 10.1093/poq/nfn011
* Michael P. McDonald, Matthew P. Thornburg (2012). Interview Mode Effects: The Case of Exit Polls and Early Voting. Public Opinion Quarterly, Volume 76, Issue 2, Pages 326–349. DOI: 10.1093/poq/nfs025
1
+1. It might be helpful to point out that just because there are non-respondents doesn't mean there is non-response bias. And reducing non-respondents isn't particularly helpful in reducing non-response bias. Cit.: academic.oup.com/poq/article/70/5/646/4084443
– indigochild
Aug 23 at 22:47
1
@indigochild That article doesn't say "reducing non-respondents isn't particularly helpful in reducing non-response bias". It says that reducing the non-response rate doesn't necessarily reduce non-response bias, which is a very different statement. Specifically, if you reduce the non-response rate in a way that is itself biased, you may end up increasing non-response bias.
– De Novo
Aug 23 at 23:25
One common response to non-response bias is for pollsters to re-weight responses so that the adjusted sample matches the true demographics of the target population (e.g. voters or citizens). This helps, but can still be problematic if the respondents from a demographic with a high non-response rate are atypical of that demographic, or if the target population's demographics are in flux.
– ohwilleke
Aug 27 at 2:34
Defining an opinion poll as "an assessment of public opinion obtained by questioning a representative sample," the answer to your question is no. Opinion polls are not always overly representative of extreme opinions.
If you were to poll 20 people in your company (selected randomly) and they all respond, your poll would not be overly representative of extreme opinions, on average.
The problem arises when using voluntary-response survey methodology and is exacerbated by a low response rate. The problem of non-representative samples is not unique to opinion polls; it affects surveys in general.
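Even a fully responding random sample still carries sampling error, though: a quick sketch of the usual 95% margin of error for an estimated proportion (my own illustration, not part of the original answer) shows why 20 respondents say little even without any bias.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an estimated proportion p from a simple
    random sample of size n (assumes no non-response bias)."""
    return z * math.sqrt(p * (1 - p) / n)

# The margin shrinks with the square root of the sample size.
print(f"n=20:   ±{margin_of_error(0.5, 20):.1%}")
print(f"n=1000: ±{margin_of_error(0.5, 1000):.1%}")
```

With n=20 the margin is on the order of ±20 percentage points, versus roughly ±3 points for a typical n=1000 national poll.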
3
This minimal jargon-free answer is the most helpful because it zeros in on response rate being key.
– agc
Aug 23 at 17:44
Regarding "If you were to poll 20 people in your company (selected randomly) and they all respond, your poll would not be overly representative of extreme opinions, on average": that would depend very much on the company or organisation. If I poll 20 people at Blackwater I may get very different views than if I poll 20 people at the International Peace Research Association. Certainly if my questions are related to war and peace, but probably also if they aren't.
– gerrit
Aug 24 at 9:40
@gerrit My intention in that toy example is to get a sense of opinions within the company only (e.g. do you think the CEO is doing a good job? Do you want pizza Fridays?).
– Underminer
Aug 24 at 13:09
@gerrit It is implicit in the discussion of any sampling that, at best, the sample will be representative of the population sampled. Claims generalizing outward from the sampled population must be worded very carefully and very weakly, because you are then claiming your population is itself a representative sample of some even wider population. It is not usually assumed that someone is making such a claim unless they explicitly say so. So Underminer here would be assumed (as they confirm) to be talking about representing that company's employees and nothing wider.
– KRyan
Aug 24 at 14:14
(That said, misuse and abuse of statistics frequently takes the form of improperly generalizing from one population to a wider one, or to a separate one altogether, and there is substantial room for improvement in the general public's understanding of sampling and the problems that such abuses cause.)
– KRyan
Aug 24 at 14:15
This is a known problem with polls. Along with sampling error (they accidentally picked an unrepresentative group of people to call), polls may be skewed by people refusing to participate.
Pollsters attempt to control for both effects by comparing against more reliable data. For example, in a political poll they may compare the demographics of the respondents to those of the region as a whole. In particular, they often try to control for political identification, that is, party registration. If their poll has too many Democrats relative to Republicans, they may decide to call more people in search of more Republicans. But of course that has a problem too.
What if the Republicans who answer the poll are unrepresentative of Republicans overall? For example, there is a group of Republicans called "Never Trump" Republicans. What if they answer polls more often than pro-Trump Republicans? This might introduce more skew.
Most pollsters use other demographic data. They'll often ask age for example. And they may also use other questions in the poll as controls. For example, they may ask someone's opinion on Donald Trump or how the person voted in the last election (or both).
Some pollsters do reference polls with the same people every time. These give baselines for how people should answer certain questions. They or other pollsters can then do comparisons of other polls to the baseline polls to look for skew. Or compare the baseline polls to actual data. For example, if you're polling an election, the election itself can be a base line to which the polls can be compared. Since who voted is generally public information, you can compare the actual demographics to the polled demographics.
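The demographic controls described above are often implemented as post-stratification weighting rather than extra calling; a minimal sketch with invented numbers (the party shares and approval counts below are hypothetical):

```python
# Hypothetical raw poll: Democrats over-represented relative to the region.
poll = {
    "Democrat":   {"respondents": 600, "approve": 420},  # 70% approval
    "Republican": {"respondents": 400, "approve": 120},  # 30% approval
}
region_share = {"Democrat": 0.48, "Republican": 0.52}  # true population mix

# Unweighted estimate: simply pool every response.
total = sum(g["respondents"] for g in poll.values())
unweighted = sum(g["approve"] for g in poll.values()) / total

# Weighted estimate: re-scale each group's approval rate by its true share.
weighted = sum(
    region_share[party] * g["approve"] / g["respondents"]
    for party, g in poll.items()
)
print(f"unweighted: {unweighted:.1%}")
print(f"weighted:   {weighted:.1%}")
```

Here the raw pooled number (54.0%) drops to 49.2% once each party's approval rate is weighted by its actual share of the region, illustrating how weighting corrects for over-sampled groups while still depending on the respondents within each group being typical of it.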
1
Typically the polls don't call more people of a different demographic, but rather control for the difference by weighting the response statistics with the region's proportional statistics. So if more Democrats responded to the poll, but the sample region has more Republicans, the responses are weighted to reflect the difference.
– hszmv
Aug 23 at 14:00
2
Maybe I'm misinterpreting what you mean by "accidentally", but if a measurement from a random sample produces a statistic that differs from the population, that difference is due to random error, not bias. Bias is a systematic error, not an error due to chance. Sampling bias is when certain individuals are less likely to be included in a sample because of the sampling method.
– De Novo
Aug 23 at 21:04
-1 for the reason De Novo said. Error is not bias.
– indigochild
Aug 24 at 18:57
The TV series Yes Prime Minister (S01E02 "The Ministerial Broadcast") explains all about this:
Sir Humphrey Appleby: [demonstrating how public surveys can reach opposite conclusions] Mr. Woolley, are you worried about the rise in crime among teenagers?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do you think there is lack of discipline and vigorous training in our Comprehensive Schools?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do you think young people welcome some structure and leadership in their lives?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do they respond to a challenge?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Might you be in favour of reintroducing National Service?
Bernard Woolley: Er, I might be.
Sir Humphrey Appleby: Yes or no?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Of course, after all you've said you can't say no to that. On the other hand, the surveys can reach opposite conclusions.
[survey two]
Sir Humphrey Appleby: Mr. Woolley, are you worried about the danger of war?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Are you unhappy about the growth of armaments?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do you think there's a danger in giving young people guns and teaching them how to kill?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do you think it's wrong to force people to take arms against their will?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Would you oppose the reintroduction of conscription?
Bernard Woolley: Yes.
[does a double-take]
Sir Humphrey Appleby: There you are, Bernard. The perfectly balanced sample.
More often than not, opinion polls support the views, desires or prejudices of the people who funded them. This may explain to some extent the differences between the polls and the actual results for Trump vs. Clinton and Brexit, for example.
The episode you refer to is about different phrasings of questions to get a specific outcome. If you phrase questions right (e.g. who will you vote for, will you vote yes or no on the referendum) and make the questions public with the results (so they can be scrutinised) that problem shouldn't exist (or be known by those reading the results).
– JJJ
Aug 23 at 14:40
1
Yes. But as Sir Humphrey said, there must be some honest pollsters around, he just hasn't happened to meet any.
– Jonathan Rosenne
Aug 23 at 14:43
Not sure what you mean by honest. I think the series only mentions using specific wording to get a certain outcome. Are you suggesting pollsters make up their own results? If that were the case and it came out, that pollster would be done (nobody would hire them, as their results would be useless).
– JJJ
Aug 23 at 14:51
1
youtube.com/watch?v=PKiTNC96yvs
– Jonathan Rosenne
Aug 23 at 17:49
1
@JJJ, if you think that a biased or fraudulent pollster has no use and would therefore never be hired, you are clearly unfamiliar with American media.
– Wildcard
Aug 24 at 2:35
The survey/poll participants not being all equally likely to answer is a well-known problem called participation bias aka non-response bias. A typical example:
A study of nonrespondents from the National AIDS Behavioral Survey (NABS) was conducted in 1990 to attempt to identify potential differences in participants and nonâÂÂparticipants that may influence estimates of sexual risk behavior. [...] Results indicate that refusers are older, attend church more often, are less likely to believe in the confidentiality of surveys, and have lower sexual self disclosure.
The effect probably depends on the topic of the poll. One (highly cited) meta-analysis has among its conclusions:
Large nonresponse biases can happen in surveys.
High response rates can reduce the risks of bias. They do this less when
the causes of participation are highly correlated with the survey variables.
Indeed, in the studies we assembled, some surveys with low nonresponse
rates have estimates with high relative nonresponse bias.
The interviewer effect is related in that some people may refuse to talk to some interviewers e.g. based on the race or age of both, or just not like to talk to people in general, creating a specific non-response bias. E.g.:
An analysis of the 2004 and 2008 [US election] phone surveys and exit polls reveals differing patterns of item non-response across the two interview modes.
So to answer your title question "Are opinion polls always representative of the extreme opinions?" (My emphasis.) I think the answer is clearly no. The non-response bias isn't necessarily equally likely to favor both extremes (and to exclude the "middle"). Depending on the study design (topic, participants, mode of survey), it can favor one extreme, both or the "middle". In fact, detecting the pattern of the non-response bias (for a given study) is a non-trivial problem. Quoting again from the meta-analysis:
We cannot rely on full or partial canceling of nonresponse biases when
we subtract one subclass mean from another. The bias of the difference is a
function of differences of response rates and covariances between
response propensities of the subgroups and the survey variable.
References
* Heather A. Turner (1999) Participation bias in AIDSâÂÂrelated telephone surveys: Results from the national AIDS behavioral survey (NABS) nonâÂÂresponse study, The Journal of Sex Research, 36:1, 52-58, DOI: 10.1080/00224499909551967
* Robert M. Groves, Emilia Peytcheva; The Impact of Nonresponse Rates on Nonresponse Bias: A Meta-Analysis, Public Opinion Quarterly, Volume 72, Issue 2, 1 January 2008, Pages 167âÂÂ189, DOI: 10.1093/poq/nfn011
* Michael P. McDonald, Matthew P. Thornburg; Interview Mode Effects: The Case of Exit Polls and Early Voting, Public Opinion Quarterly, Volume 76, Issue 2, 1 July 2012, Pages 326âÂÂ349, DOI: 10.1093/poq/nfs025
1
+1. It might be helpful to point out that just because there are non-respondents doesn't mean there is non-response bias. And reducing non-respondents isn't particularly helpful in reducing non-response bias. Cit.: academic.oup.com/poq/article/70/5/646/4084443
â indigochild
Aug 23 at 22:47
1
@indigochild that article doesn't sayreducing non-respondents isn't particularly helpful in reducing non-response bias
. It says reducing non-response rate doesn't necessarily reduce non-response bias. This is a very different statement. Specifically, if you reduce the non-response rate in a way that is biased, you may end up increasing non-response bias.
â De Novo
Aug 23 at 23:25
One common response to non-response bias is for pollsters to re-weigh different responses so that the adjusted sample represents the true demographics of the target population (e.g. voters or citizens). This helps, but can skill be problematic if the respondent from a demographic with a high non-response rate is atypical of responders in that demographic, or if the target population's demographics are in flux.
â ohwilleke
Aug 27 at 2:34
add a comment |Â
up vote
27
down vote
The survey/poll participants not being all equally likely to answer is a well-known problem called participation bias aka non-response bias. A typical example:
A study of nonrespondents from the National AIDS Behavioral Survey (NABS) was conducted in 1990 to attempt to identify potential differences in participants and nonâÂÂparticipants that may influence estimates of sexual risk behavior. [...] Results indicate that refusers are older, attend church more often, are less likely to believe in the confidentiality of surveys, and have lower sexual self disclosure.
The effect probably depends on the topic of the poll. One (highly cited) meta-analysis has among its conclusions:
Large nonresponse biases can happen in surveys.
High response rates can reduce the risks of bias. They do this less when
the causes of participation are highly correlated with the survey variables.
Indeed, in the studies we assembled, some surveys with low nonresponse
rates have estimates with high relative nonresponse bias.
The interviewer effect is related in that some people may refuse to talk to some interviewers e.g. based on the race or age of both, or just not like to talk to people in general, creating a specific non-response bias. E.g.:
An analysis of the 2004 and 2008 [US election] phone surveys and exit polls reveals differing patterns of item non-response across the two interview modes.
So to answer your title question "Are opinion polls always representative of the extreme opinions?" (My emphasis.) I think the answer is clearly no. The non-response bias isn't necessarily equally likely to favor both extremes (and to exclude the "middle"). Depending on the study design (topic, participants, mode of survey), it can favor one extreme, both or the "middle". In fact, detecting the pattern of the non-response bias (for a given study) is a non-trivial problem. Quoting again from the meta-analysis:
We cannot rely on full or partial canceling of nonresponse biases when
we subtract one subclass mean from another. The bias of the difference is a
function of differences of response rates and covariances between
response propensities of the subgroups and the survey variable.
References
* Heather A. Turner (1999). Participation bias in AIDS-related telephone surveys: Results from the National AIDS Behavioral Survey (NABS) non-response study. The Journal of Sex Research, 36:1, 52-58. DOI: 10.1080/00224499909551967
* Robert M. Groves, Emilia Peytcheva (2008). The Impact of Nonresponse Rates on Nonresponse Bias: A Meta-Analysis. Public Opinion Quarterly, 72(2), 167-189. DOI: 10.1093/poq/nfn011
* Michael P. McDonald, Matthew P. Thornburg (2012). Interview Mode Effects: The Case of Exit Polls and Early Voting. Public Opinion Quarterly, 76(2), 326-349. DOI: 10.1093/poq/nfs025
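To make the "favors both extremes" pattern concrete, here is a small simulation sketch. The numbers and the propensity model are invented for illustration, not taken from the studies above: people whose opinion is far from the middle are assumed to answer more often, which leaves the estimated mean roughly intact but inflates the apparent spread of opinion.

```python
import random
import statistics

random.seed(0)

# Hypothetical population: opinion on a 0-10 scale, centred at 5.
population = [random.gauss(5, 2) for _ in range(100_000)]

def responds(opinion):
    # Assumed propensity model: people with extreme opinions
    # are more likely to agree to take the poll.
    p = 0.2 + 0.08 * abs(opinion - 5)
    return random.random() < min(p, 1.0)

# The "poll" consists only of those who chose to respond.
sample = [x for x in population if responds(x)]

print(f"population mean/std: {statistics.mean(population):.2f} / {statistics.pstdev(population):.2f}")
print(f"sample mean/std:     {statistics.mean(sample):.2f} / {statistics.pstdev(sample):.2f}")
```

With this symmetric propensity the sample mean stays near 5, but the sample standard deviation is noticeably larger than the population's: both extremes are over-represented at once. An asymmetric propensity model would instead shift the mean, i.e. favor one extreme, matching the point that the bias pattern depends on the design.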
1
+1. It might be helpful to point out that just because there are non-respondents doesn't mean there is non-response bias. And reducing non-respondents isn't particularly helpful in reducing non-response bias. Cit.: academic.oup.com/poq/article/70/5/646/4084443
â indigochild
Aug 23 at 22:47
1
@indigochild that article doesn't say "reducing non-respondents isn't particularly helpful in reducing non-response bias". It says reducing the non-response rate doesn't necessarily reduce non-response bias. This is a very different statement. Specifically, if you reduce the non-response rate in a way that is biased, you may end up increasing non-response bias.
â De Novo
Aug 23 at 23:25
edited Aug 23 at 13:05
answered Aug 23 at 12:29
Fizz
up vote
10
down vote
Defining an opinion poll as "an assessment of public opinion obtained by questioning a representative sample," the answer to your question is no. Opinion polls are not always overly representative of extreme opinions.
If you were to poll 20 people in your company (selected randomly) and they all respond, your poll would not be overly representative of extreme opinions, on average.
The problem arises when using voluntary response survey methodology and is exacerbated by a low response rate. Non-representative samples are not unique to opinion polls; they affect polls in general.
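The contrast between a random sample with full response and a voluntary-response poll can be sketched numerically. All figures here are invented for illustration, including the assumed response rates:

```python
import random
import statistics

random.seed(1)

# Hypothetical company of 1000 employees; 60% support some proposal.
company = [1] * 600 + [0] * 400

# Random sample of 20 with full response: unbiased on average
# (average the estimate over many repetitions to see this).
estimates = [sum(random.sample(company, 20)) / 20 for _ in range(2000)]
random_sample_mean = statistics.mean(estimates)

# Voluntary response: assume supporters answer 90% of the time,
# opponents only 30% of the time.
respondents = [x for x in company
               if random.random() < (0.9 if x == 1 else 0.3)]
voluntary_estimate = sum(respondents) / len(respondents)

print(f"true support:                    0.600")
print(f"random-sample estimate (avg):    {random_sample_mean:.3f}")
print(f"voluntary-response estimate:     {voluntary_estimate:.3f}")
```

The random sample is unbiased regardless of topic, while the voluntary poll overstates support simply because supporters were more willing to answer.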
3
This minimal jargon-free answer is the most helpful because it zeros in on response rate being key.
â agc
Aug 23 at 17:44
"If you were to poll 20 people in your company (selected randomly) and they all respond, your poll would not be overly representative of extreme opinions, on average": that would depend very much on the company or organisation. If I poll 20 people of Blackwater I may get very different views than if I poll 20 people of the International Peace Research Association. Certainly if my questions are related to war and peace, but probably also if they aren't.
â gerrit
Aug 24 at 9:40
@gerrit My intention in that toy example is to get a sense of opinions from the company only (e.g. do you think the CEO is doing a good job? Do you want Pizza Friday's?).
â Underminer
Aug 24 at 13:09
@gerrit It is implicit in the discussion of any sampling that, at best, the sample will be representative of the population sampled. Claims generalizing outward from the population sampled must be worded very carefully and very weakly, because you are basically claiming your population is, itself, a representative sample of some even-wider population. It is not usually assumed that someone is making such a claim unless they explicitly say so. So Underminer here would be assumed, as they confirm, to be talking about representing that company's employees and nothing wider.
â KRyan
Aug 24 at 14:14
(That said, misuse and abuse of statistics frequently takes the form of improperly generalizing from one population to a wider one, or a separate one altogether, and there is substantial room for improvement in the general public's understanding of sampling and the problems that such abuses cause.)
â KRyan
Aug 24 at 14:15
|
show 1 more comment
answered Aug 23 at 15:11
Underminer
up vote
3
down vote
This is a known problem with polls. Along with sampling error (the randomly drawn sample happens, by chance, to be unrepresentative of the population), polls may be skewed by people refusing to participate.
Pollsters attempt to control for both effects by comparing to more reliable data. For example, if it is a political poll, they may compare the demographics of the respondents to those of the region as a whole. In particular, they often try to control for political identification, that is to say, party registration. If their poll has too many Democrats relative to Republicans, they may decide to call more people looking for more Republicans. But of course that has a problem too.
What if the Republicans who answer the poll are unrepresentative of Republicans overall? For example, there is a group of Republicans called "Never Trump" Republicans. What if they answer polls more often than pro-Trump Republicans? This might introduce more skew.
Most pollsters use other demographic data. They'll often ask age for example. And they may also use other questions in the poll as controls. For example, they may ask someone's opinion on Donald Trump or how the person voted in the last election (or both).
Some pollsters run reference polls with the same people every time. These give baselines for how people should answer certain questions. They or other pollsters can then compare other polls to the baseline polls to look for skew, or compare the baseline polls to actual data. For example, if you're polling an election, the election itself can serve as a baseline to which the polls can be compared. Since who voted is generally public information, you can compare the actual demographics to the polled demographics.
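The party-registration adjustment described above amounts to a small post-stratification computation. The counts and registration shares below are made up for illustration; real pollsters weight on several variables at once:

```python
# Hypothetical raw poll: approval counts by party, with Democrats
# over-represented relative to the region.
poll = {
    "Dem": {"n": 600, "approve": 120},
    "Rep": {"n": 400, "approve": 320},
}
# Assumed region-wide party registration shares (the weighting target).
region_share = {"Dem": 0.45, "Rep": 0.55}

total_n = sum(g["n"] for g in poll.values())

# Unweighted estimate reflects whoever happened to respond.
unweighted = sum(g["approve"] for g in poll.values()) / total_n

# Post-stratified: approval rate within each party, mixed by true shares.
weighted = sum(region_share[p] * poll[p]["approve"] / poll[p]["n"]
               for p in poll)

print(f"unweighted approval: {unweighted:.3f}")  # 0.440
print(f"weighted approval:   {weighted:.3f}")    # 0.530
```

Note that the weighted figure still assumes responding Republicans look like non-responding ones; the "Never Trump" scenario in the answer is exactly the case where that assumption fails.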
1
Typically the polls don't call more people of a different demographic, but rather control for the difference by weighting the responding statistics with the region's proportional statistics... So if more Democrats responded to the poll, but the sample region has more Republicans, the responding samples are weighted to reflect the difference.
â hszmv
Aug 23 at 14:00
2
Maybe I'm misinterpreting what you mean by "accidentally", but if a measurement from a random sample produces a statistic that differs from the population, that difference is due to random error, not bias. Bias is a systematic error, not an error due to chance. Sampling bias is when certain individuals are less likely to be included in a sample because of the sampling method.
â De Novo
Aug 23 at 21:04
-1 for the reason De Novo said. Error is not bias.
â indigochild
Aug 24 at 18:57
add a comment |
up vote
3
down vote
This is a known problem with polls. Along with sampling error (they accidentally picked an unrepresentative group of people to call), polls may be skewed by people refusing to participate.
Pollsters attempt to control for both effects by comparing to more reliable data. For example, if it is a political poll, they may compare the demographics of the respondents to those of the region as a whole. In particular, they often try to control for political identification, that is to say, party registration. If their poll has too many Democrats to Republicans, they may decide to call more people looking for more Republicans. But of course that has a problem too.
What if the Republicans who answer the poll are unrepresentative of Republicans overall? For example, there is a group of Republicans called "Never Trump" Republicans. What if they answer polls more often than pro-Trump Republicans? This might introduce more skew.
Most pollsters use other demographic data. They'll often ask age for example. And they may also use other questions in the poll as controls. For example, they may ask someone's opinion on Donald Trump or how the person voted in the last election (or both).
Some pollsters do reference polls with the same people every time. These give baselines for how people should answer certain questions. They or other pollsters can then do comparisons of other polls to the baseline polls to look for skew. Or compare the baseline polls to actual data. For example, if you're polling an election, the election itself can be a base line to which the polls can be compared. Since who voted is generally public information, you can compare the actual demographics to the polled demographics.
1
@Typically the polls don't call more people of a different demographic, but rather control for the difference by weighting the responding statistics with the region's proportional statistics... So if more democrats responded to the poll, but the sample region has more Republicans, the responding samples are weighted to reflect the difference.
â hszmv
Aug 23 at 14:00
2
Maybe I'm misinterpreting what you mean by "accidentally", but if a measurement from a random sample produces a statistic that differs from the population, that difference is due to random error, not bias. Bias is a systematic error, not an error due to chance. Sampling bias is when certain individuals are less likely to be included in a sample because of the sampling method.
â De Novo
Aug 23 at 21:04
-1 for the reason De Novo said. Error is not bias.
â indigochild
Aug 24 at 18:57
add a comment |Â
up vote
3
down vote
up vote
3
down vote
This is a known problem with polls. Along with sampling error (they accidentally picked an unrepresentative group of people to call), polls may be skewed by people refusing to participate.
Pollsters attempt to control for both effects by comparing to more reliable data. For example, if it is a political poll, they may compare the demographics of the respondents to those of the region as a whole. In particular, they often try to control for political identification, that is to say, party registration. If their poll has too many Democrats to Republicans, they may decide to call more people looking for more Republicans. But of course that has a problem too.
What if the Republicans who answer the poll are unrepresentative of Republicans overall? For example, there is a group of Republicans called "Never Trump" Republicans. What if they answer polls more often than pro-Trump Republicans? This might introduce more skew.
Most pollsters use other demographic data. They'll often ask age for example. And they may also use other questions in the poll as controls. For example, they may ask someone's opinion on Donald Trump or how the person voted in the last election (or both).
Some pollsters do reference polls with the same people every time. These give baselines for how people should answer certain questions. They or other pollsters can then do comparisons of other polls to the baseline polls to look for skew. Or compare the baseline polls to actual data. For example, if you're polling an election, the election itself can be a base line to which the polls can be compared. Since who voted is generally public information, you can compare the actual demographics to the polled demographics.
edited Aug 24 at 23:22
answered Aug 23 at 11:17
Brythan
60.7k7122213
1
Typically the polls don't call more people of a different demographic, but rather control for the difference by weighting the responses according to the region's proportional statistics... So if more Democrats responded to the poll, but the sample region has more Republicans, the responses are weighted to reflect the difference.
– hszmv
Aug 23 at 14:00
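The weighting hszmv describes (post-stratification) is simple arithmetic: instead of averaging over who happened to respond, each group's answers are weighted by that group's known share of the region. A minimal sketch, with all figures hypothetical:

```python
# Respondents by party, with their support for some measure.
respondents = {
    "D": {"n": 600, "support": 0.70},
    "R": {"n": 400, "support": 0.30},
}

# Known registration shares in the region (hypothetical figures).
region_share = {"D": 0.45, "R": 0.55}

total = sum(g["n"] for g in respondents.values())

# Raw (unweighted) estimate over-counts the over-represented party.
raw = sum(g["n"] * g["support"] for g in respondents.values()) / total

# Post-stratification: weight each party's answers by its true share.
weighted = sum(region_share[p] * g["support"] for p, g in respondents.items())

print(f"raw estimate:      {raw:.3f}")       # 0.540
print(f"weighted estimate: {weighted:.3f}")  # 0.480
```

Note this only corrects for the variables you weight on; if respondents within a party are unrepresentative of that party (the "Never Trump" problem above), the weighted estimate is still skewed.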
2
Maybe I'm misinterpreting what you mean by "accidentally", but if a measurement from a random sample produces a statistic that differs from the population, that difference is due to random error, not bias. Bias is a systematic error, not an error due to chance. Sampling bias is when certain individuals are less likely to be included in a sample because of the sampling method.
– De Novo
Aug 23 at 21:04
-1 for the reason De Novo said. Error is not bias.
– indigochild
Aug 24 at 18:57
add a comment
up vote
-3
down vote
The TV series Yes Prime Minister (S01E02 "The Ministerial Broadcast") explains all about this:
Sir Humphrey Appleby: [demonstrating how public surveys can reach
opposite conclusions] Mr. Woolley, are you worried about the rise in
crime among teenagers?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do you think there is lack of discipline and
vigorous training in our Comprehensive Schools?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do you think young people welcome some structure
and leadership in their lives?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do they respond to a challenge?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Might you be in favour of reintroducing National
Service?
Bernard Woolley: Er, I might be.
Sir Humphrey Appleby: Yes or no?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Of course, after all you've said you can't say
no to that. On the other hand, the surveys can reach opposite
conclusions.
[survey two]
Sir Humphrey Appleby: Mr. Woolley, are you worried about the danger of
war?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Are you unhappy about the growth of armaments?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do you think there's a danger in giving young
people guns and teaching them how to kill?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Do you think it's wrong to force people to take
arms against their will?
Bernard Woolley: Yes.
Sir Humphrey Appleby: Would you oppose the reintroduction of
conscription?
Bernard Woolley: Yes.
[does a double-take]
Sir Humphrey Appleby: There you are, Bernard. The perfectly balanced
sample.
More often than not, opinion polls support the views, desires or prejudices of the people who funded them. This may explain to some extent the differences between the polls and the results for Trump vs. Clinton and for Brexit, for example.
The episode you refer to is about different phrasings of questions to get a specific outcome. If you phrase questions right (e.g. who will you vote for, will you vote yes or no on the referendum) and make the questions public with the results (so they can be scrutinised) that problem shouldn't exist (or be known by those reading the results).
– JJJ
Aug 23 at 14:40
1
Yes. But as Sir Humphrey said, there must be some honest pollsters around, he just hasn't happened to meet any.
– Jonathan Rosenne
Aug 23 at 14:43
Not sure what you mean by honest. I think the series only mentions using specific wording to get a certain outcome. Are you suggesting pollsters make up their own results? If that were the case and it came out, that pollster would be done (nobody would hire them, as the results would be useless).
– JJJ
Aug 23 at 14:51
1
youtube.com/watch?v=PKiTNC96yvs
– Jonathan Rosenne
Aug 23 at 17:49
1
@JJJ, if you think that a biased or fraudulent pollster has no use and would therefore never be hired, you are clearly unfamiliar with American media.
– Wildcard
Aug 24 at 2:35
show 2 more comments
edited Aug 24 at 18:29
agc
3,8141243
answered Aug 23 at 14:14
Jonathan Rosenne
2676
23
This is an interesting question. But the actual problem gets much deeper. There are a lot more problems than just sampling bias which can make political opinion polls unreliable. Leading questions, leading answer options, context. This clip from the UK TV show Yes Prime Minister is satire, but very close to reality.
– Philipp ♦
Aug 23 at 11:06
16
I've heard more about the opposite problem. Some extremist views may be socially unacceptable, so they will be underrepresented in the polls. For instance, if a poll asked whether all the Jews should be sent to gas chambers, the number of people wanting to do so would be underreported, because there is (rightfully) a social stigma attached to holding such views, so they may decline to answer or say they are undecided instead.
– liftarn
Aug 23 at 11:38
3
You could even generalize this for all political actions. If I have no strong opinion on LGBT rights, I probably won't agitate for or against them. The discourse is then only driven by the extreme poles, decision-making is done by the opinion in power...
– Cliff
Aug 23 at 13:34
1
@AzorAhai - though I suspect your question was of a more rhetorical nature.... no, most opinion polls are not primarily binary in their inquiries.
– PoloHoleSet
Aug 23 at 20:18
1
@PoloHoleSet I'm not a pollster or politics geek so I didn't know what was standard, but Nemsia seemed to make a pretty definitive statement
– Azor Ahai
Aug 23 at 20:18