10 Reasons Why Opinion Polls Can’t Be Trusted




[Image: CNN latest presidential poll meme: Hillary 97 percent, Trump 3 percent]
It seems like every day we see some new opinion poll in the media. Rather than being used to support some other story, the poll results have become news themselves. But as we saw in the 2016 election of Donald Trump, they are often wildly inaccurate. Early in the 1984 presidential race, some polls showed Ronald Reagan trailing Walter Mondale, yet Reagan went on to win 49 states. Over time some polling organizations have improved the science and accuracy of their methods, but unfortunately even results from reputable polls are usually manipulated by the mainstream media. Let’s examine ten reasons why polls cited by the media can’t be trusted.

  1. Polls can use samples that are too small or that don’t accurately represent the population; for example, oversampling Democrats. The American population is fairly evenly divided, with roughly equal percentages self-identifying as Republican or Democrat. So when a poll sample contains 10 percentage points more Democrats than Republicans, the topline result gets skewed toward Democratic positions; how badly depends on how far apart the parties are on the question being asked (see the arithmetic sketch after this list).
  2. Surveys can ask leading questions framed to elicit the desired answer. For example, the question “Do you believe children of illegal immigrants should be separated from their parents at the border?” inevitably draws an overwhelming ‘No’ answer. However, if you ask, “Should children be separated and genetically tested at the border to stop possible child traffickers?”, you’re going to get a far different result.
  3. Certain polling methods can leave out large segments of the population; for example, those who rarely use the web or who don’t answer calls from unknown numbers. Large portions of the population, especially older adults, simply don’t use the internet very often. And with caller ID now standard, it’s easy to ignore calls from polling organizations.
  4. Poll results can be recorded in a dishonest or sloppy way. There will always be a human component to any poll, which means mistakes can be made. Questioners also have their own biases and may intentionally falsify answers if they want the results to swing a certain way. For example, if a questioner believes a border wall should not be built but responses are running 55 percent in favor of the wall, what’s to stop him from falsifying enough responses to swing the result the other way? Poll results often factor into politicians’ decisions, which gives a questioner a ready justification for the deceit.
  5. Polls reflect only a snapshot in time that may be distorted by a recent event, such as a mass shooting or terrorist attack. Emotions can heavily distort results. If 30 kids are killed in a school shooting, gun control will poll more favorably. Immediately after a terrorist attack, increased defense spending usually polls well. As time goes by, more information and discussion usually overtake the immediate emotion, leading to more rational decisions and more accurate poll results.
  6. Political correctness and the militancy of prominent liberals often lead to answers that don’t reflect the respondent’s true feelings. Most people naturally avoid conflict. And when you see Maxine Waters, Antifa, and scores of other nutcases calling for violence and harassment of Trump supporters, some feel it’s easier just to give the safe, non-controversial answer. Plus, lives and careers are often destroyed if someone is perceived as racist, sexist, homophobic, or whatever. So, many respondents will stick to the politically correct script rather than reveal how they truly feel.
  7. Polls can leave out possible answers, forcing respondents to pick from remaining answers that don’t include what they really believe. “Are you a Republican or a Democrat?” leaves out everyone who would identify as independent, libertarian, etc. “How do you rate Trump’s performance as president: OK, Bad, Very Bad, or Disastrous?” leaves out everyone who would answer Very Good or Outstanding. “Do you support war with Iran, yes or no?” Maybe your answer is ‘no’ unless they directly attack us or fund terrorist groups that do. You get the idea.
  8. Poll results can be presented in a misleading way; for example, by leaving out poll disclaimers or the impact of “undecided” and “no opinion” answers. The media loves to cherry-pick parts of a poll and leave out key information. Say the poll asks, “Do you support nominee Brett Kavanaugh for the Supreme Court?” and the results are 38 percent ‘yes’, 28 percent ‘no’, and 34 percent ‘no opinion/don’t know him’. Dishonest media figures may report only the 38 percent support, leading people to believe the other 62 percent of the country is against him, when in fact only 28 percent said ‘no’.
  9. The poll may not reflect the portion of the population that really matters; for example, a national poll doesn’t matter for a local election, and a poll of “all adults” or “registered voters” doesn’t matter as much as one of “likely voters”. Nancy Pelosi’s approval ratings are usually in the teens nationally, but she easily wins re-election every time in her ultra far-left district. For a presidential election, half the adult population doesn’t vote or pay attention to current events, so polls of “all adults” come nowhere near reflecting what will happen on election day. “Registered voters” are a slightly better measure, but even a significant portion of registered voters never make it to the polls because they don’t follow current events, don’t like either candidate, don’t believe their vote matters, or don’t have the time. “Likely voter” screens use turnout history and statistical modeling to give the best approximation. You will almost never see a mainstream media outlet lead with likely-voter polls, since those give the results they least like to see. Many media sources were citing polls showing Hillary with 10-15 point leads the day before the election for exactly this reason.
  10. The media cherry-picks which polls to present as “news” and which to leave out, choosing the ones that fit their narrative. The Investor’s Business Daily (IBD/TIPP) presidential poll was among the most accurate in the 2008 and 2012 elections, so when it showed Trump slightly favored to win the 2016 race, you would think the media might include it in their stories; it never happened. Even right-leaning Fox News threw out the IBD poll as an outlier in its predictive analysis. It simply didn’t fit what they wanted to present. The same kind of cherry-picking can be seen on almost every political issue: immigration, gun control, Obamacare, and so on. If the media finds a poll that seems to back the narrative they want to push, they will use it. If it doesn’t, it will be ignored or attacked as non-credible. The media wants conservatives to be discouraged from even voting. They want to convince them that they’re in the minority and that something is wrong with them because the majority doesn’t agree.
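
To make the sampling arithmetic in point 1 concrete, here is a minimal Python sketch. Every number in it (the party shares and the support levels) is hypothetical, chosen only to show the mechanics: an unweighted topline inherits the sample’s partisan skew, while reweighting each group back to its assumed population share removes it.

```python
# Minimal sketch of partisan oversampling and reweighting.
# All numbers are hypothetical, for illustration only.

# Assumed population split (roughly even, per point 1)
population_share = {"D": 0.45, "R": 0.45, "I": 0.10}

# A sample that oversamples Democrats by 10 points relative to Republicans
sample_share = {"D": 0.50, "R": 0.40, "I": 0.10}

# Hypothetical support for some policy within each party group
support_by_party = {"D": 0.80, "R": 0.30, "I": 0.50}

# Raw (unweighted) topline: inherits the sample's skew
raw_topline = sum(sample_share[p] * support_by_party[p] for p in sample_share)

# Weighted topline: each group is scaled by (population share / sample share)
# so the mix of respondents matches the assumed population
weights = {p: population_share[p] / sample_share[p] for p in sample_share}
weighted_topline = sum(
    sample_share[p] * weights[p] * support_by_party[p] for p in sample_share
)

print(f"Raw topline:      {raw_topline:.1%}")       # 57.0%, inflated by the oversample
print(f"Weighted topline: {weighted_topline:.1%}")  # 54.5%, matching the population mix
```

Note that the size of the distortion depends not just on the size of the oversample but on how far apart the groups are on the question being asked. Reputable pollsters disclose their weighting scheme, which is exactly the kind of disclaimer that point 8 says gets dropped in media reports.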

In summary, take every poll you see in the news with a grain of salt. Polls have become part of the fake and distorted mainstream media propaganda effort used to push the population toward a certain way of thinking. Don’t fall for it. Follow your common sense and always think for yourself. Chances are that far more people agree with you than you realize.


[Image: mathematical problem poll meme]
[Image: 1979 Reagan-Carter Gallup poll meme: “the only poll that matters is the election”]


Other Links That May Interest You

11 Ways the Media Manipulates the Truth
Mainstream Media Meme Gallery
Media Research Center


Written by: Joe Messerli
Last Modified: 8/18/2018

2 thoughts on “10 Reasons Why Opinion Polls Can’t Be Trusted”

  1. I worked for two polling companies doing telephone surveys in two states, a decade apart, and demographics was something I studied a lot for my B.A. in sociology. Polls are as legit as reality shows. Some additional points for your list:

     1. One of my companies was doing transportation surveys funded by the city. The CEO was an “expert” who presented to the City Council a hypothesis on the city’s traffic patterns. Interestingly, for all the cities he did across the globe, the surveys always matched his hypothesis, leading to more money in his pocket for additional surveys. The guys totaling our surveys manipulated the data to get “his” answers. In one city they left a year out of the end results, as that year had bad weather skewing things, but didn’t tell anyone. They would often leave things out.

     2. My job was to schedule a final third survey with people, a 30-minute face-to-face with a bubble sheet. I was told not to choose people randomly but to pick certain types: activists, talkative people, the not-retired, the civic-minded, liberals. If you had voiced opinions in an earlier survey that were too conservative, we wouldn’t choose you. Left-leaning moderates were who we talked to to get a “diverse” range of input. Ha. This third survey would be a conversation, and back at the office we would fill in the bubbles based on what we remembered. If we missed a question, we would fill it in with our “best guess.” We did the same on phone surveys when someone didn’t want to answer a question. For the record, this is one of the largest transportation survey companies, with hubs in numerous countries.

     3. Both of my companies used phone lists. I would call 400 people a night, easy. The second company also did mailings and online surveys, because the phone lists are landline lists. Who has landlines? Seniors! People who are home a lot. Big families. Many of us no longer have landlines. Add in how many people refuse to do surveys (often folks under 40, folks busy with families and jobs, skeptical folks, the upwardly mobile), and we had a lot of talks with old people to get our “diverse” responses.

     4. At both companies I got paid bonuses for completing more surveys and for surveys filling a certain answer category. To finish surveys, we filled in answers when folks skipped or hesitated; we would guess ages if someone refused. It could mean $300 more in my pocket if I completed a huge number. As for categories…

     5. At both companies, if we didn’t seem to have enough of one answer or another, we aimed to finish surveys with the folks who gave it. For example, I did a political survey where we weren’t getting enough moderate answers to “accurately represent society” (or our pro-LGBT view of it, for a survey reaching out to a small country community). We were told to end surveys with anyone not moderate and keep hunting for moderate respondents. Once we were promised a staff BBQ if we hit the goal. There is likely more I could say, but as a former insider I wanted to add to your list.
