Politics, Race, Social Issues

Depending on how you ask, NC State students have different opinions on protests

Protests have been a way to voice opinion and advocate for social change in America since the nation's founding. Recently, many NFL players have been protesting to draw attention to police brutality toward African-Americans. Another recent example came in Charlottesville, where protesters objected to the removal of Confederate monuments. While freedom of speech is protected by the Constitution, people's views of a protest vary with how it is carried out and what it is about. Americans' attitudes toward protests depend on whether the cause aligns with their beliefs, and on how the question is framed.

The PackPoll sent out a poll to NCSU undergraduates to gauge student opinion of protests in America. To understand where students' opinions on protests come from, two different sets of questions were given in different orders.

Q1a, “As you may know, some athletes have begun protesting during the national anthem in order to draw attention to systematic racism in the United States. Do you approve or disapprove of this form of protest?” was given to half of the respondents. The other half received a parallel question, Q1b, which asked instead about approval of the protests against the removal of Confederate monuments.

The difference between the two protests’ approval ratings indicates that students’ opinions on protests are affected by the reason for the protest.

To understand how students view protesters more generally, they were asked to agree or disagree with the statement: “For the most part, people who protest and demonstrate against US policy are good, upstanding, intelligent people.”

However, this question did not always come second in the question order. Half of the respondents were given this question before the specific protest question (Q1a or Q1b), and half were given it after being asked about their opinion on the specific protest.

Unsurprisingly, approval ratings of protests went down when the question concerned Confederate statues rather than athletes. The results also suggest that when a specific question is asked before a broad one, respondents are more likely to hold an opinion and to feel it more strongly. Though NC State students may say that they favor freedom of speech, that support weakens when they disagree with what is being protested.


Almost 700 students were asked to take this survey and 275 completed it.  However, we cannot report a margin of sampling error for this survey because our results come from a non-probability sample.  Most of our surveys adhere to the theoretical principles of probability sampling: every NCSU student has an equal, non-zero chance of being randomly invited to take a survey (and nearly all we contact respond to it).  For this survey about protesting, by contrast, only certain students were invited.

Our results about protesting come from students who previously agreed to be sent our future surveys.  In short, they chose us, non-randomly, so we can’t know for sure if they “think like” most students.  If respondents are not selected according to probability theory, it isn’t possible to calculate traditional diagnostic statistics about a survey, such as the margin of sampling error.
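For context, the margin of sampling error that applies to a probability sample comes from a simple formula. A minimal sketch, assuming (counterfactually) that our 275 respondents had been a simple random sample:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Conventional 95% margin of sampling error for a simple random sample.

    Only valid under probability sampling; p=0.5 gives the worst-case
    (widest) margin, which is what polls conventionally report.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Had our 275 respondents been a simple random sample (they were not),
# the 95% margin of error would be about +/- 5.9 percentage points.
print(round(margin_of_error(275) * 100, 1))  # 5.9
```

Because our respondents self-selected into the panel, this number would be meaningless for our data, which is exactly why we do not report one.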

Most industry professionals today, however, agree that the margin of sampling error is overrated for evaluating the validity of polling results; if only 20% (or less!) of students respond to an invitation to take a survey, even when they were contacted at random, the resulting sample doesn't conform to the assumptions of probability theory.  We could present advanced statistics about the likely representativeness of our sample, but the benefit of generating those stats is outweighed by their complexity.

Instead, we argue that, in general, our panel of interested survey takers does a good job of mimicking a random sample of State students. Over the past two semesters, we have tested whether differences exist between results obtained from the non-probability panel and those from a truly random draw.  So far, we do not observe significant differences of opinion among students contacted by the two methods.  These past results suggest that our findings about students’ opinions on protesting are broadly representative of what most undergraduates at NCSU think.
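The comparison described above can be done with a standard two-proportion z-test. A hedged sketch with hypothetical numbers (the approval rates and sample sizes below are illustrative, not our actual data):

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-sample z-statistic for a difference in proportions (pooled SE)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical illustration: 60% approval in a panel of 275 respondents
# vs. 55% approval in a random sample of 300.
z = two_prop_z(0.60, 275, 0.55, 300)
print(abs(z) < 1.96)  # True: the difference is not significant at the 95% level
```

A |z| below 1.96 means the panel and the random draw do not differ significantly at the 95% level, which is the pattern we have observed so far.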

Nevertheless, we might have overestimated State’s support for protesting.  More of our respondents call themselves “Democrat” than is probably true for all undergraduates, and Democrats are more supportive of protesting.  Since political partisanship is a fluid attitude and not a fixed characteristic, like age, we can’t be certain about the “true” percentage of Democrats (or Republicans). Thus, without knowing more about the fixed traits of our protesting sample (we did not ask more questions about their demographics), nor being certain our sample is “too Democratic,” we do not attempt to weight/adjust our data to known properties about NCSU undergraduates.

For additional information about best practices for reporting on the precision of non-probability sampling, you can watch this debate and/or read this guidance for how to report on results from non-probability samples.



