Comparing Racial Attitude Change Before vs. Including the 2016 Election

As I’ve documented before, sizable racial attitude change has occurred among Democrats in the liberal direction over the last 5-10 years. An unsettled aspect of this development, though, is how much the 2016 election (elites, campaign messages, etc.) drove the racial liberalization as opposed to preceding forces (e.g., activism around racial issues). A quick review of some racial attitude survey data here can produce a rough answer to this question.

Panel survey data can isolate individual-level change (actual changing of minds) on racial attitudes. The 2010-2014 Cooperative Congressional Election Study panel and the 2011-2016 Voter Study Group panel share two racial resentment (RR) items that can shed light on attitude change over a span of years that includes the last election, and thus can capture factors from the 2016 election environment (2011-2016), and over another span that does not (2010-2014). The obvious caveat in comparing individual-level racial attitude change across these two datasets is that they are two different surveys with different respondents, so results should be treated as suggestive (though the data does come from the same vendor, YouGov, which offers some reassurance).

Two graphs below visualize individual-level change from 2010-2014 and from 2011-2016. Similar to what I’ve done before, the first graph (Figure 1) breaks down 2014 RR responses by 2010 responses, and the second graph (Figure 2) breaks down 2016 RR responses by 2011 responses. The key portion of the graphs to pay attention to is the percentage of original non-liberals on each item (most importantly, those taking a conservative position) who change to a liberal opinion, and how this varies by time span.
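For readers who want to replicate this kind of breakdown, a minimal sketch of the underlying cross-tabulation is below, assuming the panel data sit in a pandas DataFrame; the file and column names (e.g., rr_overcome_w1) are placeholders rather than the actual dataset variables.

```python
# Sketch of the cross-tab behind Figures 1-2: wave-2 response distribution
# within each wave-1 response category. Column names are hypothetical.
import pandas as pd

def change_table(df: pd.DataFrame, wave1_col: str, wave2_col: str) -> pd.DataFrame:
    """Percentage of wave-2 responses within each wave-1 response."""
    counts = pd.crosstab(df[wave1_col], df[wave2_col])
    return counts.div(counts.sum(axis=1), axis=0).mul(100).round(1)

# Example usage (placeholder file and variable names):
# panel = pd.read_csv("cces_2010_2014_panel.csv")
# print(change_table(panel, "rr_overcome_w1", "rr_overcome_w2"))
```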

[Figure 1: 2014 RR responses broken down by 2010 RR responses, 2010-2014 CCES panel]

On the “overcome” RR item (left-hand side) in Figure 1, 12 percent of original racial conservatives change to the liberal position between 2010 and 2014. In terms of the survey responses, this represents a change from agreeing that blacks should overcome prejudice without special favors (a racially conservative stance) to disagreeing with this sentiment (a liberal stance). Importantly, this switch outweighs its mirror opposite, as only six percent of original racial liberals change to a conservative position, so movement in the liberal direction is not cancelled out by opposite movement. However, as Figure 2 below shows for the same item (left-hand side), a greater share of original conservatives (21 percent) switch to the liberal side on this RR item from 2011 to 2016, a time span that includes any influence from the 2016 election. Those originally indifferent (Neither/DK) also move in the liberal direction more during 2011-16 than during 2010-14.

[Figure 2: 2016 RR responses broken down by 2011 RR responses, 2011-2016 Voter Study Group panel]

Similar results appear for the “slavery” RR item shown on the right-hand side of Figures 1 and 2. On this question, 20 percent of original racial conservatives (those disagreeing that generations of slavery and discrimination have made it difficult for blacks to work their way up) became racially liberal (agreeing with the statement) four years later in 2014. While this again exceeds the opposite movement (only seven percent of racial liberals became conservative), the shift toward more racially liberal attitudes is once more larger for the time span that includes 2016: 30 percent of original racial conservatives (as of 2011) adopt racially liberal attitudes on this RR item five years later, a greater percentage than the one seen for the 2010-14 change.

In sum, the data here shows that some individual-level racial attitude change was already developing prior to the 2016 election. It’s also worth clarifying that the 2011-16 span essentially covers the 2010-14 span, so the later panel picks up most of any pre-2016-election racial attitude change as well. Given these overlapping time frames and the fact that these are different surveys, it remains difficult to pinpoint when most of the change occurred. Nevertheless, the larger shifts over a span that includes 2016, and thus likely any influence from the last election, suggest the election environment contributed at least some amount to the racial attitude change seen within the last decade.


The Effects of Survey Topic Salience on Response Rate and Opinions: Evidence from a Student Survey Experiment

As part of a recent survey of Dartmouth students, I implemented a survey topic experiment to determine how revealing the topic of the survey when soliciting responses affects 1) the response rate and 2) the responses themselves. For background, in order to gather responses for these student surveys, I send out email invitations with a survey link to the entire student body. Partly inspired by past research demonstrating that interest in a survey’s topic increases participation, I created two conditions that varied whether the topic of the survey was made salient in the email message (i.e., in the email header and body) or not. This resulted in what I call a “topic” email sendout and a “generic” email sendout, respectively, to which 4,441 student email addresses were randomly assigned (N = 2,221 for generic, N = 2,220 for topic).
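A minimal sketch of that random assignment step is below, assuming the addresses are held in a plain Python list; the function and variable names are illustrative, not the actual code used for the sendout.

```python
# Randomly split the invitation list into the two experimental conditions.
import random

def assign_conditions(emails, seed=2018):
    """Shuffle addresses and split them into 'topic' and 'generic' sendout lists."""
    rng = random.Random(seed)
    shuffled = emails[:]           # copy so the original list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2      # 4,441 addresses -> 2,220 and 2,221
    return {"topic": shuffled[:half], "generic": shuffled[half:]}

# groups = assign_conditions(student_emails)   # student_emails is a placeholder
# len(groups["generic"]), len(groups["topic"]) # -> 2221, 2220
```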

The below table shows the contents for each experimental condition:

[Table: email invitation contents for the generic (left) and topic (right) conditions]

Because the survey I was fielding focused on politics and social attitudes on campus, the topic treatment email (on the right-hand side) explicitly revealed that the survey was about politics, both in the header and the body. The generic treatment on the left simply described the survey as one from “The Dartmouth” (the student newspaper for which the survey was being fielded) and implied that general questions would be asked of students. Much like in other related research, this made for a fairly subtle but realistic manipulation in how the survey was introduced to the student population.

Given this subtle difference, it might come as no surprise that the differences in the outcomes of interest (response rate and opinions on specific survey questions) were small. However, both surprising and expected effects did arise, suggesting that revealing a survey’s topic (in this case, its political nature) produces a slightly different set of results and could introduce some nonresponse bias. These results are of course specific to the Dartmouth student body, but they may have some bearing on surveys of younger populations more broadly.

Students received two rounds of survey invitation emails: the first on a Monday night and another on the following Thursday night. After one email sendout, as Table 1 below shows, students in the topic email condition (7.2 percent response rate) were significantly (p=0.04) less likely to respond to the survey, by 1.7 percentage points, than students in the generic email condition (8.9 percent response rate).
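The reported difference can be roughly checked with a standard two-proportion test. The sketch below back-calculates approximate responder counts from the reported rates and invitee Ns (these counts are my approximations, not figures from Table 1) and uses statsmodels to arrive at a p-value near 0.04.

```python
# Rough check of the first-round response-rate difference:
# ~8.9% of 2,221 generic invitees vs. ~7.2% of 2,220 topic invitees.
from statsmodels.stats.proportion import proportions_ztest

responders = [198, 160]        # approx. 0.089 * 2221 (generic), 0.072 * 2220 (topic)
invitees = [2221, 2220]

z, p = proportions_ztest(responders, invitees)
print(round(z, 2), round(p, 3))  # roughly z ~ 2.1, p ~ 0.04, in line with Table 1
```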

[Table 1: response rates and respondent characteristics by email condition]

Knowing a survey is about politics made students less likely to take it. Speculatively, perhaps this politics survey request, which entails discussing politics and expressing oneself politically, acts as a deterrent given how often controversy and rancor become associated with both campus and national political scenes. In other words, politics could be a “turn-off” for students deciding whether to take a survey. However, after receiving one more email request to take the survey, students in the two conditions responded more similarly (note: those who had already taken the survey could not take it again). Although the topic email treatment still leads to a lower response rate, the gap shrinks (from 1.7 to 0.9 points) and the difference is no longer statistically significant (p=0.34).

The bottom half of Table 1 also shows how distributions of key demographic and political characteristics differ by condition. Women are five points less likely to take a survey they know is about politics than a perceived generic survey, perhaps in line with a view of politics as conflictual and thus aversive, as I’ve discussed before, but this difference doesn’t reach statistical significance. Little difference emerges by race.

Most interestingly, the survey topic treatment produces different pictures of the partisanship distribution. When survey responses are solicited with a generic email invitation, Democrats make up 71.9 percent of respondents; that share drops by more than seven points under the politics topic treatment, to 64.3 percent (a difference reaching marginal statistical significance). Republican students, on the other hand, appear to select into taking a politics survey at a higher rate: the generic email condition yields 14.5 percent Republicans while the topic email condition yields 23.1 percent (a difference significant at p=0.01). Republicans thus become more inclined to take a survey when they know it’s about politics, while Democrats become less inclined to do so. At least in the Republican case, which is the stronger result, one reason may be that a political survey affords them an opportunity for political expression in a campus environment where they’re typically outnumbered 3 to 1 by Democrats and therefore might be less open about their politics. Whatever the mechanism, this result is not totally unexpected: the two highest Republican percentages I’ve found in surveys of Dartmouth students have come from surveys whose email invitations revealed the survey as a political one.
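For illustration, the same kind of two-proportion test could be applied to the Republican-share gap. Because the respondent counts per condition are not reported here, the sample sizes in the sketch below are hypothetical placeholders, so the resulting p-value only demonstrates the calculation rather than reproducing Table 1.

```python
# Illustrative test of the Republican-share gap (14.5% generic vs. 23.1% topic).
from statsmodels.stats.proportion import proportions_ztest

n_generic, n_topic = 280, 260                        # hypothetical respondent counts
rep_counts = [round(0.145 * n_generic), round(0.231 * n_topic)]

z, p = proportions_ztest(rep_counts, [n_generic, n_topic])
print(rep_counts, round(p, 3))                       # p depends on the assumed Ns
```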

A few notable differences by experimental condition materialized for substantive survey items as well. A battery of questions (shown in the upper fourth of Table 2) probed how knowing that another student had opposing political views affected a range of social relations. No consistent differences (and none reaching statistical significance) appeared on these questions.

On the question of whether a student had ever lost a friend at the school because of political disagreements, however, more students in the topic email condition indicated this was the case: 17.2 percent did so compared to 10.9 percent in the generic email treatment, a difference significant at p=0.04. Raising the salience of the survey topic (politics) to potential respondents thus leads to more reports of politics factoring into students’ lives in a substantial way such as this.

[Table 2: substantive survey item responses by email condition]

This latter finding is not the only piece of evidence suggesting that the politics email treatment more strongly attracts students for whom politics plays a big role in their lives. Fewer students in the topic email condition (13.9 percent) than in the generic email condition (24.1 percent) report that politics rarely or never comes up in their classes, a statistically significant difference. The smaller role of politics in personal lives under the generic email invitation is also evident in questions about how often politics comes up when talking with friends and in campus clubs and organizations.

Lastly, a question asked whether the political identification of a professor would affect a student’s likelihood of taking that professor’s class. Greater indifference to professor ideology emerged in the generic email condition, specifically for the two non-mainstream ideologies (libertarianism and socialism); students who took the survey in the topic email condition indicated that non-mainstream professor ideology influenced their course selection to a greater extent.

In sum, many of the data points in Table 2 suggest that a survey email invitation raising the salience of the survey topic (i.e., politics) yields a sample for whom politics plays a greater role in personal life. This intuitive and expected nonresponse bias, although secondary to the more important response rate and partisanship distribution findings, is still worth noting and documenting with statistical support.
