Partisan Sorting and Opinion Change on Immigration and Sexist Attitudes

One important takeaway from the 2018 CCES was that the liberalizing trend in racial attitudes–especially among Democrats–continued past 2016 into 2018. The Voter Study Group panel survey–with waves in 2011, 2016, 2017, and 2018–allows for a few important extensions:

  1. A panel-based check on these types of trends: what happens when party identification is held constant–that is, when attitude change is tracked by first-wave partisan identification–and sorting is thereby ruled out?
  2. An extension to other outgroup attitudes (e.g. attitudes toward immigrants and opinion on immigration policy), as well as to separate forms of prejudice, such as measures that approximate sexism

Note: to stay consistent with the other analyses of attitudes toward outgroups (e.g. blacks and immigrants, from the perspective of whites), I’ll check for trends among whites only on both measures.

First, for sexism–which comprises six separate items and an average index of them–attitudes are very stable from 2016 to 2018 among white partisans (based on their partisanship in 2016). On average, there is a slight decline in sexism, but that decline is consistent across partisanship.


When using a cross-lagged approach to test partisan opinion change on sexism against partisan sorting around sexism (in the mold of Lenz 2009, 2012 and Engelhardt 2018), evidence emerges for both processes to roughly equal degrees. Underlying partisanship drives sexism change, but underlying sexism also drives partisanship. Both effects are very small, though.
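The cross-lagged setup can be sketched as follows. This is a minimal illustration on simulated data, assuming 0–1 scales; the variable names, effect sizes, and noise levels here are hypothetical placeholders, not the VSG estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical wave-1 measures: partisanship (pid16) and sexism (sex16), 0-1 scales.
pid16 = rng.uniform(0, 1, n)
sex16 = np.clip(0.5 - 0.2 * pid16 + rng.normal(0, 0.15, n), 0, 1)

# Wave-2 outcomes: each driven mostly by its own lag, plus a small cross-lag
# (the -0.05 and +0.05 cross-effects are assumed for the simulation).
sex18 = 0.8 * sex16 - 0.05 * pid16 + rng.normal(0, 0.05, n)
pid18 = 0.9 * pid16 + 0.05 * sex16 + rng.normal(0, 0.05, n)

def ols(y, X):
    """OLS via least squares; returns coefficients with intercept prepended."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

# Opinion change: does wave-1 party predict wave-2 sexism, net of wave-1 sexism?
b_opinion = ols(sex18, np.column_stack([sex16, pid16]))
# Sorting: does wave-1 sexism predict wave-2 party, net of wave-1 party?
b_sorting = ols(pid18, np.column_stack([pid16, sex16]))

print("party -> sexism cross-lag:", round(b_opinion[2], 3))
print("sexism -> party cross-lag:", round(b_sorting[2], 3))
```

Each regression controls for the lagged value of its own outcome, so the cross-lag coefficient isolates the effect of the other wave-1 variable on wave-2 change.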


The set of immigration attitudes comprises three items and an average index, and extends back to 2011. Among 2011 partisans, the change in the liberal direction–toward more positive attitudes about immigrants and immigration–is greatest from 2011 to 2016. But at the same time, the liberalizing trend thereafter, from 2016 to 2018, continues for Democrats.


Applying the same cross-lagged regression approach as before to adjudicate between opinion change and sorting, it becomes clear that opinion change occurs to a much larger degree than sorting. Controlling for 2016 immigration attitudes, going from the strongest Republican identifiers to the strongest Democratic identifiers in 2016 moves 2018 average immigration opinion 0.22 points in the liberal direction (on a 0–1 scale). Meanwhile, controlling for 2016 partisanship, moving from most anti-immigrant to most pro-immigrant in 2016 does affect partisanship at a statistically significant level, but the effect is small (0.05 points on the 0–1 scale) and pales in comparison to partisanship-driven immigration change (an equality-of-coefficients test shows these two effects from separate regressions differ significantly at p < .001).
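The equality-of-coefficients comparison across two separate regressions can be approximated with a large-sample z-test, treating the two estimates as independent. A minimal sketch; the standard errors plugged in below (0.02 and 0.015) are assumptions for demonstration, not the actual estimates from the post:

```python
from math import erf, sqrt

def coef_equality_z(b1, se1, b2, se2):
    """Large-sample z-test that two coefficients from separate,
    independent regressions are equal."""
    z = (b1 - b2) / sqrt(se1**2 + se2**2)
    # Two-sided p-value from the standard normal CDF.
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Point estimates from the post (0.22 party->immigration, 0.05 immigration->party);
# the standard errors are illustrative assumptions.
z, p = coef_equality_z(0.22, 0.02, 0.05, 0.015)
print(round(z, 2), p < 0.001)
```

With standard errors anywhere near this magnitude, a gap of 0.17 points on a 0–1 scale is easily significant at p < .001.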


3/28/19 Update:

Below are the sexism/party ID analysis graphs split by gender. See here for more discussion.




Liberalizing Racial Attitudes Continued into 2018


With the recent release of the 2018 CCES data, I extended the time series of racial resentment battery responses by party (filling in 2016, which didn’t include the standard RR items, with combined university team modules). White Democrats have continued a huge movement in the liberal direction on questions of race into 2018 (attitudes toward blacks here). In the CCES, the shift over the last two years has been even larger than in prior years.

Some context on this change (liberalizing racial attitudes especially among Democrats):

  • It’s apparent across various datasets, supporting the idea that this is real and meaningful movement (see graphs/explanations here for ANES, here for VSG, and here for GSS)
  • Similar movement appears for other outgroups (i.e. relative to whites) and policy questions that invoke other outgroups (see graph here)
  • The change appears more to do with “changing of minds” rather than partisan sorting around racial views (see analysis here)
  • Panel data from 2011 to 2016 shows the individual level change is strongest among white Democratic youth (see here), but CCES (cross-sectional) data from 2010 to 2018 shows a liberalizing trend that cuts across all age groups (among white Democrats), as the below graph shows:


What’s driving this liberalizing racial attitude change? Some potential factors:

  • Candidate rhetoric in the 2016 election–which heavily centered on race and identity–likely played a role (see here), as did partisan cues–coming in different forms–more broadly over the last decade or so (see here)
  • Lasting movement past the election and into 2018 likely has to do with the Trump presidency and the strong, negative outgroup cue that presidential rhetoric represents for Democrats (see here)
  • Given that these trends occurred over the course of nearly a decade now, it’s important to look before the 2016 election period and Trump era; reactions to and conversations around police shootings and social movements like Black Lives Matter perhaps have a role in spurring racial attitude change (see discussion/preliminary evidence here)

3/27/19 update: Check out a piece by Thomas Edsall in The New York Times opinion pages that includes the first graph above, and some discussion on it and other liberalizing social attitudes/trends in the U.S.


Who’s Most Likely to Take Follow-Up Surveys in Panels?


Who’s most likely to take follow-up surveys in panels? This question is important to keep in mind when drawing conclusions from panel surveys (like here and here). When panel survey datasets include both the full set of respondents in the first wave of a survey and the smaller portion who participated in a later wave or waves, one can examine which factors correlate with later-wave participation given earlier participation. I’ve used the Voter Study Group dataset to study this question before, finding more highly educated and white individuals to be particularly likely to take follow-up surveys. The popular Cooperative Congressional Election Study (CCES), which has pre- and post-election waves, offers the potential for a similar and more comprehensive analysis.

The above graph shows the correlates of taking the post-election CCES survey among the entire sample (those who took the pre-election wave) for the last three general elections. A few quick notes on the modeling:

  • A linear probability model is used and 95% confidence intervals are shown
  • The main model includes validated voter status, 3-category education level (base=HS/less), female, age group (base=18-29), race (base=white), and political interest (0-1 scale)
  • Coefficients for income (0-1 scale), party (base=Democrat), partisan intensity (Ind./Not Sure, 0, to strong partisans, 1), pre-election vote intent (base=Rep vote), internet access at home AND work, and political media engagement come from adding each to the main model individually (i.e. each tested separately)
  • Pol. media engagement = sharing, commenting, OR forwarding something about politics on social media

Some observations on the results:

  • Older and white individuals are most likely to take follow-up surveys; voters are also more likely to participate in later waves, as are those who are more interested in politics (though the effect here is smaller than I expected)
  • There were a few surprising null, small, or inconsistent effects
    1. Full internet access (at both home and work): given that the CCES is an online survey, I thought people with more consistent internet access would be more likely to take follow-up surveys, but that’s not the case (this variable is available only for 2012 and 2016)
    2. Political engagement on social media: those most willing to actively express themselves politically on social media seem like they would participate in later waves at higher rates, but that’s not the case (this variable is available only for 2016)
    3. Political winners/losers: given that the CCES solicits post-wave responses right after the general election outcome, I expected that partisans (in terms of partisanship and intended pre-election vote choice) on the losing side would be less willing to engage in a political act–taking the CCES survey–amid a politically disappointing period, but the evidence does not support this. Individuals without partisan ties on the party identification question, and those unsure about whom to vote for or whether to vote before the election, do take the post-election survey less often (they are probably less interested in politics anyway), but partisan differences are very minimal