In June of this year, the Supreme Court rejected the Trump administration’s plan to add a citizenship question to the 2020 U.S. Census. Observers had worried about the chilling effect the question’s inclusion would have on Census participation, particularly among immigrants and Hispanics, so this outcome appeared to resolve concerns about Census counting and its implications. But many argue the damage may already have been done in dissuading people from participating in the Census.
One lingering repercussion could be in fostering misperceptions about the Census. Given how much the citizenship question was in the news over the past year, and how at times its inclusion seemed certain, perhaps some Americans were left believing the 2020 Census will indeed ask about citizenship status. An incorrect view like this is still significant, as past evidence suggests that beliefs about the Census, and whether it asks about citizenship, could factor into intended participation.
With this concern in mind, I wanted to see whether people believed the 2020 Census would include a citizenship question, long after the Supreme Court established it would not. In late October, I asked the following question of 7,966 participants on the Civis Analytics online panel:
“The past year has seen debate over whether the Census should ask people if they are citizens of the U.S. To the best of your knowledge, will the 2020 U.S. Census include a citizenship question?”
Answer options: Yes, No, Not Sure
The data included several demographic and political covariates. To make the sample reflect the national population, I weighted the data to match national 18+ citizen population characteristics for race, age group, gender, and education (thanks to G. Elliott Morris for providing benchmarks for this).
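Weighting a sample to population benchmarks like these is typically done with raking (iterative proportional fitting), which adjusts weights one demographic dimension at a time until the weighted marginals match the targets. Below is a minimal sketch of that procedure; the respondent records and target shares are illustrative toy values, not the actual Civis panel data or the benchmarks used here.

```python
def rake(respondents, targets, n_iter=50):
    """Raking / iterative proportional fitting.
    respondents: list of dicts of demographic attributes.
    targets: {dimension: {category: target population share}}.
    Returns one weight per respondent (weights sum to len(respondents))."""
    n = len(respondents)
    weights = [1.0] * n
    for _ in range(n_iter):
        for dim, shares in targets.items():
            # current weighted total in each category of this dimension
            totals = {cat: 0.0 for cat in shares}
            for w, r in zip(weights, respondents):
                totals[r[dim]] += w
            # rescale weights so weighted shares match the target marginals
            for i, r in enumerate(respondents):
                cat = r[dim]
                if totals[cat] > 0:
                    weights[i] *= shares[cat] * n / totals[cat]
    return weights

# toy sample that over-represents college-educated women
sample = (
    [{"gender": "F", "educ": "college"}] * 5
    + [{"gender": "M", "educ": "college"}] * 2
    + [{"gender": "F", "educ": "no_college"}] * 2
    + [{"gender": "M", "educ": "no_college"}] * 1
)
targets = {
    "gender": {"F": 0.52, "M": 0.48},
    "educ": {"college": 0.35, "no_college": 0.65},
}
w = rake(sample, targets)
```

After raking, the weighted gender and education shares match the targets even though the joint distribution was skewed in the raw sample. In practice one would rake on all four benchmarked dimensions (race, age group, gender, education) and often trim extreme weights.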
The graph below shows results for the entire sample, accompanied by 95% confidence intervals.
40.0 percent of American adults think the 2020 U.S. Census will include a citizenship question. A similar share expresses uncertainty, saying they are not sure (35.2 percent), while only about a quarter answer the question correctly.
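Confidence intervals like those in the graph can be computed from the weighted proportion and a design-adjusted standard error. A minimal sketch, using Kish’s effective sample size to account for unequal weights; the toy responses and weights below are made up for illustration, not the survey data:

```python
import math

def weighted_prop_ci(flags, weights, z=1.96):
    """Weighted share of True responses with a normal-approximation 95% CI.
    Uses Kish's effective sample size to account for unequal weights."""
    total = sum(weights)
    p = sum(w for f, w in zip(flags, weights) if f) / total
    # Kish's effective sample size: (sum w)^2 / sum w^2
    n_eff = total ** 2 / sum(w * w for w in weights)
    se = math.sqrt(p * (1 - p) / n_eff)
    return p, p - z * se, p + z * se

# toy data: 40 of 100 respondents say "Yes", with mildly unequal weights
flags = [True] * 40 + [False] * 60
weights = [1.2] * 50 + [0.8] * 50
p, lo, hi = weighted_prop_ci(flags, weights)
```

With equal weights this reduces to the familiar p ± 1.96·√(p(1−p)/n) interval; unequal weights shrink the effective n and widen the interval accordingly.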
Demographic subgroup breakdowns reveal some interesting wrinkles. The graph below plots percentages for the different response options (Yes / No / Not Sure) going across and different subgroups going down.
Hispanics are no more likely than whites, and not much more likely than other racial groups, to hold this incorrect belief. Many still do, though, which is notable given that this belief about the citizenship question is more likely to affect their participation.
The largest determinant of holding this incorrect belief happens to be Republican partisan identification: 51.3 percent of Republicans say there will be a citizenship question on the 2020 Census, while fewer Democrats (35.0 percent) and Independents (26.0 percent) say so. Given the citizenship question’s attachment to Trump and his administration, this political cue likely explains the belief’s higher prevalence among the Republican mass public. This fits with patterns for other factual beliefs, and makes views on this issue begin to resemble other misperceptions. In hindsight, I should have asked about strength of belief in respondents’ answers and whether they had heard about this issue before, as this would better distinguish between factual knowledge levels and truly held misperceptions. (Small note: in results not shown here, I test whether “Yes” responses among Republicans increase as education, a proxy for political sophistication, goes up, as this would mirror a pattern from other cases of misperceptions. However, the expected interactive effect does not emerge.)
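A first-pass check like the one in the note above amounts to a cross-tabulation of “Yes” rates by party and education. A minimal sketch on made-up rows; the field names (“party”, “educ”, “says_yes”) and values are hypothetical placeholders, not the actual panel variables:

```python
from collections import defaultdict

def yes_rate_by_group(rows, keys):
    """Share of rows with says_yes == True within each combination of keys.
    Field names here are hypothetical placeholders for illustration."""
    counts = defaultdict(lambda: [0, 0])  # group -> [n_yes, n_total]
    for r in rows:
        g = tuple(r[k] for k in keys)
        counts[g][0] += int(r["says_yes"])
        counts[g][1] += 1
    return {g: n_yes / n for g, (n_yes, n) in counts.items()}

# toy data: does the "Yes" rate among Republicans rise with education?
rows = (
    [{"party": "R", "educ": "hs", "says_yes": True}] * 5
    + [{"party": "R", "educ": "hs", "says_yes": False}] * 5
    + [{"party": "R", "educ": "college", "says_yes": True}] * 6
    + [{"party": "R", "educ": "college", "says_yes": False}] * 4
)
rates = yes_rate_by_group(rows, ["party", "educ"])
# rates[("R", "hs")] -> 0.5, rates[("R", "college")] -> 0.6
```

A formal version of the test would fit a regression with a party × education interaction term, but the cross-tab conveys the shape of the comparison.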
To round out the subgroup breakdowns: along age, few notable differences exist beyond younger individuals expressing more uncertainty. Those with the highest education level (college+) are most likely to answer the question correctly (saying “No”). The greater rate of “Not Sure” responses among women fits with past political science research and suggests this item operates like a political knowledge question.
Takeaways and Caveats
Results here show that a sizable percentage of the American public, nearing a majority, believes the 2020 U.S. Census will include a citizenship question, months after the Supreme Court ruled the Trump administration could not add it to the Census. This is important in light of past evidence showing the question’s inclusion dampens reported intent to take the Census among Hispanics. At the same time, a partisan dimension to this belief exists, as over half of Republicans believe the next Census will include a citizenship question.
A few caveats for these results are worth keeping in mind. First, although I applied weights, this is still not a purely nationally representative sample, so I would warn against an exaggerated focus on the precise population rates. Second, acquiescence bias and general uncertainty about the Census could distort results: people might be prone to offer an affirmative response when pushed to give an answer. Because citizenship seems like just another personal question that would plausibly appear on the Census, its inclusion might sound like the right answer to many people regardless of whether it was debated heavily in the last year. This means the rate of belief I find is not solely attributable to the Trump administration’s recent effort to add the question to the Census.
In the realm of misperceptions, researchers sometimes fabricate a survey item to see whether respondents will express belief in anything, and compare that to belief rates for legitimate items (see here and here for examples). In hindsight, and motivated by these strategies, I should have asked respondents whether they thought other items would be on the Census: a bogus item as well as a more believable one, such as religion.
I appreciate all the discussion and critique surrounding my New York Times article that came out earlier this week. It has made me think harder about the structure of my study design and implications of my findings, and I now realize where I could have been clearer in presentation of results (though of course, there are limits to how much detail and rationale I can include in a general audience piece). I wanted to briefly clarify some things in this blog post, and specifically address three common critiques that arose:
Over at The Upshot (New York Times), I wrote about a recent survey experiment that I ran. I randomly exposed people to either a short summary of recent Democratic debates—and the various left-leaning proposed policies—or other unrelated political content, and measured various reactions regarding the 2020 election. Most notably, I find evidence of backlash among Independents in their intended vote choice: they move against the eventual Democratic nominee after reading about the sharp left turn taken by current Democratic candidates. In this post, I wanted to record details on the survey experimental design and methods.
9/7/2019 edit: I recently discussed some of these findings in an article for the Washington Post, “How Joe Biden attracts both black voters and racially ‘resentful’ voters.” I wanted to add a few things about this and related analysis.
In 2016, the CCES did not include its typical battery of racial resentment items on the Common Content portion of the survey. To fill this gap, I searched for CCES team modules that contained racial resentment items, enabling me to extend the time series of racial resentment levels in the CCES. Several people have asked for some of the information behind my data collection effort — to make this easily accessible for everyone, I recently posted a Github repository with a few hopefully useful components:
You can find everything here.