
Finding Ukraine on a map, revisited

- April 18, 2014

Joshua Tucker: Last week we featured a guest post from political scientists Kyle Dropp, Joshua D. Kertzer, and Thomas Zeitzoff concerning Americans’ ability to place Ukraine on a map and how this was correlated with attitudes toward the appropriate U.S. policy response to the Ukraine crisis. We invited the authors to respond to questions raised about their original post. The authors have also made available a more detailed and technical description of their methodology and findings here.
*****
We’re encouraged that our blog post last Monday exploring the extent to which Americans can put Ukraine on a map has received so much attention. In a year when lawmakers and pundits have expressed skepticism about the contribution and relevance of the social sciences, we think the fact that a post by three political scientists topped The Washington Post’s most-read list demonstrates that there’s a very real appetite for our discipline’s work.
Where’s Ukraine? Each dot depicts the location where a U.S. survey respondent situated Ukraine; the dots are colored based on how far removed they are from the actual country, with the most accurate responses in red and the least accurate ones in blue. (Data: Paragon Insights and Survey Sampling International, Inc.; Figure: Thomas Zeitzoff/The Monkey Cage)
In the previous post, we argued three points. First, many Americans have trouble placing Ukraine on a map (see the map above). Second, accuracy varies across some subgroups (the more education you have, for example, the better your chance of getting it right) but not others (Republicans and Democrats fare about the same). Third, the less accurately respondents placed Ukraine, the more supportive they were of U.S. military intervention.
Below, we describe why political knowledge matters, discuss our survey design, and reflect on the implications of our findings. A more in-depth discussion of these issues is here.
1. Why we think this is important
Why does it matter whether Americans can put Ukraine on a map?
Researchers studying political attitudes care deeply about the processes underlying those attitudes, such as partisanship, trust, political knowledge, and tolerance. And political scientists have shown that knowledge is related to a wide range of attitudes and behaviors. A key assumption in many of the existing theories that seek to explain variation in military conflict is that democratic publics can constrain their leaders’ actions in the foreign policy realm. However, these theories assume citizens have the knowledge to hold their leaders accountable for their actions. If, as some have argued, foreign policy is far removed from most people’s daily lives, then the public’s ability to serve as a check on leaders’ foreign policy is greatly diminished.
What do we think is going on between geographic knowledge and attitudes about the use of force?
We are still in the early stages of explaining why Americans with lower foreign policy knowledge are more supportive of using military force in Ukraine. One possible explanation is that our Ukraine distance measure is a proxy for overall knowledge and news consumption. Americans who place Ukraine closer to its actual location may have seen news on the subject, may be aware that elites (e.g., members of Congress, the president) on both sides are hesitant to engage in military action, or may have an elevated sense of the overall costs of action in Ukraine (in terms of casualties). Less informed Americans may not have seen that elites on both sides are hesitant to engage in military action. Along these lines, we observe an absence of partisan differences once we control for general foreign policy attitudes, a finding that is consistent with this explanation.
This reflects an ongoing debate in political science: we know that knowledge matters, and that voters who can name the vice president, for example, behave differently from voters who can’t, but it is less clear whether this difference stems from knowledgeable voters putting their command of facts to good use or from certain types of people being more likely to have that knowledge in the first place.
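The post does not spell out how the Ukraine distance measure is computed. Purely as an illustration, here is a minimal sketch, assuming a great-circle (haversine) distance from a respondent’s clicked coordinates to an approximate centroid of Ukraine; the centroid and the example click below are hypothetical values, not taken from the survey data.

```python
import math

# Hypothetical illustration of a distance-based accuracy measure.
# The authors do not publish their exact calculation; this sketch simply
# measures how far a clicked point falls from an approximate centroid of Ukraine.

UKRAINE_CENTROID = (48.4, 31.2)  # approximate latitude/longitude (assumed)
EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def distance_to_ukraine(click_lat, click_lon):
    """Distance (km) from a clicked map location to Ukraine's approximate centroid."""
    return haversine_km(click_lat, click_lon, *UKRAINE_CENTROID)

# Example: a click on central Kazakhstan (about 48N, 67E) lands roughly 2,600 km away.
print(round(distance_to_ukraine(48.0, 67.0)))
```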
2. Survey design and measurement
How was the survey designed?
This poll was conducted March 28-31, 2014, with a national sample of 2,066 registered voters. The survey was programmed in Qualtrics online survey software, conducted by Paragon Insights, and fielded online by Survey Sampling International, Inc. (SSI). The data were weighted to approximate a target sample of registered voters based on age, race/ethnicity, gender, educational attainment, geographic region, annual household income, homeownership status, and marital status. The results we reported use weighted data, but the findings are the same with the raw, unweighted data. The full survey has a margin of error of plus or minus two percentage points.
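As a rough check on the reported margin of error, the standard formula for a proportion under simple random sampling at 95 percent confidence, ignoring any design effect from weighting, gives a figure on the order of the two points reported above. A minimal sketch of that calculation:

```python
import math

# Margin of error for a proportion under simple random sampling at 95% confidence.
# This ignores the design effect of weighting, so the effective margin may be somewhat larger.
n = 2066   # Survey 2 sample size (registered voters)
p = 0.5    # conservative (worst-case) proportion
z = 1.96   # z-score for a 95% confidence interval

moe = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {100 * moe:.1f} percentage points")  # about 2.2
```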
The March 28-31, 2014, survey (Survey 2) is a replication of another survey we conducted from March 21-24, 2014, (Survey 1) on a separate national sample of 1,997 registered voters. Levels of support for U.S. actions on Survey 2 closely matched those from Survey 1. The principal difference is that in Survey 1 we used a map of Eurasia, and in Survey 2 we used a map of the entire world.
Is it strange that Americans have trouble putting Ukraine on a map?
Political scientists have known for a long time that people don’t know very much about international politics. This shouldn’t be surprising: Foreign policy issues are far removed from most people’s daily lives. What people know about world affairs seems to be associated with what they think; in other research we’ve done, for example, we’ve found that higher-knowledge respondents are more likely to react to events on the world stage, simply because they’re more likely to be aware of them in the first place. One reason why our results may seem so striking is because people who are relatively politically engaged (e.g., the sort of person who reads The Monkey Cage in their spare time) tend to interact with other people who are relatively politically engaged, and thus overestimate how much other people know.
How would people do in other countries?
We don’t have data on how people in other countries would do on our map quiz (yet), but a 1994 survey found that Americans tend to be less knowledgeable about foreign affairs than citizens in four other western democracies. Other, more recent studies in education research echo these findings about Americans’ comparatively weak geographic literacy relative to other industrialized countries. In ongoing research, we are asking approximately 1,000 residents of Britain to identify many of the same countries on the same world map.
Would Americans do better with other maps?
This is a question we’re still exploring. Here is a map of where Americans place Britain, where respondents do relatively well.

Where’s the United Kingdom? Each dot depicts the location where a U.S. survey respondent situated the United Kingdom. (Data: Paragon Insights and Survey Sampling International, Inc.; Figure: Thomas Zeitzoff/The Monkey Cage)

These results are from a poll conducted April 3-6, 2014, also fielded by Survey Sampling International, Inc., in which one in six respondents was asked to locate Britain on a world map (hence the smaller number of dots on the map). The results also suggest that Americans are more familiar with the geography of some countries than others (in this case, perhaps because of the special relationship between the United States and Britain).
How depressing are these results?
What you take away from these results depends on where you sit — the Kazakhstani news site TengriNews, for example, summarized the findings with the headline “Americans confuse Ukraine and Kazakhstan on world map.”
Yet we are a little more sanguine about the findings. Although there were some dramatic outliers, most respondents were reasonably close. If we look at the results as a sign of the wisdom of crowds, the modal response was quite accurate. More generally, since survey respondents often don’t try very hard, it is difficult to tell whether the respondents who clicked on Brazil did so because they had no idea and were clicking randomly or because they genuinely thought Ukraine was in South America. Future surveys and experiments will attempt to tease out these differences and help us understand the connection between foreign policy knowledge and public opinion about the use of force.
*****
Past Monkey Cage posts on developments in Ukraine, Russia and Crimea can be found by clicking here.  Recent posts include:
Tomila Lankina and Kinga Niemczyk: What Putin gets about soft power
Maria Popova: What is lustration and is it a good idea for Ukraine to adopt it?
Kyle Dropp, Joshua D. Kertzer and Thomas Zeitzoff: The less Americans know about Ukraine’s location, the more they want U.S. to intervene
M. Steven Fish: The end of the Putin mystique
Kimberly Marten: Crimea: Putin’s Olympic diversion
Joshua Tucker: What is motivating Putin?