Michigan State Professor Says the Citizenship Test Lacks Reliability

Study suggests that only 10% of the questions are performing well

Paula Winke, an assistant professor of second language studies at Michigan State University, says a $6.5 million overhaul of the U.S. Naturalization Test appears not to have made the test fairer or more standardized. According to Winke, there's a risk that "depending on which test version you're given, you might pass or fail."

Her findings come from a study published in the winter edition of the journal Language Assessment Quarterly.

Winke said the study's findings show that only 10 percent of the questions are working well, and she finds 13 of them especially problematic. "These 13 questions I would recommend to be dropped, because they're not assessing a core, unique American civics knowledge," she said.

Winke said her data come from a small sample and shouldn't be the final word on the test's reliability, but she recommends that the government publish its own data to prove the test is reliable and equally weighted.

Winke writes in the abstract for the report: "In this study, I investigated the reliability of the U.S. Naturalization Test's civics component by asking 414 individuals to take a mock U.S. citizenship test comprising civics test questions. The analysis estimated how difficult the items are, whether they are interchangeable, and how reliably they measure civics knowledge. In addition, I estimated how uniformly difficult the items are for noncitizens and citizens and how accurate the cutoff score is. Results demonstrated the items vary widely in difficulty and do not all reliably measure civics knowledge. Most items do not function differently for citizens and noncitizens. The cutoff is not as accurate as applied in the operational test. The data revealed that test scores contain construct-irrelevant variance that undermines the overall reliability and validity of the instrument. I discuss these results not only to better understand the civics test but also to recommend how United States Citizenship and Immigration Services could conduct a similar study with the goal of raising the reliability and validity of the test."

To read Winke's study, you need a subscription to Language Assessment Quarterly, but a summary appears in the newspaper article posted here:

http://www.macombdaily.com/articles/2012/03/10/news/doc4f5a48ed0e64d362373529.txt?viewmode=fullstory

Note: This is not the first time the reliability of the citizenship test has been questioned. A National Academies study in 2004 (based on an earlier version of the test) brought up similar concerns:   http://www.nap.edu/openbook.php?record_id=11168&page=1