April 11, 2014
Dear Carolina Community,
We have just posted the executive summary and full reports from the independent analysis of a data set to the new Carolina Commitment website, http://carolinacommitment.unc.edu/. That data set was the basis for public claims about the reading ability of a group of student-athletes at Carolina who had been screened for possible learning differences or learning disabilities between 2004 and 2012.
We took the claims seriously and committed publicly to analyzing the data set. Outside experts examined the data and found no evidence to support the literacy claims that have been widely reported in news media accounts and via social media.
We fully accept that there are many important questions about the best way to balance athletics and academics both here at Carolina and across the nation. It is important for the media and the community to be able to ask tough questions and to have a respectful debate on these issues. Carolina is committed to participating actively in that conversation. We also understand that there are lingering questions about our programs, which is why we initiated an independent inquiry that is ongoing.
We are proud of the work that our student-athletes do every day in the classroom, in the community and on the playing field and are grateful that they represent Carolina.
Chancellor Folt and I thank you for being an engaged and vital part of our community.
James W. Dean, Jr.
Executive Vice Chancellor and Provost
News Release: Outside experts find data set doesn’t support claims about reading levels of student-athletes
Outside experts found no evidence to support public claims about widespread low literacy levels of a group of first-year University of North Carolina at Chapel Hill student-athletes who had been screened for possible learning differences or learning disabilities between 2004 and 2012. Since January, those claims have been widely reported in news media accounts and via social media.
To assess the validity of the claims and the results of an internal review, the University engaged three experts to independently analyze and report on the data set, which was provided on Jan. 13 to the University's provost, the chief academic officer. The data set comprised results of a Scholastic Abilities Test for Adults (SATA) Reading Vocabulary subtest — a 25-question, multiple-choice vocabulary test — given to 176 new student-athletes during the eight-year period.
According to an executive summary, the outside experts “also determined that the majority of the students referenced in the public claims scored at or above college entry level on the SATA Reading Vocabulary subtest. The data set was based on those scores.”
The reports, produced by faculty in psychology or education at the universities of Minnesota and Virginia and Georgia State University, and the executive summary are posted at http://carolinacommitment.unc.edu. The executive summary was produced by the Office of the Provost and approved by the independent experts.
The 176 student-athletes in the data set represented a small fraction of the 1,800 total student-athletes who attended UNC between 2004 and 2012. Those students took the SATA Reading Vocabulary subtest shortly after arriving on campus as part of a screening process to identify possible learning differences or learning disabilities. This is common practice at many NCAA Division I universities.
The University hired the experts based on their knowledge of adult literacy, assessment and measurement in education, and multivariate analysis. They were Dr. Nathan Kuncel, distinguished professor of psychology at the University of Minnesota; Dr. Lee Alan Branum-Martin, associate professor of psychology and co-investigator in the Center for the Study of Adult Literacy at Georgia State University; and Dennis Kramer, assistant professor of higher education at the University of Virginia. (For details, refer to the end of this release.)
Although the experts worked independently of one another, they reached similar conclusions in their reports. The executive summary reported the key findings and results as follows:
- “The SATA RV subtest, a 25-question multiple choice vocabulary test, is not a true reading test and should not be used to draw conclusions about student reading ability.”
- “The data do not support the public claims about the students’ reading ability.”
- “Reading ability should not be reported as grade equivalents.”
- “The difference in demographics between the SATA test norm and the demographics of the UNC student-athletes is important to understanding conclusions that can be drawn from the data.”
- “The SATA subtests were administered in low-stakes settings, meaning that the result of the test had relatively unimportant consequences to the taker. Low-stakes settings are thought to influence test results.”
- “While SATA RV (the 25-question, multiple choice vocabulary subtest) results can be informative as part of screening for learning differences and/or disabilities, they are not accepted by the psychological community as an appropriate measure of reading grade level and literacy.”
Note: The three independent experts are receptive to reporters’ interview requests and prefer to be contacted via email: Nathan Kuncel, email@example.com; Lee Branum-Martin, firstname.lastname@example.org; and Dennis Kramer, email@example.com.
UNC Contact: Karen Moon, (919) 962-8595, firstname.lastname@example.org
Biographical Information: Three Independent Experts
Nathan R. Kuncel is the Marvin D. Dunnette Distinguished Professor of Industrial-Organizational Psychology at the University of Minnesota, where he earned his Ph.D. in industrial-organizational psychology. Kuncel’s research focuses broadly on how individual characteristics (intelligence, personality, interests) influence subsequent academic, work, and life success, as well as on efforts to model and measure success. His research has appeared in Science, Psychological Bulletin, Review of Educational Research, Perspectives on Psychological Science, and Psychological Science. He edited the industrial and organizational section of the three-volume APA (American Psychological Association) Handbook of Testing and Assessment in Psychology. Kuncel received the Cattell Early Career Research Award from the Society of Multivariate Experimental Psychology and the Anne Anastasi Early Career Award from the American Psychological Association – Division 5, Evaluation, Measurement, and Statistics.
Lee Branum-Martin is an associate professor in the Department of Psychology at Georgia State University. He received his Ph.D. in educational psychology at the University of Houston. His research focuses on measurement issues in language and literacy. He is co-investigator in Georgia State’s Center for the Study of Adult Literacy, which received a five-year, $10 million grant from the U.S. Department of Education to support research focusing on ways to improve adult literacy in the United States. His work and additional qualifications can be found on Google Scholar, ResearchGate.net, and Academia.edu.
Dennis Kramer is an assistant professor of higher education at the University of Virginia and program adviser for the intercollegiate athletics master’s concentration. Kramer specializes in advanced quantitative and econometric research methods for studying such issues as the impact of financial aid policies on secondary and postsecondary success and the economic influences on the college choice process. He also has a strong research interest in institutional decision-making and financial trends in college athletics. Kramer has previously worked with the Knight Commission on Intercollegiate Athletics, and his scholarly work has been published in The Chronicle of Higher Education and Inside Higher Ed. He was senior research and policy analyst for the Georgia Department of Education, where he managed policy development and evaluation research. Next fall, Kramer will join the University of Florida as a tenure-track assistant professor and associate director of the Institute of Higher Education.