(Originally published in Inside Higher Ed’s “Admissions Insider,” September 20, 2021)

Ronald Reagan famously used the phrase “Trust, but verify” to describe his posture toward discussing nuclear disarmament with the Soviet Union. 

 

His use of that phrase was brilliant on a couple of levels.  Talking about trusting an adversary was on one level an expression of good faith, but adding verification made it clear that any idealism was also tempered with a dose of realism.  The additional genius of the phrase as applied to the Soviets was that it was adapted from a Russian proverb, “Doveryai, no proveryai.”

 

The Operation Varsity Blues trial currently taking place serves as a reminder of the fine line between trust and verification in college admission.  Colleges trust applicants to be truthful in what they list on their applications.  While we wouldn’t want that to change, Operation Varsity Blues serves as a cautionary tale.  The widespread fraud, including the construction of elaborate, false athletic resumes for sports the students involved didn’t even play, was not uncovered by admission offices.  Fool me once, shame on you.  Fool me twice…

 

The trial is not the only “ripped from today’s headlines” story that provides a test case for the interplay of trust and verification.  Last week U.S. News & World Report published its annual “America’s Best Colleges” rankings.  I have since received numerous emails from colleges trumpeting their rankings, and my local newspaper has published its annual story highlighting small changes in local institutions’ rankings as if they signified major news.

 

This year there was considerable speculation about how U.S. News would treat test scores in its rankings recipe, given the rise of test-optional policies during the last admissions cycle.  U.S. News resisted calls to remove test scores from the formula.  Colleges receive “full credit” for test scores if at least 50% of entrants reported scores (the threshold had previously been 75%).  Colleges where fewer than 50% of entrants submitted scores had the impact of scores on their ranking “discounted” by 15%.  According to U.S. News, that affected 4% of institutions.

 

The focus on how many places Wossamotta U. (alma mater of Bullwinkle J. Moose) may have moved up or down in the rankings, and the attention given to minor changes in U.S. News methodology, may be obscuring a more important issue.

 

Over the weekend I was doing research on the relationship between admissions selectivity (rejectivity may be the better term) and prestige, thinking about how the number of applications and the admit rate drive institutional behavior.  In the course of that research I stumbled upon a U.S. News list of the 100 colleges with the lowest acceptance rates according to the 2022 rankings.

 

That list included eight institutions with admit rates below 20% that I found surprising.  Alice Lloyd College in Pippa Passes, Kentucky, was listed as having a 7% admit rate, making it seemingly as selective as MIT and Yale.  The other surprises included the University of Science and Arts of Oklahoma (13%); the College of the Ozarks in Missouri, Limestone University in South Carolina, and Ottawa University in Kansas, all at 14%; Wiley College in Texas and Bacone College in Oklahoma (both 15%); and Texas Wesleyan University (19%).

 

As already stated, I was surprised by, and perhaps even suspicious of, those numbers.  All are regional institutions that serve a valuable role in the landscape of higher education, but it seems odd that they would be as selective as the national universities and liberal arts colleges that populate the U.S. News list. 

 

I recall that eight or nine years ago some colleges did creative accounting to lower their admit rates, counting inquiries as applications.  At that time one college corrected its data on applications received and students admitted, changing its admit rate from 27.4% to 89.1%.  That institution explained the discrepancy as “counting in a different way.”  U.S. News subsequently moved that college into the “Unranked” category.  For the record, I wish U.S. News would place all colleges and universities in the Unranked category.
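
The arithmetic behind that kind of correction is easy to reproduce.  Here is a minimal sketch; the raw counts below are invented for illustration, and only the two rates come from the reported correction:

```python
# Hypothetical illustration of how counting inquiries as applications
# deflates an admit rate. The raw counts are invented; only the two
# rates (89.1% and 27.4%) come from the reported correction.
admits = 891          # hypothetical admitted students
applications = 1000   # hypothetical genuine applications
inquiries = 2253      # hypothetical inquiries counted as applications

honest_rate = admits / applications                   # 0.891
creative_rate = admits / (applications + inquiries)   # ~0.274

print(f"Corrected admit rate:  {honest_rate:.1%}")    # 89.1%
print(f"'Creative' admit rate: {creative_rate:.1%}")  # 27.4%
```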

 

I was intrigued by the reported low admit rates for those eight schools and decided to follow up by comparing the U.S. News data with the data each school reported to the Common Data Set (a collaborative initiative jointly sponsored by the College Board, U.S. News, and Peterson’s) and to IPEDS (the Integrated Postsecondary Education Data System), an arm of the National Center for Education Statistics, which is part of the U.S. Department of Education.  Any institution receiving federal aid is required to report data in a number of areas, and I assume there are significant consequences for reporting false information.
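
For readers who want to replicate the comparison, the check is mechanical once the IPEDS admissions file is in hand.  The sketch below is a rough outline of that kind of check, not a finished tool; the file name and the column names (IPEDS publishes applicant and admit counts under variables I believe are called APPLCN and ADMSSN, alongside the institution name INSTNM) should be treated as assumptions to verify against the actual download:

```python
# Rough sketch: compare U.S. News admit rates against IPEDS data.
# Assumes a downloaded IPEDS admissions file with columns INSTNM
# (institution name), APPLCN (applicants), and ADMSSN (admits);
# verify these names against the actual file before running.
import pandas as pd

# U.S. News figures, typed in by hand from the published list
usnews_rates = {
    "Alice Lloyd College": 0.07,
    "Limestone University": 0.14,
    "Bacone College": 0.15,
    "Texas Wesleyan University": 0.19,
}

ipeds = pd.read_csv("adm2019.csv")  # hypothetical file name
ipeds["admit_rate"] = ipeds["ADMSSN"] / ipeds["APPLCN"]

for name, usnews_rate in usnews_rates.items():
    row = ipeds.loc[ipeds["INSTNM"] == name]
    if row.empty:
        print(f"{name}: not found in IPEDS file")
        continue
    ipeds_rate = row["admit_rate"].iloc[0]
    flag = "  <-- discrepancy" if abs(ipeds_rate - usnews_rate) > 0.05 else ""
    print(f"{name}: U.S. News {usnews_rate:.0%} vs. IPEDS {ipeds_rate:.0%}{flag}")
```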

 

It will probably not surprise readers that I found discrepancies between what U.S. News shows and what was reported to IPEDS; there would be no reason to write about this if all the data squared.  With a couple of exceptions, the IPEDS reporting for each college varies significantly from what U.S. News shows.

 

According to the IPEDS data for 2019-20, Alice Lloyd’s admit rate is 28%, not 7%.  Limestone’s is 51% rather than 14%, Bacone’s is 72% rather than 15%, Texas Wesleyan’s is 42%, not 19%, and the University of Science and Arts of Oklahoma’s is 36% rather than 13%.  Wiley College is listed in IPEDS as open enrollment.  That’s quite an accomplishment: an open-enrollment institution with a 15% admit rate.

 

There are two outliers among the outliers, both of which share an interesting characteristic: an improbably low admit rate paired with an unusually high yield.  Ottawa University in Kansas actually shows up on the U.S. News Top 100 list twice, once at 14% and once at 24%.  Ottawa has an online component as well as satellite campuses in Kansas City, Milwaukee, Phoenix, and Surprise, Arizona.  The main campus reports an admit rate of 15% but a yield rate of 66%.

 

The College of the Ozarks in Point Lookout, Missouri, a conservative Christian institution that brands itself as “Work Hard U,” shows a similarly interesting statistical anomaly.  Its admit rate as reported to IPEDS is 10%, actually lower than the 14% U.S. News credits it with, but it also reports a yield of 91%.  I’m by no means a statistical expert, but that extremely low admit rate combined with that extremely high yield suggests a different kind of admissions process than at most other institutions.
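
To see why that pairing stands out, it helps to run the numbers on a hypothetical applicant pool (the pool size below is invented; the two rates are the IPEDS figures just cited):

```python
# Why a 10% admit rate plus a 91% yield is unusual: nearly every
# admitted student enrolls. The pool size is hypothetical; the two
# rates are the IPEDS figures discussed above.
applicants = 1000   # hypothetical applicant pool
admit_rate = 0.10   # IPEDS-reported admit rate
yield_rate = 0.91   # IPEDS-reported yield

admitted = applicants * admit_rate   # 100 admits
enrolled = admitted * yield_rate     # ~91 enrollees

print(f"Of {applicants} applicants, {admitted:.0f} are admitted "
      f"and {enrolled:.0f} enroll.")
# Even the most selective national universities rarely yield
# anywhere near 91%; a funnel this tight hints that applicants
# are screened before a formal application is ever counted.
```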

 

I contacted U.S. News to see whether there was an explanation for the discrepancies.  A spokesperson responded by pointing out that “acceptance rate is not part of the methodology” and added this note about U.S. News’s approach to quality assurance:

 

“For quality assurance, rankings data that schools reported to U.S. News were algorithmically compared with previous years' submissions to flag large change statistical outliers. Respondents were required to review, possibly revise and verify any flagged data to submit their surveys. For the third year in a row, they were also instructed to have a top academic official sign off on the accuracy of the data. Schools that declined to do this step could still be ranked but display a footnote on their U.S. News profile on usnews.com.

“After submitting, U.S. News assessed the veracity of data submitted on a factor-by-factor level and contacted select schools to confirm or revise data. Schools that did not respond or were unable to confirm their data's accuracy may have had the data in question unpublished and unused in the calculations.”

 

 

If I am reading that correctly, U.S. News uses an algorithm that flags large year-to-year changes in reported data and then has institutions review and, if necessary, revise the flagged figures.  But what about data that doesn’t change dramatically?  Does U.S. News attempt to verify all the information submitted (which would obviously be a huge job), or does it operate on an honor system, trusting that institutions will answer truthfully?
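
U.S. News doesn’t describe its flagging algorithm, but the check it outlines could be as simple as the sketch below.  The 20-point threshold and the data layout are my own assumptions, offered only to make the blind spot concrete:

```python
# A guess at what "algorithmically compared with previous years'
# submissions to flag large change statistical outliers" might mean.
# The 20-point threshold and data layout are assumptions; U.S. News
# has not published its actual method.

def flag_large_changes(prev: dict, curr: dict, threshold: float = 0.20) -> list:
    """Return the metrics whose year-over-year change exceeds the threshold."""
    flagged = []
    for metric, new_value in curr.items():
        old_value = prev.get(metric)
        if old_value is not None and abs(new_value - old_value) > threshold:
            flagged.append(metric)
    return flagged

last_year = {"admit_rate": 0.28, "yield": 0.30}
this_year = {"admit_rate": 0.07, "yield": 0.30}  # big drop gets flagged
print(flag_large_changes(last_year, this_year))  # ['admit_rate']

# The blind spot: an institution that reports the same wrong number
# every year produces no large change and never triggers a flag.
```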

 

The larger issue here is not whether acceptance rate is part of the ranking methodology, but why the U.S. News data doesn’t match the IPEDS data.  Is the admit data an anomaly, or is there other questionable data U.S. News uses in its ranking methodology? Where is the line between trust and verification? And should we trust rankings based on data that is self-reported and unverified?