Does “US News” check its ranking? (opinion)




Ronald Reagan used the phrase “Trust, but verify” to describe his position on nuclear disarmament discussions with the Soviet Union.

His use of the phrase was brilliant on several levels. Talking about trusting an adversary was on some level an expression of good faith, but the addition of verification made clear that any idealism was tempered by a dose of realism. The added genius of applying the phrase to the Soviets was that it was adapted from a Russian proverb, “Doveryai, no proveryai.”

The ongoing trials in the Operation Varsity Blues scandal are a reminder of the fine line between trust and verification in college admission. Colleges trust applicants to be honest and truthful in what they report in their applications. While we wouldn’t want that to change, Operation Varsity Blues serves as a cautionary tale. Widespread fraud, including elaborate fabricated résumés for sports the students involved did not even play, went undetected by admissions offices. Fool me once, shame on you. Fool me twice …

The trials aren’t the only headline-grabbing news that provides a test case for the interplay of trust and verification. Last week U.S. News & World Report published its annual “America’s Best Colleges” rankings. I have since received numerous emails from universities touting their rankings, and my local newspaper ran its annual article treating small changes in local institutions’ rankings as if they were big news.

This year there was considerable speculation about how US News would handle test scores in its ranking recipe, given the spread of test-optional policies during the last admission cycle. US News resisted calls to remove test scores from its formula. Colleges receive full credit for test scores if at least 50 percent of entering students reported scores (the figure was previously 75 percent). Colleges where fewer than 50 percent of entering students submitted scores had the impact of scores on their ranking discounted by 15 percent. According to US News, this affected 4 percent of institutions.

The emphasis on how many places Wossamotta U (Bullwinkle J. Moose’s alma mater) was able to move up or down in the rankings, and the attention paid to minor changes in the US News methodology, may obscure a more important question.

Over the weekend, I researched the relationship between admissions selectivity (rejection is perhaps the better term) and prestige, thinking about how the number of applications and the admission rate shape institutional behavior. During that research I came across a US News list of the 100 colleges with the lowest acceptance rates, according to the 2022 rankings.

This list included eight institutions with reported admission rates below 20 percent that I found surprising. Alice Lloyd College in Pippa Passes, Ky., is listed as having an admission rate of 7 percent, which apparently makes it as selective as the Massachusetts Institute of Technology and Yale University. Other surprises include the University of Science and Arts of Oklahoma (13 percent); College of the Ozarks in Missouri, Limestone University in South Carolina and Ottawa University in Kansas, all at 14 percent; Wiley College in Texas and Bacone College in Oklahoma (15 percent); and Texas Wesleyan University (19 percent).

As I said, I was surprised, and perhaps even suspicious, at these numbers. All are regional institutions that play a valuable role in the higher education landscape, but it seems odd that they would be as selective as the national universities and liberal arts colleges that populate the top of the US News lists.

Eight or nine years ago, I remember some colleges engaging in creative accounting to lower their admission rates, counting inquiries as applications. Around that time, one college corrected the data it had reported on applications received and students admitted, which changed its admission rate from 27.4 percent to 89.1 percent. The institution explained the gap as “counting in a different way.” US News subsequently placed the college in the “unranked” category. For the record, I wish US News would place all colleges and universities in the unranked category.

I was intrigued by the low admission rates reported for these eight schools and decided to follow up by comparing the US News data with each school’s figures in the Common Data Set (a collaborative initiative jointly sponsored by the College Board, US News and Peterson’s) and in IPEDS (the Integrated Postsecondary Education Data System), run by the National Center for Education Statistics, which is part of the US Department of Education. Any institution receiving federal aid is required to report data in a number of areas, and I suspect reporting false information carries significant consequences.

It probably won’t surprise readers that I found discrepancies between what US News shows and what was reported to IPEDS; there would be no reason to write about it if all the data squared. With a few exceptions, what each college reported to IPEDS differs widely from what US News shows.

According to IPEDS data for 2019-20, Alice Lloyd’s admission rate is 28 percent, not 7 percent. Limestone’s rate is 51 percent instead of 14 percent, Bacone’s is 72 percent instead of 15 percent, Texas Wesleyan’s is 42 percent instead of 19 percent, and the University of Science and Arts of Oklahoma’s is 36 percent instead of 13 percent. Wiley College is listed in IPEDS as open enrollment. That’s quite an accomplishment: an open-enrollment institution with a 15 percent admission rate.
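To make the gaps concrete, here is a minimal sketch in Python of the kind of side-by-side comparison involved. The rate figures are simply the ones quoted above, typed in by hand for illustration; nothing is pulled programmatically from US News or IPEDS.

```python
# Hand-transcribed comparison of the admission rates quoted above.
# Illustrative only -- not a live pull from US News or IPEDS.

us_news_rates = {
    "Alice Lloyd College": 7,
    "Limestone University": 14,
    "Bacone College": 15,
    "Texas Wesleyan University": 19,
    "University of Science and Arts of Oklahoma": 13,
}

ipeds_rates = {  # IPEDS, 2019-20
    "Alice Lloyd College": 28,
    "Limestone University": 51,
    "Bacone College": 72,
    "Texas Wesleyan University": 42,
    "University of Science and Arts of Oklahoma": 36,
}

for school, us_news in us_news_rates.items():
    ipeds = ipeds_rates[school]
    print(f"{school}: US News {us_news}% vs. IPEDS {ipeds}% "
          f"(a gap of {ipeds - us_news} percentage points)")
```

Even the smallest of these gaps is 21 percentage points.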

There are two outliers among the outliers, both of which share an interesting characteristic. Ottawa University in Kansas actually appears on the US News top-100 list twice, once at 14 percent and once at 24 percent. Ottawa has an online component as well as satellite campuses in Overland Park, Kan.; Milwaukee; Phoenix; and Surprise, Arizona. The main campus reports an admission rate of 15 percent but a yield rate of 66 percent.

College of the Ozarks in Point Lookout, Missouri, a conservative Christian institution that bills itself as “Hard Work U,” exhibits a similarly interesting statistical anomaly. Its admission rate as reported to IPEDS is 10 percent, actually lower than the figure US News credits it with, but it also reports a yield of 91 percent. I am by no means an expert in statistics, but the combination of an extremely low admission rate and an extremely high yield suggests an admission process different from that of most other institutions.
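For readers who, like me, are not statisticians, a bit of back-of-the-envelope arithmetic shows why that combination stands out. The sketch below assumes a purely hypothetical pool of 2,000 applicants and applies the 10 percent admission rate and 91 percent yield reported to IPEDS; only the two rates come from the data discussed here.

```python
# Back-of-the-envelope funnel arithmetic for the figures reported to IPEDS.
# The applicant-pool size is hypothetical; only the two rates come from the column above.

applicants = 2000          # hypothetical pool size, for illustration only
admit_rate = 0.10          # admission rate reported to IPEDS
yield_rate = 0.91          # yield reported to IPEDS

admitted = applicants * admit_rate
enrolled = admitted * yield_rate

print(f"Admitted: {admitted:.0f} of {applicants} applicants")
print(f"Enrolled: {enrolled:.0f} of {admitted:.0f} admits")
# Roughly 9 in 10 admitted students enroll -- a yield that only a handful of the
# most selective national universities approach, which is why pairing it with a
# very low admit rate suggests an unusual admission process.
```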

I contacted US News to see if there was an explanation for the discrepancies. A spokesperson responded by pointing out that “the acceptance rate is not part of the methodology” and added the following note about the US News approach to quality assurance:

“For quality assurance, ranking data that schools reported to US News were algorithmically compared with their submissions from previous years to flag statistical outliers and material changes. Respondents were required to review, possibly revise and verify all reported data in order to submit their surveys. For the third year in a row, they were also asked to have a senior academic official sign off on the accuracy of the data. Schools that refused to take this step could still be ranked but display a footnote on their US News profile on usnews.com. After submission, US News assessed the veracity of the submitted data factor by factor and contacted selected schools to confirm or revise the data. Schools that did not respond or were unable to confirm the accuracy of their data may have had the data in question unpublished and not used in the calculations.”

If I read that correctly, US News uses an algorithm that flags significant changes in data from year to year and then asks institutions to revise the data as needed. But what about data that doesn’t change drastically? Did US News try to verify all the information submitted (which would obviously be a huge job), or does it run on an honor system, hoping that institutions respond honestly?
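US News does not publish the details of that algorithm, so the sketch below is only a guess at the general shape such a check might take: flag any metric whose reported value moves more than some threshold from the prior year’s submission. The threshold, function name and sample figures are all hypothetical; the 27.4-to-89.1 jump is borrowed from the corrected-data episode described earlier.

```python
# A hypothetical sketch of a year-over-year outlier check of the sort the
# US News statement describes. Threshold and sample data are made up.

from typing import Dict, List, Tuple

def flag_material_changes(
    prior: Dict[str, float],
    current: Dict[str, float],
    threshold: float = 0.20,   # flag moves of more than 20 percent (arbitrary)
) -> List[Tuple[str, float, float]]:
    """Return (metric, prior value, current value) for metrics that changed materially."""
    flagged = []
    for metric, new_value in current.items():
        old_value = prior.get(metric)
        if old_value in (None, 0):
            continue  # no baseline to compare against
        if abs(new_value - old_value) / abs(old_value) > threshold:
            flagged.append((metric, old_value, new_value))
    return flagged

# An admission rate that jumps from 27.4 percent to 89.1 percent is flagged;
# a figure that was wrong last year and is equally wrong this year sails through.
print(flag_material_changes(
    prior={"admission_rate": 27.4, "six_year_grad_rate": 61.0},
    current={"admission_rate": 89.1, "six_year_grad_rate": 62.0},
))
```

That limitation is exactly the concern raised above: a comparison against last year’s submission can only catch numbers that change, not numbers that are consistently wrong.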

The bigger issue here is not whether the acceptance rate is part of the ranking methodology, but why the US News data does not match IPEDS data. Is the admission data an anomaly, or is there other questionable data US News uses in its ranking methodology? Where is the line between trust and verification? And should we trust rankings based on self-reported, unverified data?

Editor’s note: Inside Higher Ed reached out to US News for comment on this column, and Robert Morse, who heads the magazine’s rankings project, responded by email: “When schools submit data to US News, they are instructed to ask a senior university official to validate the accuracy of the data.”
