U.S. News Data is Skewed Towards the Richest Institutions
Do you look at college rankings to decide which college to attend or where to send your kid? Stop it! It’s a waste of time, and the information is not reliable. That’s the opinion of Valerie Strauss, education reporter for the Washington Post. Strauss examined the 2018 U.S. News & World Report annual rankings and concluded that the relevance of the data-based rankings is questionable. It’s a case of “garbage in, garbage out” [my words, not hers]. In other words, the devil is in the details.
One problem with these rankings is those who do the ranking decide on the criteria to use. It’s hard to imagine schools are on a level playing field with respect to these rankings. Here are the rankings in case you care.
These are the top 10 national universities in the rankings (notice that ties stretch the list to 11 schools):
- Princeton University
- Harvard University
- Columbia University
- Massachusetts Institute of Technology
- University of Chicago
- Yale University
- Stanford University
- Duke University
- University of Pennsylvania
- Johns Hopkins University
- Northwestern University
And these are the top 10 for national liberal arts colleges:
- Williams College
- Amherst College
- Swarthmore College
- Wellesley College
- Bowdoin College
- Carleton College
- Middlebury College
- Pomona College
- Claremont McKenna College
- Davidson College
From my point of view, the criteria fail to consider some of the most important ingredients in judging the worth of a college. For example, how many hours of direct contact do students have with faculty? If you go to a college where teaching assistants teach many of the mega-size classes, that’s not a good thing. Students need to learn from their professors, many of whom have deep knowledge of their fields.
Another fault, I believe, is that the ranking doesn’t (and really can’t) measure work ethic. Do students graduating from the highest-ranked colleges have a strong work ethic? Isn’t that a key ingredient in success in life?
Enough said about that. Let’s look at the ranking criteria used by U.S. News:
- Graduation Rates (35%). This is measured over a six-year period. Why six years? Whatever happened to four years? Might graduating in four indicate a stronger work ethic? Maybe not. Maybe the six-year grads worked during their college years, so it took six years. There’s no way to measure this important factor.
- Faculty Resources (20%). This looks at class size, faculty salary, faculty with the highest degrees in their fields, and student-faculty ratio (does this mean teaching faculty, or does it include researchers who teach one class a year?). It also includes the proportion of faculty who are full-time. There’s some bias here toward the richest universities, but does that make them the best?
- Expert Opinion (20%). Supposedly, this is from “top academics.” I consider myself a top academic; no one asked me. Enough said.
- Financial Resources (10%). Another benefit for the rich institutions.
- Student Excellence (10%). A new part of this ranking is “social-mobility indicators.” Don’t ask.
- Alumni Giving (5%). See my comments on Faculty Resources and Financial Resources above.
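To see how heavily these weights tilt the outcome, here is a minimal sketch of how a weighted composite score could be computed from the six category weights listed above. The per-category scores for the fictional school are made up for illustration; U.S. News does not publish its scoring code, so this is an assumption about the general weighted-sum approach, not its actual method.

```python
# Category weights as quoted in the article (they sum to 1.0).
WEIGHTS = {
    "graduation_rates": 0.35,
    "faculty_resources": 0.20,
    "expert_opinion": 0.20,
    "financial_resources": 0.10,
    "student_excellence": 0.10,
    "alumni_giving": 0.05,
}

def weighted_score(scores: dict) -> float:
    """Weighted sum of per-category scores (each on a 0-100 scale)."""
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

# Invented scores for a hypothetical school, purely for demonstration.
example = {
    "graduation_rates": 92,
    "faculty_resources": 85,
    "expert_opinion": 78,
    "financial_resources": 90,
    "student_excellence": 88,
    "alumni_giving": 60,
}

print(round(weighted_score(example), 2))  # prints 85.6
```

Note how Graduation Rates alone (35%) outweighs Financial Resources, Student Excellence, and Alumni Giving combined (25%), so a school strong in the resource-heavy categories can still coast on one dominant number.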
If you still need convincing that these rankings are worthless: on January 7 it was announced that Temple University settled a class-action lawsuit brought by students who were outraged to learn that the business school’s top U.S. News ranking for its online MBA program was based on false data. The university will pay $4 million to students in that program and in several other master’s programs for the false and misleading information.
Finally, I have to question what was in the minds of Temple administrators when they created a $5,000 scholarship “for a student who demonstrated interest in the study of ethics who is enrolled in any of the programs that are part of the settlement.” Can they afford the $5,000 with an endowment of $513.6 million as of 2016?