The appearance of autumnal colors signals the annual onslaught of best-college lists. US News & World Report, which seemingly exists solely as a brand for college rankings, provides one of the better-known lists, along with the more comprehensive Princeton Review lists and the zanier Washington Monthly list. Joe Nocera dives head first into this pile of lists with this rank money quote:
U.S. News likes to claim that it uses rigorous methodology, but, honestly, it’s just a list put together by magazine editors. The whole exercise is a little silly. Or rather, it would be if it weren’t so pernicious.
US News & World Report's ranking methodology has been under attack since the list first appeared. The attacks have merit, as Nocera's column illustrates.
Furthermore, schools game these lists, including public universities like Clemson, selective private colleges like Claremont McKenna, and law schools like the University of Illinois. Other tricks for climbing the rankings include encouraging large numbers of students to apply and ignoring test scores from certain applicant groups.
The rankings remain silly. As with any ranking, the distances between observations have been lost. For example, US News & World Report ranked Yale third behind Harvard and Princeton, but no one knows whether the gap separating Yale from Harvard and Princeton is one point or a hundred. Yale's closeness to the top two remains a mystery.
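To make the point concrete, here is a toy sketch with made-up point totals (the actual scores behind the published rankings are not disclosed this way) showing how sorting discards the gaps:

```python
# Hypothetical, invented point totals -- chosen only to illustrate
# that a ranking looks the same whether the gaps are tiny or huge.
scores = {"Harvard": 98, "Princeton": 97, "Yale": 60}

# Sort school names by score, highest first.
ranked = sorted(scores, key=scores.get, reverse=True)

# Points behind the leader -- the information the ranking throws away.
gaps = {name: scores[ranked[0]] - scores[name] for name in ranked}

print(ranked)  # ['Harvard', 'Princeton', 'Yale']
print(gaps)    # {'Harvard': 0, 'Princeton': 1, 'Yale': 38}
```

The published list shows only the order; whether Princeton trails by one point and Yale by thirty-eight, or all three are within a point of each other, the ranking reads identically.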
Ultimately, these lists tell us nothing about the schools. We want to believe that the Ivy League schools are better than their brethren who comprise the New England Small College Athletic Conference. The comparison, though, is apples (e.g., Harvard, Princeton, Penn) to oranges (e.g., Williams, Amherst, Trinity). Yet, by definition, rankings impose a structure on the roughly 500 institutions of higher learning based in the United States.
Instead of arbitrarily and capriciously creating a ranking out of whole cloth, high school juniors and their parents would be better served if schools were grouped, or clustered, by some attributes. The attributes could be varied and vast, including:
- Acceptance rate
- Median ACT/SAT scores
- Mean and median amount of student loan debt
- Number of states and countries represented in prior entering classes
- Mean and median time to degree completion
- Number of graduate degrees offered
- Presence of professional graduate schools
The above list of attributes represents a starting point, not an ending point. With the attributes coded and measured, a savvy analyst could use factor analysis, which SPSS offers, to reduce the attributes, and then cluster analysis, which tools such as Marketing Engineering offer, to create the clusters. Finally, the analyst would label, or name, each cluster with something clever such as "Leafy Rural Elite" or "Everybody Welcome."
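For the curious, the grouping step can be sketched in a few lines of Python. This uses k-means, a common clustering algorithm, as a plain-code stand-in for the statistical packages named above; every school name and attribute value here is invented, and the attributes are pre-scaled to comparable ranges so no one dimension dominates:

```python
import math
import random

# Hypothetical schools described by (acceptance rate, median SAT / 1600,
# mean student debt in units of $10k). All numbers are made up, and each
# attribute is already scaled to a comparable range.
SCHOOLS = {
    "School A": (0.05, 0.95, 2.5),
    "School B": (0.06, 0.94, 2.4),
    "School C": (0.70, 0.62, 3.5),
    "School D": (0.75, 0.60, 3.6),
}

def distance(p, q):
    """Euclidean distance between two attribute vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: distance(p, centroids[c]))
            clusters[nearest].append(p)
        # Recompute centroids; keep the old one if a cluster emptied.
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return clusters

names = list(SCHOOLS)
points = list(SCHOOLS.values())
for i, cluster in enumerate(kmeans(points, k=2)):
    members = [n for n in names if SCHOOLS[n] in cluster]
    print(f"Cluster {i}: {members}")
```

On this toy data the two highly selective, low-debt schools land in one cluster and the two open-door, higher-debt schools in the other; the analyst's remaining job is the labeling.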
The cluster approach also minimizes schools' attempts to game the system. A school could continue to shade its reported numbers, but with so many attributes, those efforts would have minimal effect.
Finally, this analytical tool would reveal groupings that otherwise would not be apparent. For example, based on these attributes, Miami University could be considered part of the same group as the University of Texas and the University of Michigan. Initially, we might be hesitant to create such a grouping; by analyzing the data, we would have support for it.
Looking at each cluster, high school juniors could rely on more meaningful information. It makes little difference that Harvard edges Yale on some criterion. A cluster containing Harvard and Yale, alongside a separate and distinct cluster containing Eastern Connecticut State and Midwestern State, would give a better indication of where students and their families should begin collecting information, scheduling campus tours, and focusing their efforts.