College Rankings Subvert the Purpose of Higher Education

Discussion in 'Off-Topic Discussions' started by Tracy Gies, Oct 15, 2003.

  1. Tracy Gies

    Tracy Gies New Member

    Many students rely on U.S. News and World Report's annual rankings, at least in part, to determine which schools are the best. In an article for the November issue of The Atlantic Monthly, Nicholas Confessore explains why that may not be such a good idea. He writes that some professionals involved with higher education are getting worried.

    For education analysts, teachers, and a handful of outspoken university presidents, however, the growing influence of college rankings has for years been a source of deep concern. They believe that rankings not only have distorted the admissions process but are symptomatic of a broader corruption of American universities: administrators, they say, have reshaped their institutions to pursue goals that may not aid—in fact, may actively subvert—the purpose of higher education.

    Their concern lies in the fact that rankings, such as those produced by U.S. News and World Report, are not based on data that are good indicators of excellence. They do not measure outputs; rather, they measure inputs. Each of the criteria U.S. News uses to arrive at its rankings relates to a school's wealth of resources rather than to how well it teaches students.

    Furthermore, according to Confessore, the metrics that U.S. News does use aren't necessarily good even for measuring inputs. He cautions us not to assume that a higher-ranked school and its students are more advantaged than lower-ranked schools and their students, because the specific measures, and the assumptions behind them, are flawed.

    Take, for example, faculty resources, which accounts for 20 percent of a school's overall ranking in the U.S. News formula. Schools that hire more full-time professors rank higher than those that rely more heavily on lower-paid, part-time professors. Confessore explains why this is the wrong measurement to take:

    The thinking here seems plausible enough: the higher-paid professor is more likely than the lower-paid one to have an impressive curriculum vitae and be a good teacher, and a full-time professor has more time to teach and prepare for classes than a harried adjunct.

    But in practice the things that make a professor well known in his field—published articles, groundbreaking research—must compete for his time and attention with teaching obligations. Few schools reward their faculty members for being good classroom teachers; it is universal, however, that a scholar's prospects for tenure and other advancement suffer if he or she doesn't publish frequently enough. Consequently, notes Alexander Astin, a widely respected scholar of learning and the longtime head of UCLA's Higher Education Research Institute, there's actually 'an inverse relationship between how much contact students have with faculty and how much professors publish.' In fact, famous professors may not teach much at all, leaving the work to graduate students. Not surprisingly, Astin's research shows that students at the larger and more elite institutions—that is, the institutions better able to lure high-priced academic talent—tend to have 'less satisfaction with relationships with faculty and less satisfaction with teaching.' In other words, a university may well be rich in faculty resources and poor in actual teaching.

    Other measurements that go into the equation, such as the level of student talent and the size of a school's coffers, are also flawed for various reasons, which the article points out.

    Confessore includes a discussion of a different system of "ranking" schools, one that may be superior to U.S. News-type rankings. It is called the National Survey of Student Engagement (NSSE). The NSSE seeks to gather, from students, data that may relate more directly to how well universities teach students. It uses student surveys to measure such things as how well teaching is applied to students' lives, and how "engaged" students are at their schools.

    Personally, I agree with Confessore's conclusions about the inadequacy of U.S. News' rankings, for all the flaws he points out. The one big flaw with the NSSE, though, from a student's perspective, is that the data are usually not available to the public. NSSE results are reported directly to the schools, which generally don't release them.* The best we can hope for is that schools participating in the NSSE actually use the results to help improve their performance. Also, I would like to point out that the NSSE is probably still not a good way to rank DL programs or schools. This is because the NSSE regards things such as working on campus and taking part in extracurricular activities as engagement enhancers, while such things as working off campus and taking care of dependents do not enhance engagement. Most DLers, therefore, would probably not be considered highly engaged by NSSE standards.

    _________
    *One exception that Confessore points out is California State University--Monterey Bay, which posts its NSSE results on its website.
     
  2. telefax

    telefax Member

    A very interesting article. Thanks, Tracy.
     
