My take on the U.S. News rankings

Discussion in 'General Distance Learning Discussions' started by mamorse, Aug 12, 2001.

  1. mamorse

    mamorse New Member

    We’ve seen in other threads the importance (or lack thereof) that various posters attribute to the educational rankings in U.S. News and World Report. Below are the criteria used for the ranking of undergraduate institutions, along with my comments.

    From http://www.usnews.com/usnews/edu/college/rankings/collmeth.htm

    USNEWS: “Our overall ranking system rests on two pillars. First, it relies on quantitative measures that education experts have proposed as reliable indicators of academic quality. Second, the rankings are based on our nonpartisan views of what matters in education. <SNIP>
    The indicators we use to capture academic quality fall into seven categories: academic reputation, retention of students, faculty resources, student selectivity, financial resources, alumni giving, and graduation rate performance.”

    USNEWS: “Academic reputation. The U.S. News ranking formula gives greatest weight (25 percent) to reputation because a degree from a distinguished college so clearly helps graduates get good jobs or gain admission to top graduate programs. The reputation survey also allows top academics to account for intangibles, such as faculty dedication to teaching. A school's reputation is determined by surveying the presidents, provosts, and deans of admission at institutions in a single category. Each individual was asked to rate peer schools' academic programs on a scale from 1 (marginal) to 5 (distinguished). Those individuals who didn't know enough about a school to evaluate it fairly were asked to mark "don't know." Market Facts Inc., an opinion-research firm based near Chicago, collected the reputational data; 67 percent of the 3,969 people sent questionnaires responded.”

    Mark: Academic reputation is, of course, a legitimate criterion to use. The fact that only administrators are queried is somewhat puzzling; regular faculty should be included as well, as they are often more directly aware of the strengths and weaknesses of their peers. The best approach would be to include faculty who actively participate in accreditation site visits; these individuals have considerable first-hand knowledge of the quality of the institutions they evaluate. Note that newer institutions and those that employ nontraditional learning delivery modes will invariably earn lower scores in this category (although attitudes toward DL are changing).

    [Note that for the remaining criteria, U.S. News does not indicate the weight that has been assigned to each criterion.]
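
    To make the arithmetic concrete, here is a minimal Python sketch of how such a weighted composite might be computed. Apart from the 25 percent reputation weight quoted above, every weight below is a placeholder of my own invention, since U.S. News does not publish them:

    ```python
    # Sketch of a U.S. News-style composite. Only the 25% reputation
    # weight is confirmed; all other weights are invented placeholders.
    CATEGORY_WEIGHTS = {
        "academic_reputation":         0.25,  # stated by U.S. News
        "retention":                   0.20,  # placeholder
        "faculty_resources":           0.20,  # placeholder
        "student_selectivity":         0.15,  # placeholder
        "financial_resources":         0.10,  # placeholder
        "alumni_giving":               0.05,  # placeholder
        "graduation_rate_performance": 0.05,  # placeholder
    }

    def composite_score(subscores):
        """Weighted sum of 0-100 category subscores."""
        return sum(w * subscores[cat] for cat, w in CATEGORY_WEIGHTS.items())

    # A school strong on reputation but weak on selectivity (invented figures):
    example = {
        "academic_reputation": 90, "retention": 70, "faculty_resources": 65,
        "student_selectivity": 40, "financial_resources": 60,
        "alumni_giving": 30, "graduation_rate_performance": 75,
    }
    print(composite_score(example))  # ~66.8
    ```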

    USNEWS: “Retention. The higher the proportion of freshmen who return to campus the following year and eventually graduate, the better a school may be at offering the classes and services students need to succeed. This measure has two components: six-year graduation rate (80 percent of the retention score) and its freshman retention rate (20 percent of the score). The graduation rate indicates the average proportion of a graduating class who earn a degree in six years or less; we considered freshman classes that started between 1990 and 1993. Freshman retention indicates the average proportion of freshmen entering between 1995 and 1998 who returned the following fall.”

    Mark: I have some problems with the retention/graduation rate criterion. First, while students will naturally find institutions with high rates desirable, high rates are no guarantee of excellence in instruction or academic rigor. Indeed, Ivy League-caliber faculty routinely complain about the pressure they experience to inflate undergraduate grades to assure high retention rates. Second, this criterion greatly penalizes an institution with an open admissions policy that is committed to academic rigor. Such an institution will invariably have lower retention and graduation rates than a more selective institution of equal academic rigor. Thus, institutions with open admissions policies are penalized twice: once for nonselectivity (see student selectivity below) and again for maintaining academic standards. I take it as an article of faith that degree mills have astonishingly high graduation rates! (To be fair to U.S. News, it must be pointed out that U.S. News only ranks accredited institutions.)
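
    For what it’s worth, this component can be stated exactly, since U.S. News gives the 80/20 split. A quick sketch, with invented input figures, that also illustrates the open-admissions penalty:

    ```python
    def retention_score(six_year_grad_rate, freshman_retention_rate):
        # The 80/20 split is U.S. News's own; inputs are percentages.
        return 0.80 * six_year_grad_rate + 0.20 * freshman_retention_rate

    # A rigorous open-admissions school vs. a selective one (invented figures):
    print(retention_score(45, 70))  # ~50.0
    print(retention_score(90, 96))  # ~91.2
    ```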

    USNEWS: “Faculty resources. Research shows that the more satisfied students are with their contact with professors, the more they will learn and the more likely it is they will graduate. We use six factors from the 1999-2000 academic year to assess a school's commitment to instruction. Class size has two components: One represents the proportion of classes with fewer than 20 students (30 percent of the faculty resources score); the second represents the proportion with 50 or more students (10 percent of the score). Faculty salary (35 percent) is the average faculty pay, plus benefits, during the 1998-99 and 1999-2000 academic years, adjusted for regional differences in the cost of living (using indexes from Runzheimer International). We also weigh the proportion of professors with the highest degree in their fields (15 percent of the score), the student-faculty ratio (5 percent), and the proportion of the faculty who are full time (5 percent).”

    Mark: The class size components presumably award brownie points for the number of classes under 20 and penalties for the number of classes over 50. If so, then the conclusion is that under all circumstances, small is good. In my student days, I always preferred a large class with an excellent instructor to a small class with a fair-to-lousy instructor. However, I’ll agree that, all other things being equal, smaller class size should translate into higher quality contact with the faculty member. Interestingly, faculty salary is the most highly weighted factor in a category that purports to measure satisfaction with student-faculty interaction. I can just imagine the comments: “I appreciate my contacts with my Ivy League professors because I know they make oodles more than those losers at Podunk State U.” To be sure, higher faculty salary rank is an indication that the institution is committed to attracting and retaining the best faculty it can. But faculty hires are inevitably made on the basis of research prowess, which does not necessarily equate to excellence in instruction or high accessibility to an institution’s undergraduate population. The proportion of professors with the highest degree in their fields again represents institutional commitment, but not necessarily student satisfaction with their instruction. The student-faculty ratio should have been largely addressed in the class size component. There are, however, many institutions with large numbers of nonteaching faculty. It is unclear what adjustments, if any, U.S. News makes in these cases.
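
    For reference, here is how I read this component. The six weights (30/10/35/15/5/5) are U.S. News’s own, but how each raw figure is scaled to a common range is unpublished, so that part is my assumption:

    ```python
    def faculty_resources_score(pct_small, pct_large, salary_index,
                                pct_terminal, sf_ratio_index, pct_full_time):
        # Inputs assumed scaled to 0-100; scoring large classes
        # inversely (more big classes = lower score) is my guess.
        return (0.30 * pct_small              # classes under 20 students
                + 0.10 * (100 - pct_large)    # classes of 50 or more
                + 0.35 * salary_index         # COL-adjusted faculty pay
                + 0.15 * pct_terminal         # faculty with terminal degrees
                + 0.05 * sf_ratio_index       # student-faculty ratio
                + 0.05 * pct_full_time)       # full-time faculty
    ```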

    USNEWS: “Student selectivity. A school's academic atmosphere is determined in part by the abilities and ambitions of the student body. We therefore factor in test scores of enrollees on the SAT or ACT tests (40 percent of this ranking factor); the proportion of enrolled freshmen who graduated in the top 10 percent of their high school classes for the national institutions and the top 25 percent for the regional schools (35 percent of the score); the acceptance rate, or the ratio of students admitted to applicants (15 percent); and the yield, or the ratio of students who enroll to those admitted (10 percent). The data are for the fall 1999 entering class.”

    Mark: Here U.S. News should take into account the mission of the institution. Many of the U.S. land grant universities have state-imposed constraints on the type of students they may admit, normally favoring in-state applicants. Some universities literally must accept virtually any high school graduate of their own states who applies prior to the deadline, regardless of that individual’s qualifications. Naturally, such institutions will have lower average high school GPAs and lower SAT and ACT scores than private institutions that accept only the academic elite. Likewise, the acceptance rate can be misleading. The assumption here is that a low acceptance rate equals high quality, yet I’m aware of many mediocre institutions with very open admissions policies that have fairly low acceptance rates owing to restricted class size. So too is the enrolled/admitted ratio misleading. The rationale is that a high quality institution will have a large proportion of admitted students who enroll. But several of the mid-first tier universities have lower ratios than some third and fourth tier institutions merely because they are forced to compete with more prestigious institutions. Indeed, some universities with open admissions will have high enrolled/admitted ratios because their applicants have far fewer options.
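
    Again for concreteness, the selectivity component as I read it. The weights (40/35/15/10) are quoted above, while scoring the acceptance rate inversely (lower = “better”) is my interpretation:

    ```python
    def selectivity_score(test_index, pct_top_decile, acceptance_rate, yield_rate):
        # Weights are U.S. News's; the inversion of acceptance rate is assumed.
        return (0.40 * test_index
                + 0.35 * pct_top_decile
                + 0.15 * (100 - acceptance_rate)
                + 0.10 * yield_rate)

    # A state school obliged to admit broadly vs. a selective private one
    # (all figures invented):
    print(selectivity_score(60, 30, 85, 55))  # ~42.25
    print(selectivity_score(95, 90, 15, 60))  # ~88.25
    ```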

    USNEWS: “Financial resources. Generous per-student spending indicates that a college is able to offer a wide variety of programs and services. U.S. News measures the average spending per student on instruction, research, student services, and related educational expenditures during the 1998 and 1999 fiscal years.”

    Mark: It is not clear if different weights are assigned to spending on instruction, research, etc., or if a raw total is used. Does more spending necessarily equal greater academic quality? If we extend this logic to grades K-12, the Washington, D.C. schools should be among the finest in the nation. Alas, they are not even close. One factor that should be taken into account is the amount of remediation that some state-supported institutions must perform; such activities can significantly increase total spending per student. Also, most of the research spending will have little direct impact on undergraduate education. A major component of instructional spending is faculty salaries. Does U.S. News apply the Runzheimer cost-of-living indices here as well? Ultimately, of course, I would feel more comfortable with an institution with greater financial resources.

    USNEWS: “Graduation rate performance. This indicator of "added value" was developed to capture the effect of the college's programs and policies on the graduation rate of students after controlling for spending and student aptitude. We measure the difference between a school's six-year graduation rate for the class that entered in 1993 and the predicted rate for the class. The predicted rate takes into account the standardized test scores of these students as incoming freshmen and the school's expenditures on them. If the actual graduation rate is higher than the predicted rate, the college is enhancing achievement.”

    Mark: Without seeing the precise formulae used for this criterion, it is impossible for me to comment.
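
    That said, the description reads like a regression-style prediction from test scores and spending. A purely hypothetical sketch, with invented coefficients, of what such an “added value” calculation could look like:

    ```python
    def predicted_grad_rate(avg_sat, spending_per_student):
        # Invented linear model; U.S. News does not publish its formula.
        return min(100.0, -50.0 + 0.10 * avg_sat + 0.001 * spending_per_student)

    def added_value(actual_grad_rate, avg_sat, spending_per_student):
        # Positive: the school graduates more students than its inputs predict.
        return actual_grad_rate - predicted_grad_rate(avg_sat, spending_per_student)

    print(added_value(82, 1100, 15000))  # 82 - 75.0 = 7.0
    ```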

    USNEWS: “Alumni giving rate. The percentage of alumni who gave to their school during the 1998 and 1999 academic years is an indirect measure of alumni satisfaction.”

    Mark: Quite indirect, indeed. Alumni giving can be influenced by a number of factors. One might expect alumni donations at institutions that are primarily engineering schools to be considerably higher than those at traditional liberal arts institutions. Also, state institutions often experience huge multi-year increases in alumni giving after a national championship in football or basketball. Were those individuals more satisfied with their educational experience before or after the championship? Perhaps a five-year evaluation would be more realistic. U.S. News calculates both an alumni giving rate and an alumni giving rank. I assume that the rate is on a per capita basis; I’m not sure whether the rank is based on total donations, but if it is, then larger institutions are clearly favored.


    Two additional major criticisms:

    (1) U.S. News provides no direct measure of actual teaching quality. Unfortunately, our decentralized U.S. system of higher learning has no direct counterpart to the U.K.’s Quality Assurance Agency for Higher Education. This agency assesses the following criteria in its teaching assessments: (1) curriculum design, content and organization; (2) teaching, learning and assessment; (3) student progression and achievement; (4) student support and guidance; (5) learning resources; and (6) quality management and enhancement. Somehow, U.S. News must find a way to include these factors. Note that American universities must keep records of such measures to maintain accreditation. It should also be noted that a number of lower-tier institutions tend to fare quite well in these measures.

    (2) U.S. News does not include undergraduate outcomes measures, such as the proportion of students employed after graduation, average starting salary, the proportion of students who go on to top graduate schools, the proportion who become leaders in their fields, etc. All institutions track such outcomes closely for their graduate programs but usually not for their undergraduate programs, which probably explains the omission. Still, any ranking of academic institutions is incomplete without an outcomes assessment.

    To conclude, I’ll accept the U.S. News ranking system as a good-faith attempt, but one with a number of flaws. While I agree that the U.S. News top-tier institutions are excellent institutions, I’m not ready to dismiss the other institutions as little more than degree mills solely on the basis of the U.S. News criteria.

    Mark
     
  2. John Bear

    John Bear Senior Member

    Did they address the matter of schools that refuse to cooperate with them, and how those schools are ranked? I wonder how many of those there are. I know that Reed College, where I spent my first three years, is one. Do I remember reading that Stanford is another?

    As Washington Monthly reported, "When Reed College refused on principle to submit data in 1995, U.S. News summarily dropped it to the lowest tier; despite having the 18th best academic reputation of all national liberal arts colleges in U.S. News' reputational survey, Reed was listed right next to Richard Stockton College of New Jersey which had the 153rd place."
    (http://www.washingtonmonthly.com/features/2000/0009.thompson.html)
     
  3. Lewchuk

    Lewchuk member

    I don't disagree but we should remember two things:
    a) In the vast majority of cases the rankings will be directionally correct... very few top schools actually belong in the bottom tier, and vice versa.
    b) These rankings do much to define perception and often perception is reality.

     
  4. BillDayson

    BillDayson New Member

    Excellent post, Mark. It raises so many issues that it will take several posts to do justice to it. Some general comments:

    One problem is that not all schools are equally strong in all specialties of all fields at all levels. That's particularly true on the graduate level.
    So I wonder how useful an overall reputation score is when choosing between programs in a particular field. A high-prestige school might not even offer a program in your specialty, so it is foolish to announce flatly that it's "better" than a less prestigious school with a strong program in your area.

    What's more, the USNews reputation scores are for undergraduate programs. Graduate-only institutions are not even included in the rankings. I'm not sure that all strong undergraduate schools have strong graduate programs or vice-versa.

    Good points. Yeah, I think that reputation, particularly overall institutional reputation among administrators rather than subject matter specialists, is a trailing indicator of quality. So it will tend to favor established programs over new ones.

    But more important to us here at Degreeinfo is the general point that these USNews rankings are not entirely appropriate for judging the quality of adult distance education.

    With regard to prestige, I think that there is a bias among educators against distance education in particular, and against vocational education in general. Schools with specialties in classical art history and ancient Assyrian will usually be perceived as "better" than schools specializing in business or agriculture.

    That's only compounded when such a practical school specializes in educating a non-traditional student population using non-traditional delivery media.

    You said it better than I could. The way that US News sets up their system, a good school with relatively open admissions would probably fall into the third or fourth "tiers" for those reasons alone.

    The problem is only made worse when you are dealing with a population of already-employed adults who are attending part-time. The drop-out rates are probably pretty high.

    So apparently if you give a large raise to the faculty at Third Tier U., you can knock it up a tier without changing anything else. Same faculty, same courses, same students. The faculty labor unions must love that one.

    Another excellent point. Compare that to a school that employs many part-time faculty who also work in industry. They may be better instructors in applied fields than "ivory-tower" researchers would be, but hiring them can kill a school's USNews score.

    How relevant are high school grades and SAT scores when the applicant is a 35-year-old working full time in his or her field? USNews seems not to have even considered the existence of university students who are not adolescents fresh out of high school.

    If a program has a high percentage of part-time students, fewer will graduate in the prescribed six years, hence the school will be penalized.

    And interestingly this is exactly how most people interpret the rankings: Higher ranked schools offer better teaching.

    I agree 100%. The USNews rankings are useful and valuable.

    But they pretty explicitly embody a vision of what a "good" education should be, and judge universities by how closely they approximate that model.

    Unfortunately for Degreeinfo, the model that USNews has chosen is full-time on-campus education of highly selected teenagers, preferably in research institutions.

    USNews seems to have paid no attention at all to continuing education of adults, let alone to distance education. They seem to have a low opinion of expanding educational opportunity in general.

    And it only compounds the problem to try to use these general institution-wide undergraduate "tier rankings" to judge teaching quality in graduate-level distance education programs in particular fields.
     
  5. Lewchuk

    Lewchuk member

    1) We need to realize that such rankings, as imperfect as they are, often reflect the perception in the marketplace. Although such discussions should be had, they are not much use in an immediate pragmatic sense. It is difficult to explain to people that a program is actually excellent and that it is the perception that is flawed. For example, there are situations where a professor taught subject "a" at an Ivy and then taught subject "a" at a "lesser" school via DL. There is no rational reason to believe that the course is now inferior to the Ivy version, but you can be assured that it will not be seen as such. That is life... we should learn to deal with it.
    2) Little is gained by separating DL programs from on-campus programs. When I compete in the marketplace I may well be competing with those with a traditional education. If you can pursue a DL degree from either a first-tier or a fourth-tier school, the fourth-tier school may well offer the higher-quality DL experience... but choosing it would still be a mistake.

     
  6. Lewchuk

    Lewchuk member

    A valid question is why this is such a passionate issue with some. The answer is quite obvious. Although DL courses are available from schools throughout the tiers, complete DL degrees generally come from fourth-tier schools. A typical regressive response is to say how meaningless the rankings are and that "we are just as good as they are". A more proactive response would be suggestions as to how DL degrees can be offered by the finest institutions in the country (incidentally, this is being done... see Harvard, Duke, Stanford... but on a very limited scale and with residency).


     
  7. Bruce

    Bruce Moderator

    Mark, an excellent and well thought-out post. My bottom line with the US News rankings is that they're useful as a general guide, but shouldn't be taken as gospel. My sister-in-law went to Harvard & Harvard Business School, and she told me that her freshman year consisted of huge lecture classes where the professor barely knew your face, never mind your name.

    Of course the class sizes went down and the quality of instruction went up as she went through the program, but was the end result really worth the student loans she's still paying off? In reputation, certainly; in educational experience, I'm less sure.

    In a perfect world where cost didn't matter, I'd prefer my kids go to a quality small liberal arts college like Amherst for undergrad, and a research university for graduate work. Wait... let me amend that: in a perfect world, I would have done that also.

    Bruce
     
  8. mamorse

    mamorse New Member

    John, all that is stated at the U.S. News website http://www.usnews.com/usnews/edu/college/rankings/collmeth.htm is the following:

    “Most of the data come from the colleges--although U.S. News takes pains to ensure their accuracy. This year, 94 percent of the schools returned surveys. We assessed the data and obtained missing data from sources such as Wintergreen/Orchard House, the American Association of University Professors, and the National Collegiate Athletic Association.”

    The U.S. News folks are certainly vindictive!

    Mark
     
  9. mamorse

    mamorse New Member

    In general, I would agree.

    I would agree. However, to a large extent, the rankings are primarily a measure of the prestige of the institution. If you check the U.S. News rankings in certain graduate school disciplines (e.g., nursing), the only component of the assigned score is the academic reputation. While I agree that academic reputation is important, I much prefer concrete measures of academic quality.

    Mark
     
  10. mamorse

    mamorse New Member

    Hello, Bill - you’ve posted a very comprehensive response here!

    This is quite true. When I went to graduate school 20 years ago, I attended what is now considered a second-tier institution; I would have loved to have attended an Ivy, but no Ivy had a strong program in my field at the time. (Many do NOW, however.)

    I agree that there are a lot of dinosaurs out there that look down on non-traditional education. However, at some point they’ll either adapt or become extinct. I just hope that we live long enough to see it!

    This is true. Comparing a program made up primarily of recent high school grads with one made up mostly of adult part-time learners is essentially comparing apples to oranges. I would prefer to see U.S. News rank institutions separately by traditional and nontraditional teaching modes. They apparently do this only for a few of their graduate listings (e.g., part-time MBA programs), and even there, many of the programs deliver content in a traditional, albeit part-time, format.

    I would suggest that the faculty resources criterion be replaced by an instructional quality criterion.

    You’re quite correct - the U.S. News rankings are purely designed for college-bound high school seniors. At some point, U.S. News will wake up to the fact that an adult educational revolution has been underway for some time. Perhaps the U.S. News rankings staff could stand a bit of continuing education...

    Mark
     
  11. mamorse

    mamorse New Member

    Bruce, that's precisely the way I feel as well.

    But something tells me she'll eventually come out ahead financially!

    Amen!

    Mark
     
  12. BillDayson

    BillDayson New Member

    I hear anecdotally that the number is rising. But it's probably mostly talk.

    Though Stanford may criticize the USNews methodology, the "top tier" schools love the high rankings they get. They are unlikely to endanger that by failing to cooperate.

    The schools that try to extend educational opportunity and get shit on for their efforts are more likely to stop cooperating. But if that failure to cooperate just mires them deeper in the rankings, there is little upside in doing it.

    What might be happening more often is manipulation of the data that is reported. A number of schools admit up to 20% of their freshmen under "special admissions categories" that include things like affirmative action and admission of the children of prominent alumni and celebrities. But a number of universities (including, I believe, the University of California) simply don't average the grades and SATs of these students in with the rest. What you get are impressively high average SATs, at least among the students who are actually counted.
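
    A toy example of the arithmetic (all numbers invented):

    ```python
    # How excluding "special admissions" students inflates a reported average.
    regular = [1400] * 80   # 80 regular admits
    special = [1050] * 20   # 20 special-category admits

    true_average     = sum(regular + special) / 100   # 1330.0
    reported_average = sum(regular) / len(regular)    # 1400.0
    print(true_average, reported_average)
    ```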

    I think that this is particularly relevant to distance education because by its very nature, too much adult education is going to kill a university's tier ranking. We see many prestige universities spinning their adult and DL offerings off into "extension divisions" whose figures are not reported. The University of California does that. Harvard's external degrees are offered by extension, as is Stanford's on-campus master's in liberal arts for adults.

    The University of Maryland seems to have gone one better, spinning their adult and DL offerings off into a separate "campus" (U. Md. University College) that nevertheless is physically located at the main College Park U. Md. facility. But its statistics are separate and don't pollute the main branch.
     
