"Tier 1" vs. "Tier 4" vs. ?

Discussion in 'General Distance Learning Discussions' started by Anthony Pina, May 2, 2004.

  1. Anthony Pina

    Anthony Pina Active Member

    Many Degreeinfo threads speak of “Tier 1”, “2”, “3” or “4” schools, referring to the rankings found in the annual edition of US News & World Report "America's Best Colleges". While the US News rankings are one of many useful tools, there are some issues that should be considered when using these rankings as an exclusive measure of quality:

    1. The National Center for Education Statistics lists 4,130 higher education institutions in the US. If we subtract the 1,710 community colleges, we end up with 2,420 4-year institutions. Only about 1,400 of these actually show up in the US News edition. Thomas Edison and Excelsior are included in the directory of colleges (in the back of the magazine) but are not found anywhere in the rankings. About 1,000 institutions, including the likes of Walden, Jones International, Capella, Western Governors and Charter Oak, are not found anywhere in the US News edition--not even in its directory of colleges.

    2. The biggest chunk of the score used to rank the institutions (25%) is based on a subjective measure, namely the opinions of college presidents, provosts and deans of admissions about a given college's or university's academic reputation. (A toy sketch of how such a weighted composite behaves appears after this list.)

    3. Colleges are split into four distinct groups: national-doctorate, masters, liberal arts-bachelors and comprehensive-bachelors. Schools are ranked only within their assigned group (not between groups). So it would be useless to say that a “2nd Tier” masters school is better than a “4th Tier” national-doctorate school--the US News ranking system does not provide us with that kind of information.

    4. The national-doctorate and liberal arts-bachelor categories use a differently weighted scale than the masters and comprehensive-bachelor categories, making comparisons across categories even less meaningful.

    5. When the liberal arts college category was changed from regional to national a few years ago, Albertson College dropped from being a “Tier 1” school to a “Tier 4” school in just one year. This kind of rapid drop hurts the reliability of the US News scale.

    6. The US News survey measures some very specific outcomes at the institutional level (e.g. admission rates, retention, SAT scores of enrollees, graduation rates, faculty salaries, alumni giving rates). These measures, while useful, provide no data regarding the quality of the coursework, the scholarly productivity of faculty and students, the reputation and quality of individual disciplines within the university, or the success of graduates.
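
    To make point 2 concrete, here is a minimal sketch of how a weighted composite score of this kind behaves. Only the 25% reputation weight comes from the magazine; the other weights, the category names and all input values below are hypothetical, not the actual US News formula.

        # Toy composite score in the spirit of the US News methodology.
        # Only the 25% reputation weight is taken from the magazine; all
        # other weights, categories and values here are hypothetical.
        WEIGHTS = {
            "peer_reputation": 0.25,    # subjective survey of presidents/deans
            "retention": 0.20,          # hypothetical
            "selectivity": 0.15,        # hypothetical
            "faculty_resources": 0.20,  # hypothetical
            "graduation_rate": 0.20,    # hypothetical
        }

        def composite(scores):
            """Weighted sum of 0-100 indicator scores."""
            return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

        school = {"peer_reputation": 80, "retention": 85, "selectivity": 70,
                  "faculty_resources": 75, "graduation_rate": 78}
        print(composite(school))  # 78.1

    Notice that a one-point swing in the subjective reputation survey moves the composite by 0.25 points--a bigger lever than any other single input.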

    Nobody cares more about where people earn their degrees than we higher ed folk. I have sat on many hiring boards for college faculty and administrators. Although having degrees from Harvard, UCLA, Columbia and other “top tier” schools tends to provide extra consideration for candidates, I have NEVER heard the US News “tier” rankings used as a measure to judge between two similar candidates (e.g. “We’d better hire the one with the PhD from U. of Nevada-Reno, because that’s one tier higher than the PhD from U. of Nevada-Las Vegas”).

    Tony Piña
    Faculty, CSU San Bernardino
    (CSUSB usually gets killed in US News rankings because someone at our University either forgets or refuses to fill out the survey each year)
     
  2. Guest

    Guest Guest

    If you graduated from a Tier 1 school, then you like to use those rankings. If you graduated from a Tier 4 school, then you probably feel they stink. And if you want to spend time attacking NSU, then you use the tiers.

    You are right: there are many variables, and the rankings have put pressure on the schools, some of which get quite angry at US News (or at Maclean's in Canada). Because the rankings get such big press on national TV news, NPR, etc., some universities have felt pressure to do things to adjust their scores. They figured out what to play with in order to elevate their rankings. If I recall, someone from the U of Sask. said as much (i.e., that they were going to tweak certain of the criteria used for measurement to result in higher scores). He seemed to think this was goofy but necessary to elevate the ranking.

    North
     
  3. -kevin-

    -kevin- Resident Redneck

    US News states that only 293 of 377 AACSB schools responded for the 2005 version. Kind of skews the results. Until I found this board I never realized the politics of higher education.
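
    For scale, those two figures imply a response rate just under 78%. A quick check, using only the numbers quoted above:

        # Response rate implied by the figures quoted above (2005 edition).
        responded, surveyed = 293, 377
        print(f"{responded / surveyed:.1%}")  # 77.7%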
     
  4. Anthony Pina

    Anthony Pina Active Member

     
  5. Anthony Pina

    Anthony Pina Active Member

    North gives some good examples of the politics of rankings. Not only are many AACSB schools missing, but nearly 40% of all 4-year colleges and universities are absent from the rankings. The other issue with survey research is that you must assume that the institutions are honest in the data that they provide.

    With all its limitations, US News is making an honest and herculean effort to provide as good a ranking system as it can. A 60% return rate is decent for most survey research. One must exercise caution, however, in using this data in ways that the data cannot support, such as saying that a given "Tier 4" doctoral university is closer in ranking to an unaccredited school than to a "Tier 1" doctoral university. There is nothing in the US News ranking system that would provide data to support such a statement.

    Tony Pina
    CSU San Bernardino
     
  6. GUNSMOKE

    GUNSMOKE New Member

    BAD NEWS: IT AIN'T JUST HIGHER ED!

    I have a lovely, highly intelligent, hard-working niece who left a very good position with a large energy company to teach elementary school so she "...wouldn't have to deal with politics!"

    She was the one who got the education, and she is now back on the much tamer, less competitive corporate ground.
     
  7. Tom57

    Tom57 Member

    Yes. The schools are put in a tough place. Most admit the rankings are extremely limited in usefulness, or maybe accuracy, yet so many people take them seriously. The schools have to put time and energy into something they don't believe in.

    As for honesty in surveys, you can bet that if you can imagine a little fudging of numbers, it's being done somewhere.
     
  8. Andy Borchers

    Andy Borchers New Member

    Tony - great thread on an interesting topic.

    My take is that one shouldn't make much of a point of small differences in ranking. Big differences - like comparing NSU with Stanford - probably are meaningful.

    It strikes me that different students are drawn to different schools for a variety of reasons. 4th tier schools may make good sense for some students.

    I had an interesting example come up with a friend of mine that showed the usefulness of the data. His daughter was torn between the University of Michigan and USC and was leaning toward USC based on the allure of Southern California. I had online access to the US News figures. We carefully studied the data, looking at every possible quantitative measure, and concluded that there wasn't much difference between the two schools. He strongly encouraged his daughter to go to U of M. Why? Michigan's tuition was dramatically lower, as were travel and living costs. In the end his daughter is happy with Michigan--and his pocketbook is a lot thicker.

    Regards - Andy

     
  9. Tom57

    Tom57 Member

    Except that not many of us need help distinguishing NSU from Stanford. The rankings are accurate precisely in areas where it doesn't matter too much.

    Maybe I do need the rankings, though. In a debate between Michigan and USC, I would take Michigan every time. At least as far as research goes, Michigan is top tier. Perhaps that reflects the private-school bias in the survey? Which brings up an interesting question: does Michigan's whopping lead in research make up for all of the niceties that private schools offer? So much rests on how US News decides to weight each category--and do their subjective preferences match yours?
     
  10. Jack Tracey

    Jack Tracey New Member

    Tony - I'm happy to concede that you know more about this subject than I do, but it seems clear that the US News survey is seriously flawed. I might even go so far as to say that the only conclusions one might reliably reach from it are conclusions everyone had reached previously (like the idea that Harvard is generally better respected than, say, Framingham State College). Perhaps I'm being overly cynical, but I feel the need to remind everyone that US News is in the business of making money. They run a distant third in the weekly news magazine race, and they use these "special editions" to prop up their sagging weekly sales. This is not a serious piece of research; it's something moms buy at the supermarket the day after their kid takes the SATs. I know I'm being sour, but it's hard for me to understand why this ranking is taken seriously.
    Jack
     
  11. BillDayson

    BillDayson New Member

    I believe that the US News classifications are just the Carnegie classifications repackaged in more misleading terms.

    If you combine the 'doctoral', 'masters' and 'bachelors' Carnegie classifications, you get 1,478 institutions. The largest groups that US News excludes are the 'associates' colleges and the 'specialized' institutions. Many of our DL favorites fall into the latter group.

    It's important to remember that the US News rankings are for undergraduate programs. That's why you find schools like Pepperdine cracking the top tier doctoral ranks even though their graduate programs aren't very extensive, while major research universities like Arizona State or the University of Hawaii only make the third tier. I mean, there's no way that Pepperdine even comes close to Hawaii in graduate-level astronomy, geophysics or Asian philosophy, but Pepperdine probably does have stronger undergraduate programs.

    It's also useful to remember that college presidents' and deans of admissions' opinions of entire institutions are a pretty blunt instrument if we seek to judge departmental reputations in specific subjects.

    I've wondered about that. If we can't compare them straight across, what kind of correction factor should we introduce?

    I attended SF State. It's a masters school that got a reputation score of 3.3 and only made second tier, though I think that it had the highest reputation score in that tier. Frankly, I know of no reason to consider SF State superior to San Diego State, but SD State, which qualifies as a doctoral school, only made fourth tier with a rep score of 2.8.

    That suggests that maybe we should introduce a correction factor of 0.5 between masters and doctoral scores. That would make SF State's undergraduate programs a peer to those at doctoral schools at 2.8, and besides SDSU these include U. Mo. Rolla, U. Arkansas, Catholic U., LSU and Northeastern. I can believe that comparison.
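
    A minimal sketch of that back-of-the-envelope adjustment, using only the two reputation scores quoted above (the 0.5 offset is my conjecture, not anything US News publishes):

        # Conjectured 0.5 offset between the masters and doctoral
        # reputation scales; not a US News figure.
        MASTERS_TO_DOCTORAL_OFFSET = 0.5

        def masters_rep_on_doctoral_scale(rep_score):
            """Map a masters-category reputation score onto the doctoral scale."""
            return rep_score - MASTERS_TO_DOCTORAL_OFFSET

        print(masters_rep_on_doctoral_scale(3.3))  # 2.8 -- level with SDSU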

    Another example is a couple of blocks from my home. College of Notre Dame was a small, top-tier, Catholic masters-level institution. Then it changed its name to Notre Dame de Namur University. Same campus, same students, same classes, same faculty. But its reputation score crashed and it dropped like a stone out of the top tier. I assume that's because all those presidents and deans didn't recognize the new name and figured that if they had never heard of it, it couldn't be any good.

    It also enforces a particular model of an ideal university.

    US News wants high initial selectivity. They want that selectivity measured in terms of high school class rank and SATs, which in turn demands an adolescent student body for whom those measures are relevant. Subsequent to admission, they want full time enrollment, low attrition and high graduation rates.

    If a school offers relatively open admissions to expand educational opportunity, recruits working adults that study part time, then lets difficult coursework weed out students during the course of the program, that school is going to score low in everything that US News is looking for and will inevitably be fourth tier.

    That in turn suggests that many fourth tier schools aren't necessarily bad; it's just that they use the "open university" model that US News hates.
     
  12. chris

    chris New Member

    People pay way too much attention to USN&WR

    A few years back I scheduled a visit with my daughter to Eastern Illinois University. It is a very respected regional university, but it had been in the second tier for many years. However, a week before our visit, the USN&WR rankings came out with EIU listed in the top tier. The day we visited, they had 210% turnout. The vast majority just showed up, many traveling from Chicago. I guess it was a better school after it got the rankings' stamp of approval.
     
  13. Dennis Ruhl

    Dennis Ruhl member

    I love taking courses from a prof who wrote the text. I remember one particular prof who sold 30,000 political science texts, an amazing number for Canada. He was good, and dumb luck put me in his class, which was almost impossible to get into.

    I am sure that at higher-tier schools one is considerably more likely to run into such profs.

    I start another course in September. As I have already taken a course from the prof, I emailed him to ask the name of the text.

    He replied that it was by his favourite author: himself. I emailed back and said that, coincidentally, he was my favourite author too. He replied that he values suck-up points, as they show the right attitude.
     
  14. -kevin-

    -kevin- Resident Redneck

    Dennis,

    "He replied that he values suck-up points as it shows the right attitude."

    Exactly why I prefer DL to on-ground classes: the suck-up factor is reduced, and more objectivity comes into play for your grade.

    Good luck in your class....
     
  15. Dennis Ruhl

    Dennis Ruhl member

    I can't remember all the criteria for the Maclean's magazine rankings in Canada, but one is the average grade of admitted students.

    The average grade of a high school graduate in Alberta is 13% lower than in Ontario, even though Alberta students score higher on standardized exams and half of Alberta students are streamed into non-university entrance programs. There is nothing like the SAT in Canada.

    Another measure is federal research grants received by the school. Alberta is perpetually in political opposition to the federal government, and it is rare for federal money to actually filter west of the Lakehead.

    Result: the first four schools are from Ontario, plus McGill in Montreal.

    Choosing and weighting criteria slightly differently could lead to vastly different results.
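
    That is easy to demonstrate: the sketch below scores three schools under two weighting schemes and gets two different rank orders. All the names and numbers are made up.

        # Same raw data, two weighting schemes, two different rank orders.
        # Schools, criteria and scores are all hypothetical.
        schools = {
            "School A": {"entry_grades": 90, "research_grants": 40},
            "School B": {"entry_grades": 75, "research_grants": 85},
            "School C": {"entry_grades": 80, "research_grants": 70},
        }

        def rank(weights):
            score = lambda s: sum(weights[k] * v for k, v in schools[s].items())
            return sorted(schools, key=score, reverse=True)

        print(rank({"entry_grades": 0.8, "research_grants": 0.2}))
        # ['School A', 'School C', 'School B']
        print(rank({"entry_grades": 0.2, "research_grants": 0.8}))
        # ['School B', 'School C', 'School A']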

    Rankings are interesting on a gross basis, but to say the 1st is better than the 5th is probably not meaningful.

    My school, the U of Alberta, was 5th of 15 doctoral/medical schools. As they are all provincially funded, I doubt the quality of education differs much between the 1st and the 15th.

    Just a note: there are far fewer Canadian universities per capita than in the US, and on average they are much larger. Other than a few small exceptions, they are government funded.
     
  16. Orson

    Orson New Member

    Self-assessment comes into play here. Both Michigan and USC are huge schools. But a private school will offer greater hand-holding, should it be required. It's part of what one pays for. Finding the luxury of such individual attention at a public university is rare and cannot be counted upon. If a student doesn't need it - so much the better. But if a student does and cannot get it - woe!

    Finally, when it comes to unranked and DL schools, anyone who cares can examine the content and grades on the transcript and decide. Falling back upon such details instead of widely held reputations simply takes more time. When a university has a fine, "Harvard"-like reputation, the shorthand of reputation itself saves time, and is therefore economical.

    Another way of putting it: reputation substitutes for status - like saying you are a "Doctor," a member of an esteemed society, to be questioned only for cause. It saves the time that trust-building otherwise requires.

    --Orson
     
  17. PaulC

    PaulC Member

    I have yet to read a list of metrics used to assess the value of the various tiers. For example, what might be measured that could be used to make quantitative comparisons between Tier 2 and Tier 3 schools, or Tier 2 and Tier 4, for that matter? If there is nothing measurably different, then what is the value of any of the qualifiers?

    I can "imagine" the type of metrics that might be used (e.g., average salary of graduates employed in very similar jobs), but I have not seen or heard of the studies that may have produced quantitative data identifying the differences in outcomes.

    Anyone else see this type of data?
     
  18. Mike Albrecht

    Mike Albrecht New Member

  19. BillDayson

    BillDayson New Member

    The big difference that I see is that Carnegie produces a classification, while US News produces rankings.

    Carnegie gives you lists of schools that offer multiple doctoral programs, schools that concentrate on bachelors degrees, art and music specialty schools and so on. It doesn't try to compare the undergraduate programs that are offered by all of these different kinds of schools against its own model of the ideal undergraduate program.
     
  20. Anthony Pina

    Anthony Pina Active Member

    Any time a ranking introduces subjective data, as US News does, it introduces a weakness into the measure (you say "flawed"--you may be right). I agree 100% that the ranking is done for money-making purposes. I think that the most useful part of the edition is the guide to the colleges in the back--some good and useful info is there. However, a college guide isn't nearly as sexy as college rankings.

    Let's face it, we "Americanos" love rankings, often to the point of silliness. Comparisons of how we score compared to other states and other countries generate catchy press.

    As you can tell from my initial post, I am no great fan of rankings. I think that US News makes an honest effort to rank using multiple measures (some subjective, some objective).

    Tony
     
