WorldWatch
First appeared in print in The Rhinoceros Times, Greensboro, NC
By Orson Scott Card, April 21, 2011

Lists -- and What They Don't Tell Us

Ever since U.S. News & World Report began ranking colleges and universities, I've been a reader -- even though I had nothing at stake.

I had long told my children that while I would happily fund their college education at a serious school, I would not waste a dime on big-name prestige schools. "You won't be getting any classes with the big-name professors at Harvard or Yale or Stanford, not as undergrads," I told them.

"Besides, their reputations are built on the fact that everybody wants to go there, and they can turn away most applicants. So they have a reputation for turning out brilliant students because they only admit students who are already brilliant.

"You go to a school that's actually about teaching instead of getting research grants, and build up their reputation by being brilliant."

It seemed to me, you see, that the reputation of a Harvard, Yale, or Stanford was a self-fulfilling prophecy. Students come out brilliant because they went in brilliant; students come out well-connected because they grew up in well-connected families, which is why they could afford the tuition.

I didn't even begin to understand how right I was.

Back in the February 14 & 21 issue of The New Yorker, Malcolm Gladwell wrote a piece called "The Order of Things," which took on the college rankings to show us exactly what they don't do well: anything at all.

Gladwell begins his article by pointing out how absurd it is to compare unlike things, using Car and Driver's rankings of sports cars as an example. If you don't include price as part of the equation, one car emerges on top. But the differences in quality are minuscule, and the moment you add price as a factor in the comparison, a different car emerges far ahead.

Since price matters to most people, what's wrong with telling them that the markedly cheaper car is actually very nearly as good on all the other points as the cars-that-cost-more-than-your-house?

Then Gladwell moves on to college rankings, but his clearest example comes from Jeffrey Stake's website called "the Ranking Game." Google it, or use this link: http://monoborg.law.indiana.edu/LawRank/

Stake, a professor at Indiana University's law school, plugs in all kinds of data about a lot of law schools, and then lets visitors to the site decide just how much weight to give to each factor.

For instance, if you do what the U.S. News college ranking does, and give equal weight to "academic reputation," "LSAT scores at the 75th percentile," "student-faculty ratio," and "faculty law-review publishing," you get this list:

1. University of Chicago

2. Yale

3. Harvard

4. Stanford

5. Columbia

6. Northwestern

7. Cornell

8. University of Pennsylvania

9. New York University (NYU)

10. University of California, Berkeley

But, just as with the sports cars, price isn't a consideration in that list. Moreover, Gladwell has already demonstrated in his article that "academic reputation" is a bogus category: Academics at one school don't really know all that much about the quality of teaching at other schools. And they're inclined to inflate the "reputation" of the school they themselves graduated from.

So, on the theory that schools ought to be rewarded for being affordable, let's throw out "academic reputation" and replace it with "price," and we get a very different list, as Gladwell points out:

1. Chicago

2. Yale

3. Harvard

4. Stanford

5. Northwestern

So far, quite familiar, yes? But then:

6. Brigham Young

7. Cornell

8. U of Colorado

9. Pennsylvania

10. Columbia

Where did Brigham Young and Colorado come from? Get rid of the student-faculty ratio and rely only on price, LSAT scores of admitted students, and faculty publishing, and you get:

1. Chicago

2. Brigham Young

3. Harvard

4. Yale

5. Texas

6. Virginia

7. Colorado

8. Alabama

9. Stanford

10. Pennsylvania

Think about that for a minute. What if that were the law school list that everybody relied on? Instantly, all the expensive schools would start working to bring down their price.
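
To see how fragile these lists are, here is a minimal sketch (in Python) of the kind of weighted-sum ranking Stake's Ranking Game lets you build. The school names, factor scores, weights, and the affordability scaling below are invented purely for illustration; this is not Stake's data or his actual formula.

# A minimal sketch of a weighted-sum ranking, in the spirit of the Ranking Game.
# All names, scores, weights, and scaling are invented for illustration.
schools = {
    "School A": {"reputation": 95, "lsat_75th": 172, "tuition": 50000},
    "School B": {"reputation": 70, "lsat_75th": 165, "tuition": 22000},
    "School C": {"reputation": 85, "lsat_75th": 170, "tuition": 48000},
}

def rank(weights):
    # Score each school as a weighted sum; cheaper tuition earns a higher
    # "affordability" score so every factor points in the same direction.
    def score(s):
        return (weights.get("reputation", 0) * s["reputation"]
                + weights.get("lsat", 0) * s["lsat_75th"]
                + weights.get("affordability", 0) * (60000 - s["tuition"]) / 1000)
    return sorted(schools, key=lambda name: score(schools[name]), reverse=True)

print(rank({"reputation": 1, "lsat": 1}))      # ['School A', 'School C', 'School B']
print(rank({"lsat": 1, "affordability": 1}))   # ['School B', 'School A', 'School C']

Swap one weight for another and the order flips, which is all Gladwell's point requires: the "right" list depends entirely on which factors you decide to count.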

So now let's go back, as Gladwell did, to the U.S. News ranking of undergraduate schools. The list means nothing unless you understand the criteria. But when Gladwell looked through the criteria, he discovered that the measure of "quality of instruction" includes things like faculty salaries and benefits.

Other categories included the size of the endowment fund, percentage of alumni who donate each year, and how much the school spends per student. In other words, one way to rise on the list is to be a very rich school.

But I don't care about sending my kids to the richest school; I care about sending them to the school that will give them the best education.

Here's where we really run into problems. One strong factor in ranking is how selective a school is. If the students you admit are already the top-ranked American high school students, then how can we tell how much you're actually helping them? In other words, if you only admit the best, all you have to be is adequate to graduate students who are among "the best and the brightest."

Gladwell points out that there's another factor to consider: Efficacy. In other words, how well does the school do at graduating students whose high school and admission test performance puts them at high risk of not graduating from college? If you graduate a higher percentage of these high-risk students, it can reasonably be concluded that you are actually making a difference instead of just capitalizing on the already-high quality of your applicants.

You can't do well on both selectivity and efficacy, Gladwell asserts. How can Yale help anybody when the predicted graduation rate of the students it admits is 96 percent? But Penn State, the least selective of the top fifty schools, does a brilliant job of graduating its more diverse students. They have a predicted graduation rate of 73 percent but an actual graduation rate of 85 percent -- a score of plus-12. They make a difference!

"No other school in the U.S. News top fifty comes close," Gladwell points out. But Penn State is far from number one -- even though it's the most efficacious of the top fifty. (And there are probably many other schools, far from the top fifty, who do a very good job of graduating students who are not predicted to be sure things.)

Then there's the nightmare of academic reputation. Here's why it's absolute nonsense: the people rating different schools' academic reputation are basing their evaluations primarily on ... you guessed it ... the U.S. News ranking!

In other words, the way to do really well on the U.S. News ranking is to do well on the U.S. News ranking last year!

Why even bother to buy the magazine, when the list is as much about money, selectivity, and how it was ranked before as about quality of education?

As an undergraduate, I was a theatre major in a take-anybody theatre program. I learned a lot, sometimes without much cooperation from an irritated faculty (have I ever mentioned what an irritating student I was?).

But now I imagine a theatre program where you can't be admitted as an acting or musical theatre student unless you can prove that you are already so proficient at your art that you don't actually need to study it in college. Very exclusive.

Of course, if you admit only Broadway- or Hollywood-ready students, then yes, you might get a "good reputation" (a high ranking!) but you won't be very efficacious -- you won't be raising the skill level of the students. What's the point? Getting admitted is the primary value. Whereas I actually learned stuff as a student -- which is what I think should be the primary value.

In fact, in the writing classes I teach, if an applicant's sample fiction shows they are already very good at the skills I teach, I refuse to take their money or let them into the class. It would be a waste of their time, and I can use that slot for somebody who actually needs the training I offer.

Exclusivity is really about being able to say that you went to Yale or Harvard or Stanford. It's not about getting a good education.

Personally, I think that the editors at U.S. News would have rejected any ranking system that did not put Yale, Harvard, Stanford, and such at the top of their list. Why? Because their job is selling copies, and if people opened their magazine and found Texas and Alabama and Penn State at the top, and Harvard and Yale well down in the list, they'd assume that the list was absurd ...

Because "everybody knows" that what the top schools are, even if they aren't actually all that good at what they do.

For all I know, endowment size, faculty salaries, exclusivity, and reputation weigh so heavily in the U.S. News rankings, and efficacy counts for so little, precisely because until the list was set up this way, it produced too many shocking surprises.

The most ethereally beautiful woman I ever saw was a teenage girl I happened to see as I walked past her garden on a street in Campinas, Brazil, in 1972. But you've never heard of her, so she has never been mentioned in People Magazine's most-beautiful-woman issue. (Also, People Magazine did not exist then.)

My point is that unless people already agree with your ranking, you won't sell copies of your rankings. I don't know about you, but I always skipped over the top-ranked schools -- there were no surprises there -- and looked for schools I actually knew, to see where they were this year.

And you know what? They were always in the same place.

Now let me turn to yet another article about rankings. This is Scott W. Atlas's article "The Worst Study Ever?" in the April 2011 issue of Commentary. In this article, Atlas closely examines the World Health Organization's World Health Report 2000, which "ranked the health-care systems of nearly 200 nations."

The reason this study matters is that it ranked the U.S. health-care system as being substantially behind dozens of other countries.

This study, treated as if it were actually scientific, has been the basis of some pretty sweeping legislation designed to help us "catch up" -- including, of course, Obamacare.

But there are some deep and serious flaws in these rankings. For instance, our life expectancy was shockingly low compared to other countries'. What they neglected to point out was that they were including deaths by fatal injury right along with deaths from natural causes.

We're the most automobilized nation on Earth, by far, and cars kill. Also, we have this weird thing about guns. It's absurd to rank national health care "as if an ideal health-care system could turn back time to undo car crashes and prevent homicides," Atlas says.

If you remove fatal injuries from the list, so that you're only comparing deaths from natural causes, guess which country has the highest life expectancy in the world? No, really, guess.

Yeah. The United States.

But here's where the U.S. News lesson comes into play. What mileage would the WHO have gotten from a report on world health that ranked America as number one in life expectancy? None!

Nobody in countries with socialized medicine wants to hear that when you let people sort of have some kind of voice in their health care choices (it's not as if in this nation of HMOs we're actually in a free market), you get better results.

If America is ranked number one, you might as well not publish your study, for all the attention it will get. Only if America can be pointed out as a "failure" is your study going to get a lot of attention, justifying your existence.

And here's how they did it, according to Atlas. First, they based their rankings mostly on internal WHO documents and evaluations that had never gone through any kind of scientific vetting. They only had complete data from 35 of the 191 countries -- but somehow they generated rankings for all of them. Also, most of the studies they worked from were written by the people creating the report -- they were citing themselves!

In other words, they were free to make stuff up. No actual scientific research organization would ever have been able to get away with such practices. In fact, nobody on a university faculty would have been allowed to remain employed if this were how they did "research."

Also, among the criteria for determining a country's ranking were its scores on "health inequality," "responsiveness," and "fair financing."

What do you think that means? Yep: Free points for already having socialized medicine!

So let's see -- if you knock down a country for not having socialized medicine, then it seems likely that countries with socialized medicine will lead the list, regardless of the relative quality of the health care.

There is no way America was going to get a fair ranking in a mess like this.

Yet this study has been cited over and over as justification for treating American health care as a desperate failure that has to be saved ... by making it more like countries that have worse health care than we do.

U.S. News has a pretty much useless ranking of universities because its results reaffirm prejudices and sell copies.

But to base American health care policy and spend billions of dollars on the basis of "facts" that were generated by methods designed to give a bad report on the American system is either stupid, negligent, or deceitful. Which is a pretty good description of a significant percentage of American politicians, many of whom embody all three attributes.

Meanwhile, Obama's people disparage all critics of these "facts" the way eco-puritans disparage critics of the absurd claims about global warming: They're "ideological," when in fact the critics are precisely the ones who are not ideological, unless you mean the ideology of "accuracy" and "scientific rigor."

It's standard operating procedure. Accuse your opponents of precisely the things that you yourself are guilty of.

But you're smart people. You can look up these articles and read them for yourself. You can also look up their sources and read them. Then you can decide for yourself whose statements are reliable and who is ideology-driven.
