When Deepti Chandrasekaran, an MBA student from India, was researching which business schools to apply to, one of the first places she turned was the many MBA rankings lists. And one in particular caught her eye.
"I looked at the rankings as a starting point. Queen's was on top in the Business Week [international] rankings. I didn't really look at the data the different surveys included. It was just finding out who ranked well and then researching from there," she recalls.
And while Queen's School of Business might have been a beneficiary in this case - Ms. Chandrasekaran, 25, is now in its MBA program - the Kingston, Ont. school (and indeed many of its competitors) views the rankings game with distinctly mixed feelings.
Competition is hotter than ever between the world's business schools to attract the brightest students and the attention of wealthy benefactors. And rankings have become a fact of life for academics who find themselves participating, albeit often reluctantly, in highly publicized surveys they hope will put their MBA programs in the spotlight.
Meanwhile, MBA and EMBA rankings have turned into a highly lucrative industry for the major media outlets that produce them, including Business Week, Forbes, The Financial Times, The Wall Street Journal and The Economist, as well as many non-English publications.
But MBA rankings can bring as much confusion as clarity.
Different rankings use very different survey tools, data and methodologies. A particular school may be placed several spots below another with only a fraction of a percentage point separating them. Or, like Queen's, for example, a school may sometimes take itself out of the running intentionally.
While Queen's is participating in the Business Week MBA rankings (where it places first among non-U.S. schools), the school is not currently involved with the Financial Times list of full-time MBAs, widely considered to be one of the most authoritative.
"The Financial Times survey is focused on salary," says Queen's School of Business Dean David Saunders. "We completely redesigned our program three years ago and our program is not structured to concentrate on salary. I have never met a student in 26 years who said the primary reason they were attending an MBA program is to increase their salary."
Dr. Saunders says they chose to participate in Business Week "after we designed our program, because [the survey] is based on satisfaction. It asks if you got what you wanted, are you happy with it." As to whether satisfaction can be equated with quality, "it does correlate at some point," Dr. Saunders says, but adds: "it's only one piece."
Accreditation is a more important gauge of MBA quality, according to most school officials, but there's no doubt rankings play a role in attracting new students (especially those from abroad) and maintaining a warm glow of satisfaction among alumni.
Ms. Chandrasekaran says rankings played one key role: helping her create a shortlist of schools she'd like to check out for her graduate degree.
After that, she says, her "real research was in speaking to alumni, reading reports about the schools, reading blogs about curriculum, faculty, length of the program."
She says she views rankings as "a lot of PR and marketing because there's a huge market for MBA students." That said, she admits she would "feel badly if we fell in rank. Students do know that we're on top in Business Week and we're happy about it. We would definitely want to maintain our ranking."
This seeming contradiction - the rankings don't really mean much, but we want (or need) to be among them - sums up how many at MBA schools view the rankings.
On one end of a critical continuum is Prof. Jerold Zimmerman of the University of Rochester's Simon Graduate School of Business. As co-author of a 2005 research paper entitled What's Really Wrong With U.S. Business Schools, Prof. Zimmerman took aim at the growing influence MBA rankings were having on business schools and the danger they represented.
Business schools, he wrote, "are locked in a dysfunctional competition for rankings that diverts resources - faculty and administrative time, money and energy - from research and from undergraduate and PhD programs into short-term strategies aimed at improving their competitive position."
In strong language, Prof. Zimmerman and his co-authors go on to outline the illegitimacy of MBA rankings: not only do they drain resources, they shape what is taught, who schools admit to programs and how long deans remain in their posts. And they're full of bias.
Far from being scientific in nature, the rankings amount to "statistical noise," Prof. Zimmerman says.
Five years later, he says the situation has only gotten worse.
There are more surveys and more schools "that believe they can play in the big leagues." The surveys gather the wrong information, he believes, offer it in one-size-fits-all form, and decide on the reader's behalf how it gets weighted.
"If this were a survey of automobiles it would rate things like fuel consumption, acceleration rates, safety," he says. "If you're on a budget you might weight fuel consumption higher than acceleration. On the other hand if you're a mother of two young kids that you have to drive around, safety might be your priority."
University of Alberta business professor Cheryl McWatters is familiar with her colleague's critique and says she's substantially in agreement with Prof. Zimmerman.
"I see rankings used as more of a recruitment strategy," she says, "but are they valid? I've always been suspicious about how schools can be so high on one survey and so low on another. Who is giving them the data and does it really tell you if a school is good?"
Prof. McWatters says she worries that the scarce resources spent on ranking participation drain money away from doctoral programs, thus having an impact on the pool of qualified teachers and on whether academic research is funded.
Like many of her Canadian colleagues, Prof. McWatters believes that competition is fiercer in the U.S. and the rankings are taken more seriously there. However, she sees the creeping influence they are having here.
"In some ways, they don't really matter. But they do matter because we've made them matter," she says, adding that because other schools are there, it becomes important to be "on the list."
"It's a matter of image and of branding. It's not a matter of quality."
She agrees that the prime value of rankings is in a school "making the first cut" when students are researching MBA programs.
The associate dean of Simon Fraser University's Segal Graduate School of Business says the school tries to focus on other markers of quality to demonstrate its worth to students and to distinguish its program.
"It's true that students do ask about our rankings," says Edward Bukszar. "Is it credible information? Sure, but it's only one piece of information. The schools that are ranked well are all pretty good schools, but are they the only good schools? No."
Prof. Bukszar says Simon Fraser puts on information sessions, attends MBA fairs and increasingly uses blogging. He acknowledges that "it's complicated for students to figure out all the differences between programs without some kind of simplifying method. Rankings provide some of that simplification."
Or, as Queen's dean Dr. Saunders characterizes it, "Rankings are fast food and we are a full-service restaurant."
SOME OF THE RANKINGS
Business Week and the Financial Times MBA rankings are two among a growing array of MBA rankings. Each measures different things and has its own criteria for inclusion, minimum response rates and methodologies. Comparisons have been described as between apples, oranges and grapes - in other words, take each on its own merits.
The Financial Times surveys 150 MBA schools, of which 11 are Canadian, with six Canadian schools making the top 100. The survey of alumni is done three years after graduation and looks at salary growth and career progress. The Financial Times has a broad scope of measurement and a complex system of weighting data.
Business Week focuses its rankings survey on two groups: students and corporate recruiters. These results account for 90 per cent of the basis for the rankings with an additional 10 per cent based on what they term "intellectual capital" - primarily faculty publication in 20 academic journals and other books and articles.
The Economist's rankings are based on data culled from a survey of 135 pre-selected schools and their students, measuring career opportunities, personal development, educational experience, increase in salary, and potential to network.
Other rankings include those from U.S. News and World Report, The Wall Street Journal and Forbes.
Special to The Globe and Mail