Why I don't trust the rankings


teddykgb

This is why I don't trust the rankings. Quotes are taken from "Behind the Business Week rankings" on businessweek.com.

"...we asked more than 85,000 graduating seniors at those schools to complete a 50-question survey on everything from the quality of teaching to recreational facilities."

FACT: The rankings rely heavily on student surveys. How can students possibly be objective about their own school? Wouldn't this be a bit like asking Joe-Bob, the proud owner of a Buick Regal, how he likes it? Joe-Bob's not going to say, "I invested my hard-earned money in this Buick Regal after researching all my options, and I've come to the conclusion that it was a bad investment." More importantly, how could a student possibly know whether his program is better than another? It's not as though, as with cars, he can "test drive" different programs.

"In addition to surveying students, BusinessWeek polled 580 corporate recruiters for companies that hire thousands of business majors each year. We asked them to tell us which programs turn out the best graduates, and which schools have the most innovative curriculums and most effective career services. In the end, 194 recruiters responded, a response rate of about 33%."

FACT: Only 33% of business recruiters responded to the polls behind the BusinessWeek rankings. What happened to the other 67% of the recruiters? Did they say, "It's not about where the candidate went to school; it's the candidate that matters. Don't bother getting an MBA"? If so, I would like to know. A 33% response rate could never give an accurate representation of the competitiveness of the job market, and these statistics cannot possibly account for variations in recruiters' values or areas of focus.
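For what it's worth, the stated rate does check out arithmetically. A quick sketch, using only the two numbers from the quote above:

```python
# Figures quoted from BusinessWeek's methodology description.
polled = 580       # recruiters polled
responded = 194    # recruiters who actually answered

rate = responded / polled
print(f"Response rate: {rate:.1%}")                 # -> 33.4%
print(f"Recruiters who never answered: {polled - responded}")  # -> 386
```

So 386 recruiters, two out of every three asked, are simply missing from the data.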

"To learn which schools' students land the best-paying jobs, we asked each institution to tell us the median starting salary for their most recent graduating class. In addition, we drew on our 2004, 2006, and 2008 MBA surveys to create a "feeder school" measure showing which schools send the most grads to the 35 top MBA programs identified in previous BusinessWeek rankings.

"Finally, we created an academic quality gauge of five equally weighted measures. From the schools themselves, we obtained average SAT scores, the ratio of full-time faculty to students, and average class size. The student survey supplied the percentage of business majors with internships and the hours students spend every week on school work."
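To make "five equally weighted measures" concrete: presumably each measure is put on a common scale and the five are then averaged. Here is a minimal sketch of that idea with made-up numbers for three hypothetical schools. The z-score standardization (and the negation of class size so that smaller classes score higher) is my assumption, since BusinessWeek doesn't say how it normalizes the raw figures:

```python
from statistics import mean, stdev

def standardize(values):
    """Put one measure's raw values on a common scale (z-scores)."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

# Made-up raw data for three hypothetical schools, one list per measure:
# average SAT, faculty-to-student ratio, class size (negated so that
# smaller classes score higher), % with internships, weekly study hours.
measures = {
    "avg_sat":        [1350, 1280, 1410],
    "faculty_ratio":  [0.08, 0.05, 0.10],
    "neg_class_size": [-35, -60, -28],
    "internship_pct": [0.72, 0.55, 0.81],
    "weekly_hours":   [14, 11, 16],
}

# "Five equally weighted measures": each school's gauge is the plain
# average of its five standardized scores.
standardized = [standardize(v) for v in measures.values()]
gauges = [mean(per_school) for per_school in zip(*standardized)]
print([round(g, 2) for g in gauges])
```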

These benchmarks seem more reasonable, and ultimately more objective, than the others. I would be much more willing to trust a ranking system that relied solely on class sizes, SAT scores, and graduates' salaries. Unfortunately, BusinessWeek gives no indication of how much weight these factors carry in the overall rankings.
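And the weights matter: with undisclosed weights, the same underlying data can produce opposite orderings. A toy illustration, with invented scores and invented weightings:

```python
# Invented scores for two schools on three of the ranking inputs,
# all on a 0-100 scale.
schools = {
    "School A": {"student_survey": 90, "recruiter_survey": 60, "academic": 70},
    "School B": {"student_survey": 65, "recruiter_survey": 85, "academic": 75},
}

def overall(scores, weights):
    """Weighted sum of a school's component scores."""
    return sum(scores[k] * w for k, w in weights.items())

survey_heavy = {"student_survey": 0.6, "recruiter_survey": 0.2, "academic": 0.2}
balanced     = {"student_survey": 0.2, "recruiter_survey": 0.4, "academic": 0.4}

for name, scores in schools.items():
    print(name,
          round(overall(scores, survey_heavy), 1),
          round(overall(scores, balanced), 1))
# School A wins under the survey-heavy weights (80.0 vs 71.0);
# School B wins under the balanced ones (77.0 vs 70.0).
```

Without knowing the weights, a reader has no way to tell which of these two "number one schools" the published ranking is actually showing.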

And the last reason I don't trust the rankings is that they're financially motivated. If the best business schools have been the best schools over the course of, say, a decade, what incentive would BusinessWeek and other publications have to publish new rankings from year to year? The inherent flaw in annual publications is that they need to shuffle the results around enough to keep each issue fresh enough to sell.

saint

Your remarks about the rankings are so correct; the whole business education world has gone crazy, like the Premier League football in the UK. I truly believe that we are misled, and that the best way to select a school is to visit it, sit in on a class, speak with current students and alumni and ask them how they would improve their programme if they were in charge, make sure the school has AMBA accreditation, and finally see if you are heard, listened to, and treated as a potential candidate. My best friend works in the Netherlands as a recruiter at one of the top 3 business schools, and the advice he gives me is very inspiring and comforting to hear, with no pressure to apply.
Saint

