Author Topic: Only two RP universities made it to the world’s 500 best; but must the rankings be taken seriously?

Scott

lifted from: http://up.edu.ph/features.php?i=87


Only two RP universities made it to the world’s 500 best; but must the rankings be taken seriously? 
Sunday, August 3, 2008
Alicor L. Panao 




No doubt everyone wants to score well in a beauty contest for academic institutions. But should a university’s worth be judged on reputation alone?

In the recently released 2007 Times Higher Education survey, only two Philippine universities made it to the list of the world’s top 500. The University of the Philippines, the national university, was ranked at 398, about a hundred notches down from its 2006 rank of 299. Jesuit-run Ateneo de Manila University came in at 451. The University of Santo Tomas and De La Salle University, both included in the 2006 list, dropped off the radar.

It did not take long for the media reports to elicit reactions from academics, policymakers and alumni. Journalists and columnists pitched their own speculations as to what had been dragging down the country’s universities. The question no one was asking, however, was: what was the basis of the ranking?

Published by The Times Higher Education Supplement in collaboration with Quacquarelli Symonds (THES-QS), the rankings are supposedly meant to serve as “the definitive guide to universities around the world which truly excel.” In evaluating institutions, however, THES-QS computes half of the index from an institution’s reputation as perceived by academics (peer review, 40%) and global employers (recruiter review, 10%). Since it is not specified who was surveyed or what questions were asked, the methodology is obviously vulnerable to manipulation.

Even peers need some standardized input data to review. But according to the October 2007 study International Ranking Systems for Universities and Institutions: A Critical Appraisal, published in BioMed Central, the Times simply asks 190,000 experts to list what they regard as the top 30 universities in their field of expertise, without offering input data on any performance indicators (http://www.biomedcentral.com/1741-7015/5/30). Moreover, the survey response rate among the selected experts was found to be below one percent, which means fewer than 1,900 self-selected respondents effectively determine 40 percent of a university’s score. In other words, on the basis of possible selection biases alone, the validity of the measurement is shaky.

The other half of the index is based on such indicators as student-to-faculty ratio, the number of foreign faculty and students in the university, and the number of academic works by university researchers that have been cited internationally. Data for these indicators, however, typically depend on the information that participating institutions submit. An institution’s index, in other words, may be easily distorted if the institution fails to submit data for the pertinent indicators, or if it chooses not to participate.
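
To make the arithmetic concrete, here is a minimal sketch in Python of how a composite index of this kind behaves. Only the 40-percent peer-review and 10-percent recruiter weights come from the survey as described above; the split of the remaining half across the other indicators, and every score below, are illustrative assumptions. The point is simply how sharply the index moves when the self-reported indicators are missing.

    # Minimal sketch of a THES-QS-style composite score (illustrative only).
    # The 40% peer-review and 10% recruiter weights are from the article;
    # the remaining weights and every score below are assumptions.

    WEIGHTS = {
        "peer_review": 0.40,       # reputation survey among academics
        "recruiter_review": 0.10,  # reputation survey among global employers
        "faculty_student": 0.20,   # student-to-faculty ratio (assumed weight)
        "citations": 0.20,         # international citations (assumed weight)
        "international": 0.10,     # foreign faculty and students (assumed weight)
    }

    def composite_score(indicators, default=0.0):
        """Weighted sum of 0-100 indicator scores.

        When an institution does not submit data for an indicator, the ranker
        must fall back on something; a default of zero is one crude choice,
        and it shows how a non-participant's index gets distorted.
        """
        return sum(w * indicators.get(name, default) for name, w in WEIGHTS.items())

    # A hypothetical university that submitted every indicator...
    full = {"peer_review": 70, "recruiter_review": 65, "faculty_student": 80,
            "citations": 60, "international": 50}

    # ...and the same university with the hard data missing, e.g. because it
    # declined to participate, leaving only the reputation surveys.
    reputation_only = {"peer_review": 70, "recruiter_review": 65}

    print(composite_score(full))             # 67.5
    print(composite_score(reputation_only))  # 34.5 -- same university, half the index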

According to UP Vice President for Public Affairs Dr. Cristina Pantoja Hidalgo, the University of the Philippines put its refusal to participate in the THES-QS survey in writing as early as July 2007, because the Times could not explain where it got the figures on which UP’s 2006 rank was based. In response to UP’s objection, however, Quacquarelli Symonds research assistant Saad Shabbir simply wrote back that if it did not receive the information by the deadline, it would be “forced to use last year’s data or some form of average.”

Apparently for the QS researchers, old data would do—data that had been questioned precisely because its source was dubious. UP has not participated in any international survey of academic institutions since 2000. To date, the country’s National University has neither released official statistics for survey purposes nor consented to any survey undertaking by a local or international body.

Ateneo de Manila University, whose current rank was supposedly an improvement over last year’s, cautions its alumni and the public to view the results with some degree of prudence. In her comments on the 2007 THES-QS survey published on the Ateneo de Manila University website (http://www.ateneo.edu/index.php?p=120&type=2&aid=4489), Vice President for the Loyola Schools Dr. Ma. Assunta C. Cuyengkeng writes that the rankings do not even reflect Ateneo’s vision and mission. Instead of getting distracted, she encourages the Ateneo community to work towards strengthening the leadership and excellence of faculty and students as its contribution to national development.

Interestingly, the THES-QS survey is causing quite a stir among higher education institutions in many countries, not necessarily because of the controversial rankings it publishes annually, but because of what some experts deride as its remarkable flair for making mistakes.

The University of Malaya’s (UM) decline from 89 in 2004 to 169 in 2005, for instance, caused political turmoil in Malaysia and cost one of the university’s Vice Chancellors his career. It turned out that in 2004 QS had counted all the Malaysian Chinese and Malaysian Indians at UM as foreign students, one of the criteria in the rankings. According to a Nov. 18, 2005 news report in The Star (http://www.thestar.com.my/news/story.asp?file=/2005/11/18/nation/12625998&sec=nation), QS was apparently under the impression that a large number of foreigners were studying and teaching at UM, when in fact they were simply Malaysian citizens of Indian and Chinese descent. QS corrected the mistake in 2005, resulting in a steep drop in the university’s ranking.

Meanwhile, between 2004 and 2005 Duke University’s rank rose dramatically in the THES-QS survey, thanks to a nasty clerical error. Duke was listed as having 6,244 faculty members, well beyond what the university had declared on its website. It turned out, according to Prof. Richard Holmes of the MARA University of Technology in Malaysia, that this was the number of undergraduate students enrolled at Duke in the fall of 2005 (http://rankingwatch.blogspot.com/2007/01/more-about-duke-on-january-23rd-i.html). Somebody had evidently copied the figure for undergraduate students into the faculty column, giving Duke four times the number of faculty it actually had.
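
The “four times” figure implies roughly 1,560 actual faculty against the 6,244 undergraduates copied into the faculty column. A quick back-of-the-envelope check in Python (the total student headcount below is a made-up number, purely for illustration) shows how flattering that swap is to a student-to-faculty indicator:

    undergrads_fall_2005 = 6244                  # figure mistakenly recorded as faculty
    actual_faculty = undergrads_fall_2005 // 4   # ~1,561, per the article's "four times" claim
    assumed_students = 13000                     # hypothetical total enrolment, illustration only

    print(round(assumed_students / actual_faculty, 1))        # ~8.3 students per faculty member
    print(round(assumed_students / undergrads_fall_2005, 1))  # ~2.1 -- an implausibly generous ratio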

Quacquarelli Symonds is relatively new in the ranking business, but the impact of its surveys can be far-reaching. Some institutions use the results, methodological flaws notwithstanding, as a basis for distributing funds among departments and for stoking rivalries among institutions. The October 2007 BioMed study cautions that this could actually be more harmful to science and education and may even encourage a global brain drain. If ranking affects funding policies, for instance, institutions and scientists may seek to excel only in the specific criteria used to determine excellence.

There is no debate about the importance of defining, measuring, interpreting and improving institutional excellence. But the international rankings in the THES-QS survey cannot be taken seriously until the survey establishes its credibility with the academic community.

 


Linkback: https://tubagbohol.mikeligalig.com/index.php?topic=14651.0

