Friday, March 23, 2018

More evidence of the rise of China

A regular story in the ranking world is the rise of Asia, usually as a warning to stingy Western governments who fail to give their universities the money that they desperately need to be world-class.

Sometimes the rise of Asia turns out to be nothing more than a methodological tweaking or a bug that allows minor fluctuations to be amplified. Asia often turns out to be just East Asia or sometimes even just Shanghai and Peking. But it still remains true that China, followed perhaps by South Korea, Taiwan, Singapore and Hong Kong, is steadily becoming a scientific superpower and that the USA and Europe are entering a period of relative decline.

This blog has already noted that China has overtaken the West in supercomputing power and in the total output of scientific publications.

David Goldman of Asia Times, writing in Breitbart, has reported another sign of the rise of China: its output of doctorates in STEM subjects is well ahead of the USA's. And we should remember that many of the doctorates awarded in the US go to Chinese nationals or people of Chinese descent who may or may not remain in the country.

“What I’m concerned about is the fact that China is testing a railgun mounted on a navy ship before the United States is and that China has the biggest quantum computing facility in the world about to open,” said Goldman. “It probably has more advanced research in quantum communications than we have, and they’re graduating twice as many doctorates in STEM fields than we are. That’s what really frightens me.”

There are, of course, some areas where US researchers reign supreme such as gaming research and gender, queer and trans studies. But I suspect that is not something that will help the US win the coming trade wars or any other sort of war.

Tuesday, March 13, 2018

Anglia Ruskin University sued for awarding Mickey Mouse degrees

Pok Wong, or Fiona Pok Wong, a graduate of Anglia Ruskin University (ARU) in Cambridge, is suing for 60,000 pounds, alleging breach of contract, fraudulent misrepresentation, and false imprisonment following a protest at the graduation ceremony.

ARU has appeared in this blog before following its spectacular performance in the research impact indicator in the THE world rankings. It has had the common sense to keep quiet about this rather quirky result.

Ms Wong has claimed that her degree in International Business Strategy was just a "Mickey Mouse" degree and that the teaching was of poor quality, with one lecturer coming late and leaving early and sometimes even telling the students to self-study in the library. She is reported to claim that "since graduating ... it has been proven that the degree ... does not play a role to help secure a rewarding job with good prospects."

It seems that in 2013 she had a job as a Financial Planner with AIA International so her degree from ARU did not leave her totally unemployable. Between 2013 and 2016 she studied for Graduate Diplomas in Law and Paralegal Legal Practice at BPP University College of Professional Studies, which does not appear in the national UK rankings but is ranked 5,499th in the world by Webometrics.

I doubt that the suit will succeed. It is of course regrettable if ARU has been lax about its teaching quality but whether that has much to do with Ms Wong not getting the job she thinks she deserves is debatable. ARU is not among the elite universities of England and its score for graduate employment is particularly bad. It is not a selective university so the question arises why Ms Wong did not apply to a better university with a better reputation.

The university would be justified if it pointed out that publishing photos proclaiming "ARU sucks" may not be the best way of selling yourself to potential employers.

If she does succeed it would be a disastrous precedent for British universities, which would be vulnerable to claims from every graduate who failed to get suitable employment, or any employment at all.

But the affair should be a warning to all universities to be careful about the claims they make in advertising their products. Prospective students should also take a critical look at the data in all the indicators in all the rankings before handing over their tuition fees.

Monday, March 12, 2018

Salaries, rankings, academic quality, racism, sexism, and heightism at Rensselaer Polytechnic Institute

From time to time the question of the salaries of university administrators resurfaces. Last August the salary of the yacht- and Bentley-owning vice-chancellor of the University of Bolton in the UK received national prominence. His salary of GBP 260,500, including pension contributions and healthcare benefits, seemed to have little relationship to the quality of the university, which was not included in the QS and THE world rankings and managed a rank of 1,846 in Webometrics and 2,106 in University Ranking by Academic Performance (URAP). A poll in the local newspaper showed 93% of respondents opposed to the increase.

A previous post in this blog reported that vice-chancellors' salaries had no statistically significant relationship to student satisfaction in the UK, although average faculty salaries and the number of faculty with teaching qualifications did.

This issue has cropped up in the US where it has been noted that the highest paid university president is Shirley Ann Jackson of the Rensselaer Polytechnic Institute (RPI).

She has come under fire for being overpaid, autocratic and allowing RPI to go into academic decline. Her supporters have argued that her critics are guilty of residual racism, sexism and even heightism. A letter in the Troy Times Union from David Hershberg uses the Times Higher Education (THE) world rankings to chastise Jackson:

"RPI was always in the top 5 of undergraduate engineering schools. Now it's No. 30 in U.S. News and World Report's latest rankings. Despite the continued loss of stature of my alma mater, the school's president, Shirley Ann Jackson, is the highest paid president of a university by far and on some 10 other boards that supplement her $7 million salary and other compensation. This is RPI's rankings the last eight years in the Times Higher Education World University Rankings: 2011, 104; 2012, 144; 2013, 174; 2014, 181; 2015, 226-250; 2016, 251-300; 2017, 251-300; and 2018, 301-350. Further, U.S. News & World Report has RPI at No. 434 globally and No. 195 engineering school. This warrants a change at the top. This is what matters, not gender or race."

It seems that for some people in the USA international rankings, especially THE's, have become the measure of university excellence.

First, it must be said that the THE World University Rankings are not a good measure of university quality.  These rankings have seen dramatic rises and falls in recent years. Between 2014-15 and 2015-16, for example, Middle East Technical University (METU) in Ankara fell from 85th place to the 501-600 band while many French, Japanese, Korean and other Turkish universities fell dozens of places. This had nothing to do with the quality of the universities and everything to do with methodological changes, especially to the citations indicator.

The verdict of the US News America's Best Colleges rankings is simple. RPI was 42nd in 2007 and it is 42nd in the 2018 rankings, although apparently alumni giving has gone down.

Comparing data from US News in 2007 and 2015, RPI is more selective with more applicants of whom a smaller proportion are admitted. SAT scores are higher and more students come from the top 10% of their high school. There are more women and more international and out of state students.

The school may, however, have become less equitable. The percentage of Black students has fallen from 4% to 2% and that of students needing financial aid from 70% to 65%.

As a national university with an undergraduate teaching mission RPI is certainly not declining in any sense although it may be less welcoming for poor and Black students and it is definitely becoming more expensive for everybody.

The international rankings, especially those based on research, tell a different story. RPI is slipping everywhere: from 243 in 2014 to 301 in 2017 in the CWUR rankings, from 589 in 2010-11 to 618 in 2017 in URAP, from 341 in 2013 to 390 in 2017 in Nature Index, from 128 in 2010 to 193 in 2017 in the Round University Rankings.

In the Shanghai rankings, RPI fell from the 151-200 band to the 501-600, partly because of the loss of a couple of highly cited researchers and the declining value of a Nobel-winning alumnus.

RPI's fall in the global rankings is largely a reflection of the general decline of the US and the rise of China, which has overtaken the US in research output and supercomputing. But there is more. In the indicator that measures research quality in the CWTS Leiden Ranking, the percentage of papers among the world's 10% most cited, RPI has fallen from 23rd in 2011-12 to 194th in 2017.

It seems that RPI is holding its own or a bit more as an American teaching university. Whether that is worth the biggest salary in the country is for others to argue about. But it is definitely losing out to international competition as far as research quality is concerned. That, however, is an American problem and RPI's difficulties are hardly unique.

Friday, March 09, 2018

Rankings and the financialisation of higher education

University rankings are now being used for purposes that would have been inconceivable a decade ago. The latest is supporting the large scale borrowing of money by UK universities.

The Financial Times has an interesting article by Thomas Hale about the growing financialisation of British higher education. He reports that some universities such as Portsmouth, Bristol, Cardiff and Oxford are resorting to capital markets for financing supposedly because of declining government support.

The University of Portsmouth has borrowed GBP 100 million from two North American institutional investors, with Lloyds acting as placement agent and PricewaterhouseCoopers (PwC) as advisor.

"The money will be spent on the first phase of “estate development”. It is expected to involve a number of buildings, including an indoor sports facility, the extension of a lecture hall, and a flagship “teaching and learning building”."

It seems that this is just part of a larger trend.

"The private placement market – by definition, more opaque than its public counterpart — is a particularly attractive option for universities, and a popular target of investment for US pension and insurance money seeking long-term projects. Lloyds estimates that more than £3bn has been borrowed by UK universities since 2016 on capital markets, with around half of that coming via private placements.
The market is small by the standards of capital markets, but significant in relation to the overall size of the country’s higher education sector, which has a total annual income of close to £30bn, according to the Higher Education Funding Council for England. "

The press release explicitly referred to Portsmouth as being first in the UK for boosting graduate salaries, by which is meant earning above expectations based on things like social background and exam results. That could reflect credit on the university although a cynic might wonder whether that is just because expectations were very low to start off with. In addition, the university is ranked 37th among UK universities in the Guardian University Guide and in the top 100 in the Times Higher Education (THE) Young Universities Rankings.

If millions of pounds have been advanced in part because of a 98th place in the THE young universities rankings, that might not be a wise decision. These rankings are quite credible for the top 20 or 30, but go a little further down and in 74th place you find Veltech University in India, which has a perfect score for research impact based entirely on the publications of exactly one serial self-citer.

The profile of the University of Portsmouth shows a fairly high score for citations and a low one for research, which is often a sign that its position has little to do with research excellence and more to do with getting into high-citation, multi-author astrophysics and medical projects. That does appear to be the case with Portsmouth and it could mean that the university's place in the young university rankings is precarious since it could be undermined by methodological changes or by the departure of a few highly cited researchers.

The role of PwC as advisor is interesting since that company is also charged with auditing the THE world rankings.

Tuesday, February 27, 2018

Are the rankings biased?

Louise Richardson, vice-chancellor of the University of Oxford, has published an article in the Financial Times proclaiming that British universities are a national asset and that their researchers deserve the same adulation as athletes and actors.

"Listening to the public discourse one could be forgiven for thinking that the British higher education system is a failure. It is not. It is the envy of the world."

That is an unfortunate phrase. It used to be asserted that the National Health Service was the envy of the world.

She cites as evidence for university excellence the Times Higher Education World University Rankings which have three British universities in the world's top ten and twelve in the top one hundred. These rankings also, although she does not mention it here, put Oxford in first place.

There are now, according to IREG, 21 global university rankings. One wonders why a world-class scholar and head of a world-class university would choose rankings that regularly produce absurdities such as Anglia Ruskin University ahead of Oxford for research impact and Babol Noshirvani University of Technology its equal.

But perhaps it is not really surprising, since of those rankings THE is the only one to put Oxford in first place. In the others its rank ranges from third in the URAP rankings, published in Ankara, to seventh in the Shanghai Rankings (ARWU), Webometrics (WEB), and the Round University Ranking (RUR) from Russia.

That leads to the question of how far the rankings are biased in favor of universities in their own countries.

Below is a quick and simple comparison of how top universities perform in rankings published in the countries where they are located and in other rankings.

I have looked at the rank of the top scoring home country university in each of eleven global rankings and then at how well that university does in the other rankings. The table below gives the overall rank of each "national flagship" in the most recent eleven global university rankings. The rank in the home country rankings is in red.

We can see that Oxford does better in the Times Higher Education (THE) world rankings, where it is first, than in the others, where its rank ranges from 3rd to 7th. Similarly, Cambridge is the best performing UK university in the QS rankings, where it is 4th. It is also 4th in the Center for World University Rankings (CWUR), now published in the UAE, and 3rd in ARWU. In the other rankings it does less well.

ARWU, the US News Best Global Universities (BGU), Scimago (SCI), Webometrics (WEB), URAP, the National Taiwan University Rankings (NTU), and RUR do not seem to be biased in favour of their country's flagship universities. For example, URAP ranks Middle East Technical University (METU) 532nd which is  lower than five other rankings  and higher than three.

CWUR  used to be published from Jeddah in Saudi Arabia but has now moved to the Emirates so I count the whole Arabian peninsula as its home. The top home university is therefore King Saud University (KSU), which is ranked 560th, worse than in any other ranking except for THE.

The GreenMetric Rankings, produced by Universitas Indonesia (UI), have that university in 23rd place, which is very much better than any other.

It looks like THE, GreenMetric and, to a lesser extent QS, are biased towards their top home country institutions.

This only refers to the best universities and we might get a different result by looking at all the ranked universities.

There is a paper by Chris Claassen that does this although it covers fewer rankings.


Tuesday, February 20, 2018

Is Erdogan Destroying Turkish Universities?

An article by Andrew Wilks in The National claims that the position of Turkish universities in the Times Higher Education (THE) world rankings, especially that of Middle East Technical University (METU) has been declining as a result of the crackdown by president Erdogan following the unsuccessful coup of July 2016.

He claims that Turkish universities are now sliding down the international rankings and that this is because of the decline of academic freedom, the dismissal or emigration of many academics, and a decline in the country's academic reputation.

'Turkish universities were once seen as a benchmark of the country’s progress, steadily climbing international rankings to compete with the world’s elite.
But since the introduction of emergency powers following a failed coup against President Recep Tayyip Erdogan in July 2016, the government’s grip on academic freedom has tightened.
A slide in the nation's academic reputation is now indisputable. Three years ago, six Turkish institutions [actually five] were in the Times Higher Education’s global top 300. Ankara's Middle East Technical University was ranked 85th. Now, with Oxford and Cambridge leading the standings, no Turkish university sits in the top 300.
Experts say at least part of the reason is that since the coup attempt more than 5,800 academics have been dismissed from their jobs. Mr Erdogan has also increased his leeway in selecting university rectors.
Gulcin Ozkan, formerly of Middle East Technical University but now teaching economics at York University in Britain, said the wave of dismissals and arrests has "forced some of the best brains out of the country".'
I have no great regard for Erdogan but in this case he is entirely innocent.

There has been a massive decline in METU's position in the THE rankings since 2014 but that is entirely the fault of THE's methodology. 

In the world rankings of 2014-15, published in 2014, METU was 85th in the world, with a whopping score of 92.0 for citations, which carries an official weighting of 30%. That score was the result of METU's participation in the Large Hadron Collider (LHC) project, which produces papers with hundreds or thousands of authors and hundreds or thousands of citations. In 2014 THE counted every single contributor as receiving all of the citations. Added to this was a regional modification that boosted the scores of universities located in countries with a low citation impact score.

In 2015, THE revamped its methodology by not counting the citations to these mega-papers and by applying the regional modification to only half of the research impact score.

As a result, in the 2015-16 rankings METU crashed to the 501-600 band, with a score for citations of only 28.8. Other Turkish universities had also been involved in the LHC project and benefited from the citations bonus and they too plummeted. There was now only one Turkish university in the THE top 300.

The exalted position of METU in the THE 2014-15 rankings was the result of THE's odd methodology and its spectacular tumble was the result of changing that methodology. In other popular rankings METU seems to be slipping a bit, but it never goes as high as it did in THE in 2014 or as low as in 2015.

In the QS world rankings for 2014-15 METU was in the 401-410 band and by 2017-18 it had fallen to the 471-480 band.

The Russian Round University Rankings have it at 375 in 2014 and 407 in 2017. The US News Best Global Universities placed it 314th last year.

Erdogan had nothing to do with it.

Saturday, February 17, 2018

It's happened: China overtakes USA in scientific research

Last November I noted that the USA was barely managing to hold onto its lead over China in scientific research as measured by articles in the Scopus database. At the time, there were 346,425 articles with a Chinese affiliation and 352,275 with a US affiliation for 2017.

As of today, there are 395,597 Chinese and 406,200 US articles dated 2017.

For 2018 so far, the numbers are 53,941 Chinese and 49,428 US.

There are other document types listed in Scopus and the situation may change over the course of the year.

Also, the United States still has a smaller population so it maintains its lead in per capita research production. For the moment.
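Using the Scopus counts quoted above, the per-capita gap is easy to see. A minimal sketch (the population figures, in millions, are my own approximate 2017 estimates, not from Scopus):

```python
# Rough per-capita comparison of 2017 article output.
# Article counts are the Scopus figures quoted above; population figures
# (in millions) are approximate 2017 estimates and are my own assumption.
articles = {"China": 395_597, "USA": 406_200}
population_m = {"China": 1_386, "USA": 325}

for country in articles:
    per_million = articles[country] / population_m[country]
    print(f"{country}: {per_million:.0f} articles per million people")
```

By this crude measure the US still publishes several times more articles per head, which is why the per-capita lead will take much longer to erode than the lead in total output.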

Sunday, February 11, 2018

Influence of Rankings on State Policy: India

In case you are wondering why the Indian media get so excited about the THE and QS rankings and not about those that are just as good or better such as Leiden Ranking, RUR or Nature Index, see this from the University Grants Commission.

Note that it says "any time" and that only the Big Three rankings count for getting Assistant Professor jobs.

"NEW DELHI:  University Grants Commission (UGC) has come up with, UGC Regulations 2018, which exempts PhD candidates from having NET qualification for direct recruitment to Assistant Professor post. This new draft regulation is known as Minimum Qualifications for Appointment of Teachers and Other Academic Staff in Universities and Colleges and Measures for the Maintenance of Standards in Higher Education. Further the Commission has also listed 'Ph.D degree from a university/ institution with a ranking in top 500 in the World University ranking (at any time) by Quacquarelli Symonds (QS), the Times Higher Education (THE) and Academic Ranking of World Universities (ARWU) of the Shanghai Jiao Tong University (Shanghai),' as one of the criteria for Assistant Professor appointment."

Friday, February 09, 2018

Playing the Rankings Game in Pakistan

This article by Pervez Hoodbhoy from October 2016 is worth reading:

"A recently released report by Thomson-Reuters, a Canada based multinational media firm, says, “In the last decade, Pakistan’s scientific research productivity has increased by more than 4 times, from approximately 2000 articles per year in 2006 to more than 9000 articles in 2015. During this time, the number of Highly Cited Papers (HCPs) featuring Pakistan based authors increased tenfold from 9 articles in 2006 to 98 in 2015.”
This puts Pakistan well ahead of Brazil, Russia, India, and China in terms of HCPs. As the reader surely knows, every citation is an acknowledgement by other researchers of important research or useful new findings. The more citations a researcher earns, the more impact he/she is supposed to have had upon that field. Research evaluations, through multiple pathways, count for 50-70 percent of a university’s ranking (if not more).
If Thomson-Reuters has it right, then Pakistanis should be overjoyed. India has been beaten hollow. Better still, two of the world’s supposedly most advanced countries–Russia and China–are way behind. This steroid propelled growth means Pakistan will overtake America in just a decade or two.
But just a little analysis shows something is amiss. Surely a four-fold increase in scientific productivity must have some obvious manifestations. Does one see science laboratories in Pakistani universities four times busier? Are there four times as many seminars presenting new results? Does one hear animated discussions on scientific topics four times more frequently?
Nothing’s visible. Academic activity on Pakistani campuses might be unchanged or perhaps even less today, but is certainly not higher than ten years ago. So where–and why–are the authors of the HCP’s hiding? Could it be that these hugely prolific researchers are too bashful to present their results in departmental seminars or public lectures? The answer is not too difficult to guess."

Thursday, February 08, 2018

Should Pakistan Celebrate the Latest THE Asian Rankings?

This is an updating and revision of a post from a few days ago

There appears to be no end to the craze for university rankings. The media in many parts of the world show almost as much interest in global university rankings as in the Olympics or the World Cup. They are now used to set requirements for immigration, to choose research collaborators, external examiners, and international partners, and for marketing, public relations, and recruitment.

Pakistan has not escaped the craze although it was perhaps a bit slower than some other places. Recently, we have seen headlines announcing that ten Pakistani universities are included in the latest Times Higher Education (THE) Asian rankings and highlighting the achievement of Quaid-i-Azam University (QAU) in Islamabad reaching the top 100.

Rankings are unavoidable and sometimes they have beneficial results. The first publication of the research-based Shanghai rankings in 2003, for example, was a salutary shock to continental European universities and a clear demonstration of how far China had to go to catch up with the West in the natural sciences. But rankings do need to be treated with caution especially when ranking metrics are badly and obviously flawed.

THE note that there are now ten Pakistani universities in the Asian rankings and one, QAU, in 79th place, which would appear to be evidence of academic progress.

Unfortunately, Pakistani universities, especially QAU, do very much better in the THE rankings than in others. QAU is in the 401-500 band in the THE world rankings, which use the same indicators as the Asian rankings. But in the QS World University Rankings it is in the 650-700 band. It does not appear at all among the 800 universities ranked in the Shanghai rankings, the 903 in the Leiden Ranking, or the 763 in the Russian Round University Rankings. In the University Ranking by Academic Performance, published in Ankara, it is 605th, and in the Center for World University Rankings list, 870th.

How can we explain QAU’s success in the THE world and Asian rankings, one that is so much greater than any other ranking? It is in large part the result of a flawed methodology.

Take a look at the scores that QAU got in the THE rankings. In all cases the top scoring university gets 100.

For Teaching, combining five indicators, it was 25.7 which is not very good. For international outlook it was 42.1. Since QAU has very few international staff or students this mediocre score is very probably the result of a high score for international collaboration.

For research income from industry it was 31.8. This is probably an estimate since exactly the same score is given for four other Pakistani universities.

Now we come to something very odd. QAU's research score was 1.3. It was the lowest of the 350 universities in the Asian rankings, very much lower than the next worst, Ibaraki University in Japan with 6.6. The research score is composed of research reputation, publications per faculty and research income per faculty. This probably means that QAU's score for research reputation was zero or close to zero.

In contrast, QAU’s score of 81.2 for research impact measured by citations is among the best in Asia. Indeed, in this respect it would appear to be truly world class with a better score than Monash University, the Chinese University of Hong Kong, the University of Bologna or the University of Nottingham.

How is it possible that QAU could be 7th in Asia for research impact but 350th for research?

The answer is that THE's research impact indicator is extremely misleading. It does not simply count citations: it normalises them across more than 300 fields, five years of publication, and up to six years of citation. This means that a few highly cited papers in a strategic discipline at a strategic time can have a disproportionate effect on the impact score, especially if the total number of papers is low.

Added to this is THE's regional modification, under which the citation impact score of a university is divided by the square root of the overall score of the country in which the university is located. The scores of universities in the top scoring country remain the same, but everyone else's goes up, and the lower the country's score, the bigger the increase. The effect is to give a big boost to countries like Pakistan. THE used to apply this bonus to all of the citations indicator but now applies it to only 50%.
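On my reading of THE's published description, the adjustment works roughly as follows. The scores here are invented for illustration:

```python
import math

# Sketch of the regional modification: divide a university's citation score
# by the square root of its country's overall citation score, then blend
# 50/50 with the unmodified score (the post-2015 arrangement). All scores
# are invented and expressed on a 0-1 scale for convenience.
def adjusted_citation_score(university_score, country_score):
    modified = university_score / math.sqrt(country_score)
    return 0.5 * university_score + 0.5 * modified

# A university in a low-scoring country gets a substantial boost...
print(adjusted_citation_score(0.30, 0.25))  # roughly 0.45
# ...while one in the top-scoring country is left unchanged.
print(adjusted_citation_score(0.30, 1.00))  # roughly 0.30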

Then we have to consider how THE deals with mega-papers, mainly in physics and medicine, those with hundreds or even thousands of authors and hundreds or thousands of citations.

Until the world rankings of 2015-16 THE treated every single author of such papers as though he or she were the only author. Then it stopped counting citations to these papers altogether, and then in 2016-17 it began crediting each contributing institution with a minimum of 5% of the citations.

The effect of the citations metric has been to make a mockery of the THE Asian and world rankings. A succession of unlikely places has been propelled to the top of the indicator because of contributions to mega-papers or because of a few or even a single prolific author combined with a low overall number of papers. We have seen Alexandria University, Anglia Ruskin University, Moscow State Engineering Physics Institute, Tokyo Metropolitan University rise to the top of this indicator. In last year’s Asian rankings, Veltech University in India appeared to be first for research impact.

QAU has been involved in the Large Hadron Collider (LHC) project, which produces papers with hundreds or thousands of authors and hundreds or thousands of citations, and has provided authors for several papers. One 2012 paper derived from this project received 4,094 citations, so that QAU would be credited with about 205 citations, 5% of the total, just for this paper.
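Assuming the 5% minimum crediting rule described above applies, the arithmetic behind that figure is simple:

```python
# Each contributing institution on a mega-paper is credited with at least
# 5% of its citations under THE's post-2016 treatment (as described above).
citations = 4094          # citations to the 2012 LHC paper
minimum_share = 0.05
credited = round(citations * minimum_share)
print(credited)  # about 205 citations from a single paper
```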

In addition to this, QAU employs an extremely productive mathematician, Tasawar Hayat, who appears in Clarivate Analytics' list of Highly Cited Researchers, where his primary affiliation is King Abdulaziz University in Saudi Arabia and QAU is his secondary affiliation. Professor Hayat is extremely prolific: in 2017 alone, he was author or co-author of 384 scientific documents, including articles, reviews, and notes.

There is nothing wrong with QAU taking part in the LHC project and I am unable to comment on the quality of his research. It should, however, be understood that if Professor Hayat left QAU or QAU withdrew from the LHC project or THE changed its methodology then QAU could suffer a dramatic fall in the rankings similar to those suffered by some Japanese, Turkish or Korean universities in recent years. This is an achievement built on desperately weak foundations.

It would be very unwise to use these rankings as evidence for the excellence of QAU or any other university.