Monday, September 30, 2013

Struggling (and sometimes cheating) to catch up

Scientific research is maturing in a number of developing nations, which are trying to join North American and European nations as recognized centres of research. As two recent stories show, the pressure to fulfill this vision – and to publish in English-language international journals – has led to some large-scale schemes to commit academic fraud, in addition to cases of run-of-the-mill academic dishonesty.

In China, a widely discussed incident involved criminals with a sideline in producing fake journal articles, and even fake versions of existing medical journals, in which authors could buy slots for their articles. China has been criticized for widespread academic problems for some time; in 2010, for example, the New York Times published a report suggesting that academic fraud (plagiarism, falsification, or fabrication) was rampant in China and would hold the country back in its goal of becoming an important scientific contributor. In the other recent incident, four Brazilian medical journals were caught “citation stacking”: each journal cited the other three excessively, inflating all four impact factors while avoiding the obvious red flag of journal self-citation. These four were among 14 journals that had their impact factors suspended for a year; other possible cases, involving an Italian, a Chinese, and a Swiss journal, were flagged but could not be proven.
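
To see why stacking works, recall roughly how the impact factor is calculated: a journal’s impact factor for a given year is the number of citations that year to the articles it published in the previous two years, divided by the number of citable articles it published in those two years. With numbers invented purely for illustration: a journal that published 200 articles in 2011–2012 and received 300 citations to them in 2013 has a 2013 impact factor of 300/200 = 1.5. If three partner journals each quietly contribute 100 extra citations to those same articles, the impact factor doubles to 600/200 = 3.0 – and because the added citations come from other journals, they escape the self-citation screening that indexing services apply.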

There are some important facts that provide context for these outbreaks of cheating. Both Brazil and China are nations where, to be a successful scientist in the national system, you must prove that you are capable of success on the world stage. This is a tall order in countries where scientific research has not traditionally had an international profile and most researchers do not speak English as their first language. In particular, it leads to a focus on measures of success that are comparable across the globe, such as journal impact factors. In China, there is great pressure to publish in journals included on the Science Citation Index (SCI), a list of leading international journals. When researcher, department, and university success is quantified with impact factors and SCI publications, it becomes a numbers game, a GDP of research. Further, bonuses for publications in high-caliber journals can double a poorly paid researcher’s salary: a 2004 survey found that for nearly half of Chinese researchers, performance-based pay made up more than 50 percent of their income. In Brazil, the government similarly emphasizes publication in Western journals as evidence of researcher quality.

It’s easy to dismiss these problems as specific to China or Brazil, and some aspects of the issue are naturally country-specific. On the other hand, if you peruse Ivan Oransky’s Retraction Watch website, you’ll notice that academic dishonesty leading to article retraction is hardly restricted to researchers from developing countries. At the moment, the four leading countries in retractions due to fraud are the US, Germany, Japan, and then China, suggesting that Western science isn’t free from guilt. But in developing nations the conditions are ripe for fraud: nationalistic ambition is funnelled into pressure on scientists to succeed on the international stage; success is measured disproportionately by metrics of international standing; large monetary rewards go to otherwise poorly paid individuals who achieve those measures; and it is particularly difficult for researchers educated in a less competitive scientific system, who may lack English language skills, to publish in top journals. The benefits of success for these researchers are large, but so are the obstacles. Combine that with measures of success (impact factor, h-index) that are open to gaming and rely essentially on honesty and shared scientific principles, and it is not surprising that the system fails.

Medical research was at the heart of both of these scandals, probably because the stakes (money, prestige) are high. Fortunately (or not) for ecology and evolutionary biology, the financial incentives for fraud are rather smaller, and so organized academic fraud is probably less common. But the ingredients that seem to lead to these problems – national pressure to succeed on the world stage, difficulty in achieving such success, and reliance on susceptible metrics – would threaten any field of science. And issues of language and culture are so rarely considered by English-language science that it can be difficult for scientists from smaller countries to integrate into global academia. There are really two ways for the scientific community to respond to these issues of fraud and dishonesty: either treat these nations as second-class scientific citizens and assume their research may be unreliable, or be available and willing to play an active role in their development. There are a number of ways the latter could happen. For example, some reputable national journals invite submissions from established international researchers to improve the visibility of those journals. In some nations (Portugal, Iceland, the Czech Republic, etc.), international scientists review funding proposals, providing an unbiased, external voice on the quality of the work. Indeed, the most hopeful fact is that top students from many developing nations attend graduate school in Europe and North America, and then return home with the knowledge and connections they gained. Obviously this is not a total solution, but we need to recognize fraud as a problem affecting and interacting with all of academia, rather than solely an issue of a few problem nations.
