The world's most prestigious peer-reviewed journals, such as Cell, Nature, Science, and the Journal of the American Medical Association (JAMA), have less and less influence among scientists, according to a paper co-authored by Vincent Larivière, a professor at the University of Montreal's School of Library and Information Sciences. He questions the relationship between a journal's "impact factor" and the number of citations its papers subsequently receive. "In 1990, 45% of the top 5% most cited articles were published in the top 5% highest impact factor journals. In 2009, this rate was only 36%," Larivière said. "This means that the most cited articles are published less exclusively in high impact factor journals." In other words, the proportion of the most cited articles appearing in the major scholarly journals has declined sharply over the last twenty years. The study was based on a sample of more than 820 million citations and 25 million articles published between 1902 and 2009; the findings were published in the Journal of the American Society for Information Science and Technology.
For each year analysed in the study, Larivière measured the strength of the relationship between the citations an article received in the two years following its publication and the impact factor of the journal that published it. He then compared the proportion of the most cited articles that appeared in the highest impact factor journals. "Using various measures, the goal was to see whether the 'predictive' power of impact factor on citations received by articles has changed over the years," Larivière said. "From 1902 to 1990, major findings were reported in the most prominent journals," Larivière notes. "But this relationship is less true today."
Larivière and his colleagues George Lozano and Yves Gingras of UQAM's Observatoire des sciences et des technologies also found that the decline in the predictive power of high impact factor journals began in the early 1990s, when Internet use grew rapidly within the scientific community. "Digital technology has changed the way researchers are informed about scientific texts. Historically, we all subscribed to paper journals. Periodicals were the main source for articles, and we didn't have to look outside the major journals," Larivière noted. "Since the advent of Google Scholar, for example, the process of searching for information has completely changed. Search engines provide access to all articles, whether or not they are published in prestigious journals."
Impact factor as a measure of a journal's influence was developed in the 1960s by Eugene Garfield, one of the founders of bibliometrics. "It is basically the average number of times a journal's articles are cited over a two-year period," Larivière explained. "Initially, this indicator was used to help libraries decide which journals to subscribe to. But over time, it began to be used to evaluate researchers and determine the value of their publications." The importance of impact factor is so ingrained in academia's collective consciousness that researchers themselves use impact factor to decide which journals they will submit their articles to.
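The two-year calculation Larivière describes can be sketched in a few lines. This is a simplified illustration with made-up numbers, not the official Journal Citation Reports procedure:

```python
# Simplified sketch of the two-year impact factor (hypothetical numbers,
# not real journal data).

def impact_factor(citations_to_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    """Impact factor for year Y = citations received in Y to items
    published in Y-1 and Y-2, divided by the number of citable items
    published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Example: a journal published 200 citable items in 2008-2009,
# and those items were cited 900 times during 2010.
print(impact_factor(900, 200))  # 4.5
```

The averaging over a fixed two-year window is what makes the indicator simple to compute, and also what drives the discipline-dependence criticism discussed below.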
Various experts in bibliometrics have criticized the use of impact factor as a measure of an academic journal's visibility. A common criticism is that the indicator contains a calculation error. "Citations from all types of documents published by a journal are counted," Larivière said, "but they are divided only by the number of articles and research notes. Impact factor is thus overestimated for journals that publish a good deal of editorials, letters to the editor, and science news, such as Science and Nature."
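The asymmetry Larivière points to can be made concrete with hypothetical numbers: the numerator counts citations to all document types, while the denominator counts only "citable" items (articles and research notes), so journals with many editorials and news pieces get a boost:

```python
# Hypothetical numbers illustrating the numerator/denominator asymmetry.
# Citations received in the two-year window, broken down by the type of
# document being cited:
citations = {"articles": 800, "editorials": 60, "letters": 40, "news": 100}

n_citable_items = 200  # articles and research notes (the only denominator)

# Conventional calculation: all citations over citable items only.
conventional_if = sum(citations.values()) / n_citable_items   # 1000 / 200

# Restricting the numerator to citations of citable items themselves.
articles_only_if = citations["articles"] / n_citable_items    # 800 / 200

print(conventional_if, articles_only_if)  # 5.0 4.0
```

Here the conventional figure is 25% higher than the articles-only figure, purely because editorials, letters, and news attract citations without entering the denominator.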
Another criticism is that the time frame in which citations are counted when calculating impact factor is too short. "There are research areas in which knowledge dissemination is faster than it is in others," Larivière said. "We cannot, for example, expect to get the same kind of impact factor in engineering and biomedical sciences." Yet a journal's impact factor is calculated over the two-year period following publication, regardless of the discipline.
The research results reveal some interesting points. On the one hand, journals are increasingly poor predictors of the number of citations an article can expect to receive. "Not only has the predictive power of impact factor declined, but also, impact factor is no longer suitable for evaluating research," Larivière argued. In his opinion, if we want to evaluate researchers and their work, it is best to use citations, which are a true measure of an article's impact. "This indicator is more accurate. It is not an estimation based on the hierarchy of journals." On the other hand, his work confirms that the dynamics of scholarly journals are changing, due especially to the open access to knowledge made possible by the Internet. "What then is the present function of scholarly journals?" Larivière asked. "One remains: peer review."
University of Montreal: http://bit.ly/mNqklw
Today sees the publication of the report of an independent review of the contentious use of metrics — numerical indicators of performance — in the assessment of UK research and researchers. Can it plot a sensible course in a world increasingly obsessed with numbers?
The university’s ruling council will release a statement today, although the scientist will not be reinstated, provost Michael Arthur has said.
Shutting other people up when you’re powerful but frightened isn’t defending academic freedom. It’s repressing it. Two weeks ago, a Nobel laureate made some ill-advised remarks in front of the World Conference of Science Journalism. Whether or not these were intended as a joke is irrelevant at this stage; the remarks were made, people got offended, and the rest is history. Sir Tim Hunt offered his resignation from an honorary position (with no responsibilities and no salary) at University College London, and it was accepted. Again, whether he was pushed or whether he jumped is today of little concern.
In the wake of #Huntgate, here is a handy list of actions that individuals could commit to if they really want to see a change in the working environment coupled with a genuine move towards equality
Until women are given more of a voice and power in traditional organisations, calling out sexism on social media often remains our only recourse
Today I was thrilled to be announced as the recipient of this year’s Royal Academy of Engineering Rooke Award, but I’m concerned that engineering in general is hiding in plain sight
With the announcement of Tim Hunt’s resignation from UCL comes an opportunity to reflect on the women in science who were part of his success
Female scientists take to Twitter to respond to the Nobel laureate’s comments about women crying in labs. Female scientists have taken to Twitter to mock Tim Hunt’s suggestion that science would benefit from “single-sex labs” by posting pictures of themselves at work using the #distractinglysexy hashtag.
The Nobel prizewinner’s ‘trouble with girls’ comments are toxic as well as nonsense – discrimination in science is endemic
Tim Hunt complained that female scientists "cry" and make male colleagues fall in love with them