
Re: WSJ on impact factor



Joe,

Your point about page views is very interesting and well stated below. However, how would page views give an objective assessment? Unscrupulous researchers, or their friends, might artificially inflate page counts through repeated accesses.

I think citation counts still remain a good way to measure the impact of research. However, given the potential abuses of impact factors, I'm inclined to think that professional associations should play a significant role in interpreting for their members the appropriate uses of such factors. I'm also inclined to think that professional associations should work on creating discipline-specific metrics (including impact factors) for their own disciplines, based on data within the databases for those disciplines (e.g., MathSciNet or CAS).

This is not to say that ISI data should not also be used, just that a clearer or more comprehensive picture can be obtained by supplementing it with metrics derived from other databases. And then there is the question of how Google Scholar and other publicly accessible citation-count sources figure in here.

Brian Simboli


Joseph J. Esposito wrote:

No doubt many of the members of this list will already have seen the
article in today's Wall St. Journal on "gaming" the impact factor for
science journals.  As the WSJ site requires a subscription, the link
is useless, though presumably many on this list have access through
institutional subscriptions.  The byline is Sharon Begley; the
headline is:  Science Journals Artfully Try To Boost Their Rankings.
It is dated June 5 and appears on page B1 of the hardcopy edition (so
the online citation says).

The gist of the article is that some journals are trying to increase
their citation count in somewhat devious ways, thus improving their
impact factor as measured by ISI. I doubt any of this comes as a
surprise to anyone but a journalist, who, like FEMA, always gets to the
action ten years too late.
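
To be concrete about why padding citations works: the standard
two-year impact factor is just a ratio of citations to citable
items.  My paraphrase of the definition (the details of what ISI
counts as a "citable" item are murkier than the formula suggests):

    \mathrm{IF}_Y = \frac{C_Y}{N_{Y-1} + N_{Y-2}}

where C_Y is the number of citations received in year Y by items
the journal published in years Y-1 and Y-2, and N_{Y-1}, N_{Y-2}
are the counts of citable items the journal published in those two
years.  Every extra citation harvested in year Y, however it was
obtained, goes straight into the numerator.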

What should be clear, however, is that impact factors and ISI's
unofficial role as umpire for the academy are coming under heavy
challenge and may indeed be bankrupt.  New measurements are needed,
but of what kind?  I am myself biased toward page views, which speak
to readership rather than authorship.  One of the benefits of using
page views is that there is a huge Internet industry in the consumer
sector that has already built the tools for counting and auditing page
views.  I am sure there are other ideas worth considering.
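
The counting also need not be naive about repeated accesses from
the same reader.  As a purely illustrative sketch (the log format,
the reader identifiers, and the 24-hour de-duplication window here
are my own assumptions, not a description of any actual auditing
service), counting a view at most once per reader per article per
day might look something like this:

    from collections import defaultdict
    from datetime import datetime, timedelta

    # Hypothetical access-log entries: (timestamp, requester, article).
    # A real system might identify requesters by IP address, session
    # cookie, or institutional proxy rather than a simple label.
    raw_hits = [
        (datetime(2005, 6, 5, 9, 0), "reader-a", "article-1"),
        (datetime(2005, 6, 5, 9, 1), "reader-a", "article-1"),  # repeat
        (datetime(2005, 6, 6, 9, 0), "reader-a", "article-1"),  # next day
        (datetime(2005, 6, 5, 9, 5), "reader-b", "article-1"),
    ]

    WINDOW = timedelta(hours=24)  # repeats inside this window count once

    def audited_view_counts(hits, window=WINDOW):
        """Count views per article, ignoring repeat accesses by the same
        requester within `window` of that requester's last counted view."""
        last_counted = {}          # (requester, article) -> last counted time
        counts = defaultdict(int)  # article -> audited view count
        for ts, requester, article in sorted(hits):
            key = (requester, article)
            if key not in last_counted or ts - last_counted[key] >= window:
                counts[article] += 1
                last_counted[key] = ts
        return dict(counts)

    print(audited_view_counts(raw_hits))  # {'article-1': 3}

Auditing, on this view, is largely a matter of being able to show
the raw log and the de-duplication rule to a third party.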

And, yes, this has important implications.  Page views put an emphasis
on findability, which means more search engine optimization and
flatter, less hierarchical Web site architectures.  Open Access lends
itself to
findability--indeed, it is OA's principal merit.  Page views militate
against mediating interfaces, whether the portal of a publisher or a
library.

Universities "in-source" many things and "out-source" others; the
logic behind some of these decisions is not always self-evident. What
is truly odd, however, is the outsourcing of the certification process
to publishers, whether commercial or not-for-profit, leaving ISI to
stand behind home plate and call the balls and strikes.  Someone who
wants to transform scholarly communications would start by selecting a
new umpire.

Joe Esposito