Re: citations as indicators of quality
- To: liblicense-l@lists.yale.edu
- Subject: Re: citations as indicators of quality
- From: David Goodman <dgoodman@princeton.edu>
- Date: Tue, 27 Nov 2007 00:00:50 EST
- Reply-to: liblicense-l@lists.yale.edu
- Sender: owner-liblicense-l@lists.yale.edu
No librarian or publisher -- nobody but an uninformed academic bureaucrat -- would ever attempt to compare the quality of journals between different fields, or the work of faculty between different fields, using publication counts or citation metrics, regardless of attempts at normalization. There may be rational, objective methods for distributing resources within individual academic subjects, but the distribution of library, research, or education resources among the different subjects is a political question. It is, for example, reasonable to attempt a rational discussion of which developmental molecular biologists do the best research, or of the relative importance of the different publication media in developmental molecular biology; but deciding the relative importance of researchers in that subject with respect to the other fields of biology -- let alone to mathematics or, even more absurdly, comparative literature -- is not a question for calculation.

But Sandy falls into the fallacy of attributing unimportance to rejected work. The disputes over "The Bell Curve," or over cold fusion, are what drive further inquiry. We progress in all fields of science by scientifically disproving error.

David Goodman, Ph.D., M.L.S.
previously: Bibliographer and Research Librarian
Princeton University Library
dgoodman@princeton.edu

----- Original Message -----
From: Sandy Thatcher <sgt3@psu.edu>
Date: Friday, November 23, 2007 11:22 pm
Subject: Re: citations as indicators of quality
To: liblicense-l@lists.yale.edu

> But the authors of the article I cited raise a very crucial
> point in demonstrating that citation practices differ across
> disciplines and subfields within disciplines. It surely makes
> no sense to rank a journal higher, or keep subscribing to it,
> because scholars in that subfield, like international
> relations, simply cite more than their colleagues in other
> subfields.
> (If this were the main criterion, I suppose law
> journals would always rank highly because they contain massive
> numbers of citations, with many pages having more footnotes
> than text, though of course we all know that they are not
> really peer reviewed, being edited by law students.) This is
> one among several reasons these authors put forward to argue
> for using reputational analysis, too, in order to make up in
> part for the shortcomings of pure citational analysis.
>
> The reductio ad absurdum of citational analysis would be works
> like "The Bell Curve," which received a tremendous amount of
> attention, most of it quite negative, or articles touting "cold
> fusion," an equally controversial topic, or "intelligent
> design." One would surely have to use scare quotes in
> describing any of these kinds of works as having "value."
>
> Sandy Thatcher
> Penn State University Press
>
>> The reason for abandoning the very time-consuming, qualitative
>> text analysis approach was that it never resulted in
>> substantially more information about the value of an article
>> than a straight citation count. Whether positive, negative, or
>> neutral, the citing of another's work reflects a type of
>> intellectual payment on the part of the author. In a
>> communication market that rewards attention, even citing an
>> "execrable paper" (Sandy's example) is an indication that the
>> article is worth some form of attention. Most execrable papers
>> are categorically ignored.
>>
>> Reasons for citing: Weinstock's list (1971)
>> 1. Paying homage to pioneers
>> 2. Giving credit for related work
>> 3. Identifying methodology, equipment, etc.
>> 4. Providing background reading
>> 5. Correcting one's own work
>> 6. Correcting the work of others
>> 7. Criticizing previous work
>> 8. Substantiating claims
>> 9. Alerting researchers to forthcoming work
>> 10. Providing leads to poorly disseminated, poorly indexed, or uncited work
>> 11. Authenticating data and classes of fact -- physical constants, etc.
>> 12. Identifying original publications in which an idea or concept was discussed
>> 13. Identifying the original publication describing an eponymic concept or term, e.g., Hodgkin's disease
>> 14. Disclaiming work or ideas of others
>> 15. Disputing priority claims of others
>>
>> --Phil Davis