Re: citations as indicators of quality
- To: liblicense-l@lists.yale.edu
- Subject: Re: citations as indicators of quality
- From: Sandy Thatcher <sgt3@psu.edu>
- Date: Fri, 23 Nov 2007 22:47:50 EST
- Reply-to: liblicense-l@lists.yale.edu
- Sender: owner-liblicense-l@lists.yale.edu
But the authors of the article I cited raise a crucial point in demonstrating that citation practices differ across disciplines and subfields within disciplines. It surely makes no sense to rank a journal higher, or keep subscribing to it, simply because scholars in that subfield, such as international relations, cite more than their colleagues in other subfields. (If this were the main criterion, I suppose law journals would always rank highly because they contain massive numbers of citations, with many pages having more footnotes than text, though of course we all know that they are not really peer reviewed, being edited by law students.) This is one among several reasons these authors put forward for using reputational analysis as well, to make up in part for the shortcomings of pure citational analysis.
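To put the discipline problem in concrete terms, here is a minimal sketch in Python (the journals, citation rates, and field baselines are all invented, and the article I cited does not propose this particular calculation) of how a raw citation count compares with one normalized to the citing field's average:

# Illustrative only: journals, counts, and field baselines are invented
# to show why raw citation counts mislead when compared across subfields.

field_baseline = {
    "international relations": 40.0,  # hypothetical avg. citations per article
    "comparative politics": 12.0,
}

journals = [
    ("IR Journal A", "international relations", 50.0),
    ("Comparative Journal B", "comparative politics", 18.0),
]

for name, field, cites_per_article in journals:
    normalized = cites_per_article / field_baseline[field]
    print(f"{name}: raw = {cites_per_article:.0f}, normalized = {normalized:.2f}")

# Raw counts put the IR journal far ahead (50 vs. 18), but relative to its own
# subfield's citation habits it scores 1.25, below the other journal's 1.50.

The point is not these particular numbers but that any ranking built on raw counts rewards the citation habits of a subfield rather than the quality of its journals.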
The reductio ad absurdum of citational analysis would be works like "The Bell Curve," which received a tremendous amount of attention, most of it quite negative, or articles touting "cold fusion," an equally controversial topic, or "intelligent design." One would surely have to use scare quotes in describing any of these kinds of works as having "value."
Sandy Thatcher
Penn State University Press
The reason for abandoning the very time-consuming, qualitative text analysis approach was that it never resulted in substantially more information about the value of an article than a straight citation count. Whether positive, negative, or neutral, the citing of another's work reflects a type of intellectual payment on the part of the author. In a communication market that rewards attention, even citing an "execrable paper" (Sandy's example) is an indication that the article is worth some form of attention. Most execrable papers are categorically ignored.
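To make the contrast concrete, here is a toy sketch in Python (the citation records, tone labels, and weights are all invented; this is not the scheme the abandoned qualitative studies actually used) of a straight citation count set next to a qualitatively weighted one:

# Toy comparison only: records, tone labels, and weights are invented.

citations = [
    {"citing_paper": "P1", "tone": "positive"},
    {"citing_paper": "P2", "tone": "neutral"},
    {"citing_paper": "P3", "tone": "negative"},
    {"citing_paper": "P4", "tone": "negative"},
]

# Straight count: every citation is "intellectual payment", whatever its tone.
straight_count = len(citations)

# A hypothetical qualitative weighting (the weights are arbitrary).
weights = {"positive": 1.0, "neutral": 0.5, "negative": 0.25}
weighted_count = sum(weights[c["tone"]] for c in citations)

print("straight count:", straight_count)             # 4
print("weighted count:", round(weighted_count, 2))   # 2.0

Whether the extra effort of assigning tone labels tells you anything a straight count does not was exactly the question those earlier studies answered in the negative.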
Reasons for citing: Weinstock's list (1971)
1. Paying homage to pioneers
2. Giving credit for related work
3. Identifying methodology, equipment etc.
4. Providing background reading
5. Correcting one's own work
6. Correcting the work of others
7. Criticizing previous work
8. Substantiating claims
9. Alerting researchers to forthcoming work
10. Providing leads to poorly disseminated, poorly indexed, or uncited work
11. Authenticating data and classes of fact - physical constants, etc.
12. Identifying original publications in which an idea or concept was discussed
13. Identifying the original publication describing an eponymic concept or term, e.g., Hodgkin's disease
14. Disclaiming work or ideas of others
15. Disputing priority claims of others
--Phil Davis
B.G. Sloan wrote:
Sandy Thatcher said: "It begins by noting one fundamental flaw in any citation analysis by quoting another author thus: 'if [Journal X] published an execrable paper that attracted a million critical citations as an example of appalling practice, all other papers previously and later published in that journal would suddenly be much more highly ranked.'"

This reminds me of something I asked about a couple of years ago in another forum. Most of the citation analysis studies I see nowadays involve quantitative analyses. Just wondering if many people are into studying citations from a qualitative standpoint? In a lot of studies a citation is a citation is a citation, with little concern for how a given paper was cited qualitatively within the context of the citing paper. For example, an author could cite a paper very positively, the citation could be pretty much value-neutral, or, as Sandy notes, the citation could be negative. But in a quantitative analysis these various types of citations pretty much all carry the same weight.

When I looked into this several years ago, a number of people alerted me to some qualitative citation studies. The interesting thing is that most of these studies were maybe 20 years old, at least. It almost seemed like people got away from doing qualitative citation analyses as it got easier to do quantitative analyses, i.e., as databases such as the ISI indices became available in electronic form.

Anyway, I am interested in hearing about relatively recent qualitative citation analysis.

Thanks,
Bernie Sloan