Scientists Should Understand the Limitations as Well as the Virtues of Citation Analysis

AUTHOR: Eugene Garfield
DATE: July 12, 1993

(The Scientist, Vol:7, #14, p.12, July 12, 1993. Copyright, The Scientist, Inc.)
Reprinted in Essays of an Information Scientist: Of Nobel Class, Women in Science, Citation Classics, and Other Essays, Vol:15, p.423, 1992-93

Two letters challenging the function and value of citation analysis arrived in my office recently, and I believe they merit a response for all of The Scientist's readers to ponder. Monitoring the scientific literature and developing statistics on the extent to which articles are referenced by subsequent authors have been consuming interests of mine for more than 30 years. I have consistently maintained that it is both intellectually worthwhile and professionally beneficial for scientists to understand the limitations as well as the virtues of the field of scientometrics.

One of the letters echoes a surprisingly widespread notion that citation analysis encourages mediocrity, or "consensus science." The writer asks: "If success is measured by citation analysis, won't scientists shy away from original and venturesome fields that don't produce strong citation records?" That is, will scientists abandon an exciting, potentially ground-breaking project or area of study that, when reported on, is unlikely to gain immediate attention from other researchers?

For responsible, intellectually honest scientists, the notion of statistics functioning as a determinant, rather than a reflection, of behavior is absurd. Of course, a few dubiously motivated researchers, hoping to grab the attention of department heads or potential employers in any way they can, will seek out any device that may enhance their name recognition and beef up their c.v.'s. But this is clearly an abuse of citation analysis. Blaming a measurement method for its abuses is like blaming a badly used slide rule for the collapse of the Tacoma Narrows Bridge.

The truth is, citation analysis, like peer review, can deliver a variety of messages--good, bad, or indifferent. It's a valuable indicator; it's one way of revealing where the action is in science; it's not a directive to the research community to drop what it's doing and suddenly switch focus.

It should be noted that, over the years, Current Contents has identified through citation frequency analysis thousands of published articles that attracted a relatively high number of references--works we call "Citation Classics." However, along with the many strikingly "original and venturesome" articles that drew immediate attention were hundreds that the science establishment at first refused to acknowledge. They weren't on the "hit" charts at all. Indeed, leading journals such as Nature, Science, and the New England Journal of Medicine rejected many manuscripts that ultimately were published elsewhere. Over time, they went on to be highly cited and very influential. Citation analysis is a messenger. It doesn't "prescribe" anything; it merely "describes." If thousands of scientists around the world suddenly start to "shy away" from inspired, authentically driven research--don't blame this messenger!

The second communication, from an academic scientist in Arizona, urged me to "stop writing such nonsense." He was referring to an article on page 14 of the April 19, 1993, issue of The Scientist, which listed the most cited--"hot," in our terminology--papers of 1992. He was particularly aggrieved that most of the papers had been published early in 1992, thus giving them more time than others to be referenced in subsequent works. Well, that's true--but we make note of that in our text. Anyone who has studied the Institute for Scientific Information's data over the years knows enough to make allowances for this obvious qualification.
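By way of illustration only: one crude allowance a reader might make is to divide each paper's raw citation count by the number of months it has been in print. The short Python sketch below does exactly that; the paper names and citation counts are invented for the example, and the normalization is a hypothetical reader's shortcut, not ISI's method.

    from datetime import date

    # Hypothetical records: (paper, publication date, citations by end of 1992).
    # The figures are invented for illustration; they are not ISI data.
    papers = [
        ("paper_a", date(1992, 1, 15), 120),
        ("paper_b", date(1992, 6, 1), 70),
        ("paper_c", date(1992, 11, 20), 15),
    ]

    CENSUS_DATE = date(1993, 1, 1)  # end of the counting window

    def months_in_print(published, census=CENSUS_DATE):
        # Whole months between publication and the census date (at least 1).
        months = (census.year - published.year) * 12 + (census.month - published.month)
        return max(months, 1)

    for name, published, cites in papers:
        rate = cites / months_in_print(published)
        print(f"{name}: {cites} citations, {rate:.1f} per month in print")

Under this rough adjustment, the January and June papers show identical monthly rates despite very different raw totals, which is precisely the sort of allowance an experienced reader of our lists makes informally.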

On the other hand, we've learned from experience that a high percentage of papers in the annual citation "hit parade" go on to be cited in an extraordinarily large number of publications over the long run.

Again, our effort is to present a useful and interesting indicator of trends and achievement--and what we do present is incontrovertibly true. We present the facts, the numbers--the "stats," if you will. What others do with those facts is a function of their intellectual curiosity, their wit, their "human interest" in the activity of the international scientific community, and their fascination with what is transpiring both in their own disciplines and in ones that may be far afield.

To my two correspondents: Citation analysis provides a perspective for examining scientific activity; only if it is misunderstood and misused could it possibly discourage scientific inventiveness or justify its characterization as "nonsense."

