The Scientist, Vol. 10, No. 17, p. 13, September 2, 1996
It has been five years since I reported on the "journal myth" - that is, the popular perception that researchers are being overwhelmed by an ever-increasing flood of scientific journals (E. Garfield, The Scientist, Sept. 2, 1991, page 11). Experience has taught me that it is essential to repeat a simple message regularly: a small number of journals accounts for the bulk of significant scientific results.
Despite the voluminous literature that supports this point, which need not be cited here, people in the research, library, and information science communities continue to claim that there are 40,000 or more "journals" in existence. The problem is that such claims provide neither qualitative nor quantitative criteria for what counts as a journal. Furthermore, thousands of defunct journals are included in these counts.
These unsubstantiated claims have the unfortunate effect of raising researchers' anxiety about their ability to keep up with an ever-rising tide of literature. They also spawn outrageous extrapolations about the geometric expansion of the literature to a point well beyond anyone's hope of keeping up with it. For example, 15 years ago biomedical researchers were told that more than 2 million articles were being published each year in their field. So if researchers tried to stay current by reading two articles per day, in one year they would fall 55 centuries behind! In other words, if they tried to read everything of possible biomedical relevance, they would have to read 5,500 articles per day.
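The 5,500-articles-per-day figure follows directly from the 2-million-articles claim by simple division, as this brief sketch confirms (the annual total is the figure quoted above, not an independent estimate):

```python
ARTICLES_PER_YEAR = 2_000_000  # biomedical articles per year, per the claim cited above
DAYS_PER_YEAR = 365

# Reading "everything of possible biomedical relevance" would require this many articles per day:
required_per_day = ARTICLES_PER_YEAR / DAYS_PER_YEAR
print(f"{required_per_day:,.0f} articles per day")  # roughly 5,500
```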
But these claims are at best overstated guesstimates and, at worst, specious speculations. Simply put, it is irrelevant how many journals are really out there. Why? Because a surprisingly small number of journals generate the majority of both what is cited and what is published.
It is remarkable that this concentration effect is stable over time. For example, a similar analysis of the 3,200 journals indexed in the 1987 SCI showed virtually identical results: 500 journals accounted for half of what was published and more than 70 percent of what was cited, and 2,000 journals published about 85 percent of all SCI-indexed articles that year and received 95 percent of the citations.
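The concentration effect described above can be measured from any journal-level counts by ranking journals in descending order and accumulating their shares until a target fraction is reached. A minimal sketch with synthetic data (the journal names and counts are invented for illustration, not SCI figures):

```python
# Synthetic journal citation counts (illustrative only, not real SCI data)
counts = {"J1": 9000, "J2": 6000, "J3": 3000, "J4": 1200, "J5": 500, "J6": 300}

def journals_for_share(counts, target):
    """Smallest number of top-ranked journals whose counts reach `target` share of the total."""
    total = sum(counts.values())
    running = 0
    for n, c in enumerate(sorted(counts.values(), reverse=True), start=1):
        running += c
        if running / total >= target:
            return n
    return len(counts)

print(journals_for_share(counts, 0.50))  # 2 of the 6 journals account for half the citations
```

Applied to real SCI counts, the same cumulation produces the concentration figures quoted in the text.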
It should be kept in mind that these cited and source journals are not necessarily the same. However, rankings of the most frequently cited and most productive journals are also stable over time, as is shown in the two lists.
Table 1 shows the top 50 journals in the 1994 SCI in terms of total citations received that year. The two columns on the right show the number of citations each journal received in 1994 and 1989, respectively. These 50 journals received 3.75 million citations in 1994, which represents 33 percent of all references indexed in the SCI that year.
These citations are to any papers the journal published, regardless of the year they appeared in print or the type of paper. Most scientific journals primarily publish original research articles, but they also carry other editorial items: literature reviews, technical notes, editorials, letters, abstracts, book reviews, discussions, and other material. Some journals - the New England Journal of Medicine, JAMA-Journal of the American Medical Association, Lancet, British Medical Journal, and other clinically oriented publications - publish substantial numbers of often well-cited letters. And some - Science, Nature, Cell, and other multidisciplinary journals - include many high-impact review articles.
Journals must be examined with these differences in mind in order to avoid invidious comparisons. Only a differentiated audit of each category of editorial material can distinguish among the unique contributions of various publications to the advancement of science. This point was stressed in my 1986 analysis of medical journals (E. Garfield, Annals of Internal Medicine, 105:313-20). Also, a report emphasizing some of these differences for journals in general appeared last year in the Journal of the American Society for Information Science (H.F. Moed et al., 46:461-7, 1995).
The first column on the left of Table 1 shows the 1994 ranking of each journal, while the second column shows their 1989 rankings. They demonstrate that the top 50 cited journals in both years are essentially the same: 48 journals appear on both rankings. The only differences are that Molecular and Cellular Biology and the Journal of Virology (which ranked 54th and 59th, respectively, in 1989) moved into the top 50 for 1994. They replaced the Annals of Internal Medicine, which ranked 51st in 1994, and Physical Review; the latter was displaced because it split into five sections years ago.
A comment on the Journal of Biological Chemistry, the top-ranked journal on the list, is needed. Of its 265,000 citations in 1994, about 7,000, or 3 percent, were to Oliver Lowry's classic 1951 protein determination paper (O.H. Lowry et al., J. Biol. Chem., 193:265-75). Lowry recently passed away, and a New York Times obituary stated that this paper "became one of the most frequently cited studies in the scientific literature" (W. Saxon, July 4, 1996, page B8). Even by the Times' typically conservative standards, this is an incredible understatement. In fact, Lowry's paper is the most-cited of all time, with more than 245,000 explicit citations to date, as noted in a recent tribute in The Scientist (Notebook, Aug. 19, 1996, page 30).
Of course, a journal's overall citedness may be related to its current and cumulated productivity. To examine this relationship, Table 2 shows the top 50 journals in the 1994 SCI in terms of the total number of "source items" they published that year. Source items include original research articles, literature reviews, and technical notes. Letters to the editor, editorials, book reviews, and other such items are not included in the data shown here.
These 50 journals published more than 84,000 1994 source items, which represents about 15 percent of all SCI-indexed items that year. Of these most productive journals, 28 also appear among the 50 most-cited publications in Table 1. Thus, the sheer size of a journal does not appear to be directly correlated with citation frequency. However, this correlation might become stronger as we extend these lists to the top 500 or 1,000 by source items and citations.
The first column at the left shows the 1994 ranking of each journal, while the second column shows their 1989 rankings. Thirty-nine journals ranked among the top 50 in both 1994 and 1989. The newcomers to the 1994 top 50 list are the Journal of Geophysical Research (ranked 13th in 1994 and 165th in 1989), Physica B (14th in 1994, 54th in 1989), Physical Review E (27th in 1994, no ranking in 1989 because it was established in 1993), Journal of Chromatography A (35th in 1994, no ranking in 1989 because the parent Journal of Chromatography split into subsections in 1993), Macromolecules (37th in 1994, 53rd in 1989), Tetrahedron (39th in 1994, 81st in 1989), Applied Optics (40th in 1994, 58th in 1989), Surface Science (41st in 1994, 69th in 1989), Journal of Physics-Condensed Matter (44th in 1994, no ranking in 1989 because it was formed that year by a merger of other journals), Nuclear Physics B (46th in 1994, 101st in 1989), and Journal of Virology (49th in 1994, 64th in 1989).
These newcomers replaced the following journals, which ranked among the top 50 in 1989 (their 1994 rankings in parentheses): Nucleic Acids Research (61st), Nature (57th), Solid State Communications (81st), Journal of Organometallic Chemistry (80th), Physics Letters A (52nd), American Journal of Cardiology (113th), ACS Symposium Series (58th), Cancer (64th), Endocrinology (91st), and Phytochemistry (59th).
The data presented here demonstrate that the scientific literature, however large it is or is claimed to be, is manageable. Whether there are 40,000 or 10,000 research journals extant, just a small fraction of "core" journals accounts for the majority of what is cited. This trend in the general scientific literature also holds true for specialty and subspecialty literature. Thus, while it is impossible to read everything of possible relevance published on most topics, the diligent and resourceful researcher can readily follow the comparatively small core of significant literature. This core literature, and then some, may be identified through a variety of information tools that serve as filters to the flood of literature.
In subsequent articles on journal literature, we will focus on overall and specialty-specific journal rankings by current impact factors. These factors represent the average citation frequency of a journal's articles over a recent two-year period. However, because fields differ in the rates at which knowledge accumulates, we will also present five-, 10-, or even 25-year cumulative impact calculations. In addition, we will continue to present citation rankings and journal concentration graphs for a variety of specialty and subspecialty journals.
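The two-year impact factor mentioned above is conventionally computed as citations received in year Y to items published in years Y-1 and Y-2, divided by the number of citable items published in those two years. A short sketch of that calculation (the journal and its counts are invented for illustration):

```python
def impact_factor(citations_by_pub_year, items_by_pub_year, year):
    """Two-year impact factor for `year`: citations received in `year` to the
    previous two years' items, divided by citable items published in those years."""
    cites = sum(citations_by_pub_year[y] for y in (year - 1, year - 2))
    items = sum(items_by_pub_year[y] for y in (year - 1, year - 2))
    return cites / items

# Invented counts for a hypothetical journal, evaluated for 1994:
cites = {1992: 1800, 1993: 2400}  # 1994 citations to papers published in each year
items = {1992: 300, 1993: 400}    # citable items published in each year
print(round(impact_factor(cites, items, 1994), 2))  # (1800 + 2400) / (300 + 400) = 6.0
```

Note that, as discussed earlier, what counts as a "citable item" (articles, reviews, and notes, but not letters or editorials) materially affects the result.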
© 1996, The Scientist, Inc. All rights reserved.
We welcome your opinion. If you would like to comment on this article, please write us at mailto:email@example.com?Subject=The Scientist, Sep. 02, 1996, The Significant Scientific Literature Appears In A Small Core Of Journals