Impact Factor
The content of this page has not been updated for 2 years. It may no longer be up to date.
BiblioMap
Definitions
The journal Impact Factor is a certain type of mean citation rate, namely a
synchronous one based on citations received in year y by papers published in the
2 previous years, i.e., y-1 and y-2.
By Stefanie Haustein and Vincent Larivière in the book Incentives and Performance (2015), in the text The Use of Bibliometrics for Assessing Research

A measure which has become particularly popular is the so-called “Impact
Factor” (Alberts 2013). Nowadays this factor is commonly used in order to assess
Factor” (Alberts 2013). Nowadays this factor is commonly used in order to assess
the “quality” of a journal. The Impact Factor of a particular journal is a quotient
where the numerator represents the number of citations of articles published in that
particular journal during previous years (mostly over the last 2 years) in a series of
selected journals in a given year. The denominator represents the total number of
articles published in that journal within the same period of time. For example, if a
journal has an Impact Factor of 1.5 in 2013, this tells us that papers published in this
journal in 2011 and 2012 were cited 1.5 times on average in the selected journals
in 2013.
By Mathias Binswanger in the book Incentives and Performance, in the text How Nonsense Became Excellence (2015), on page 26
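As a compact restatement of the definitions quoted above (the notation is mine, not taken from the quoted texts), the two-year Impact Factor of a journal in census year $y$ can be written as

$$\mathrm{IF}_y = \frac{C_y(y-1) + C_y(y-2)}{P_{y-1} + P_{y-2}}$$

where $C_y(t)$ is the number of citations received in year $y$, within the set of selected source journals, by items the journal published in year $t$, and $P_t$ is the number of items it published in year $t$. In Binswanger's example, items published in 2011 and 2012 attracted on average 1.5 citations each during 2013, giving $\mathrm{IF}_{2013} = 1.5$.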
A measure that the number fetishists are particularly fond of is the so-called Impact Factor, which today is used on a grand scale to calculate the “quality” of journals. The Impact Factor of a particular journal is a quotient whose numerator gives the number of citations in a series of selected journals that, in a given year, went to articles published in that journal over a certain period (usually the last two years). The denominator is the total number of articles published in the journal within the same period. If a journal's Impact Factor in 2010 is, for example, 1.5, this means that an article that appeared in this journal in 2008 and 2009 was cited on average 1.5 times in 2010.

By Mathias Binswanger in the book Sinnlose Wettbewerbe (2010), in the text Beispiel Wissenschaft

Remarks
The problem with using the journal Impact Factors as an expected citation
rate is that due to the underlying skewed distributions, it is neither a predictor nor
good representative of actual document citations (Seglen 1997a; Moed 2002).
By Stefanie Haustein and Vincent Larivière in the book Incentives and Performance (2015), in the text The Use of Bibliometrics for Assessing Research
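A small invented example (the figures are mine, not from the quoted sources) of why the mean is a poor stand-in for individual papers when the underlying distribution is skewed: if nine of a journal's ten papers in the citation window receive no citations while one receives 15, then

$$\mathrm{IF} = \frac{9 \cdot 0 + 1 \cdot 15}{10} = 1.5$$

although the typical (median) paper in that journal is never cited at all.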
The comparison of numbers of publications, citations and impact factors not only between disciplines but also
between sub-disciplines does not make sense (Bornmann et al. 2008; Kieser 2012).
Therefore, using citations or citation-based rankings as indicators for scholarly
performance is highly problematic.
By Margit Osterloh and Alfred Kieser in the book Incentives and Performance (2015), in the text Double-Blind Peer Review

Due to its importance, the Impact Factor is probably the most misused and manipulated indicator. There are several ways in which journal editors “optimize” the Impact Factor of their periodicals, a phenomenon referred to as the ‘numbers game’ (Rogers 2002), ‘Impact Factor game’ (The PLoS Medicine Editors 2006) or even ‘Impact Factor wars’ (Favaloro 2008).
By Stefanie Haustein and Vincent Larivière in the book Incentives and Performance (2015), in the text The Use of Bibliometrics for Assessing Research

In addition to being a mean citation rate, the Impact Factor has other limitations
and shortcomings. It includes articles, reviews and notes as publication types while
citations to all document types are considered, leading to an asymmetry between
numerator and denominator (Moed and van Leeuwen 1995; Archambault and
Larivière 2009). This asymmetry has led journal editors to “optimize” their
journals’ publication behavior (see Sect. 4).
By Stefanie Haustein and Vincent Larivière in the book Incentives and Performance (2015), in the text The Use of Bibliometrics for Assessing Research
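A minimal sketch, in Python, of the numerator/denominator asymmetry described above; the data structures, numbers and the set of “citable” document types are simplified assumptions for illustration, not Thomson Scientific's actual rules:

from dataclasses import dataclass

@dataclass
class Item:
    doc_type: str   # e.g. "article", "review", "note", "editorial"
    citations: int  # citations received in the census year y

# Simplified rule: only these types count as citable items in the denominator,
# while the numerator counts citations to everything the journal published.
CITABLE_TYPES = {"article", "review", "note"}

def two_year_impact_factor(items):
    """Quotient over items published in years y-1 and y-2 (hypothetical data)."""
    numerator = sum(item.citations for item in items)  # citations to all document types
    denominator = sum(1 for item in items if item.doc_type in CITABLE_TYPES)
    return numerator / denominator

# Ten articles cited twice each plus five editorials cited three times each:
items = [Item("article", 2) for _ in range(10)] + [Item("editorial", 3) for _ in range(5)]
print(two_year_impact_factor(items))  # 3.5 rather than 2.0 for the articles alone

Counting the editorials in the denominator, or matching citations to individual citable documents as the quoted chapter suggests, would remove this inflation.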
The relative importance of a scientific journal, measured by how frequently that journal is cited elsewhere, does not imply that the articles published in it correspond to this average. Adler, Ewing & Taylor (2008:14) accordingly conclude, in a report for the International Mathematical Union, that using impact factors can lead to large error probabilities and is “breathtakingly naive”.

By Margit Osterloh and Bruno S. Frey in the text Anreize im Wissenschaftssystem (2008)

Second, using impact factor as a proxy for the quality of an article published in a
journal is very common, but leads to large error probabilities (Adler et al. 2008).
The “extreme variability in article citedness permits the vast majority of articles—
and journals themselves—to free-ride on a small number of highly cited articles”
(Baum 2011, p. 449). Many top quality articles are published in non-top journals,
and many articles in top journals generate very few citations (Campbell 2008;
Kriegeskorte 2012; Laband and Tollison 2003; Oswald 2007; Singh et al. 2007;
Starbuck 2005).
By Margit Osterloh and Alfred Kieser in the book Incentives and Performance (2015), in the text Double-Blind Peer Review

The Impact Factors used in science today are calculated annually by the American
company Thomson Scientific and get published in the Journal Citation
Reports. Thomson Scientific in fact became a monopolist in the calculation of
impact factors, although the exact method of calculation is not revealed, which has
been criticized repeatedly (see, e.g., Rossner et al. 2007). “Scientists have allowed
Thomson Scientific to dominate them” (Winiwarter and Luhmann 2009, p. 1). This monopoly enables Thomson Scientific to sell its almost secretly fabricated Impact
Factors to academic institutions at a high price, although in many sciences less than
50 % of today’s existing scientific journals are included in the calculation
(Winiwarter and Luhmann 2009, p. 1).
By Mathias Binswanger in the book Incentives and Performance, in the text How Nonsense Became Excellence (2015), on page 26

Another shortcoming of the journal
Impact Factor is its short citation window, which goes back to convenience and
cost-efficiency decisions made in the early days of the SCI (Martyn and Gilchrist
1968; Garfield 1972). Garfield (1972) found that the majority of citations are
received within the first 2 years after publication. For some disciplines 2 years
are not long enough to attract a significant number of citations, thus leading to large
distortions (Moed 2005). Since its 2007 edition, the Journal Citation Report (JCR)
includes a 5-year Impact Factor but the 2-year version remains the standard. The
asymmetry between numerator and denominator, which was caused by computational
limitations in the 1960s and could easily be solved by document-based
citation matching, however, still exists.
By Stefanie Haustein and Vincent Larivière in the book Incentives and Performance (2015), in the text The Use of Bibliometrics for Assessing Research
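By analogy with the two-year definition sketched earlier (again my notation, not the JCR's published formula), the five-year variant mentioned above extends both sums to the five preceding publication years:

$$\mathrm{IF}^{(5)}_y = \frac{\sum_{t=y-5}^{y-1} C_y(t)}{\sum_{t=y-5}^{y-1} P_t}$$

which gives disciplines in which citations accumulate slowly more time to be counted.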
Central to the introduction of bibliometrics in research evaluation was the creation of the Science Citation
Index (SCI) in the 1960s, a citation database initially developed for the retrieval of
scientific information. Embedded in this database was the Impact Factor, first used
as a tool for the selection of journals to cover in the SCI, which then became a
synonym for journal quality and academic prestige. Over the last 10 years, this
indicator became powerful enough to influence researchers’ publication patterns in
so far as it became one of the most important criteria to select a publication venue.
Regardless of its many flaws as a journal metric and its inadequacy as a predictor of
citations on the paper level, it became the go-to indicator of research quality and
was used and misused by authors, editors, publishers and research policy makers
alike.
By Stefanie Haustein and Vincent Larivière in the book Incentives and Performance (2015), in the text The Use of Bibliometrics for Assessing Research

Related objects
Related terms (co-word occurrence) | Salamipublikationen / least publishable unit (0.09), Peer review Prozess (wissenschaftlich) (0.06), Bibliometrie / bibliometry (0.05), Aufgeblasene Autorenlisten / Honorary Authorship (0.05), Publikationsdruck / Publish or Perish! (0.05), h-index (0.04), Eigenplagiat / Self-Plagiarism (0.04), Drittmittel (0.04) |
Frequently co-cited persons
Statistical concept network
Citation graph
Timeline
17 mentions
- «Wir brauchen Daten, noch mehr Daten, bessere Daten» (Sigrid Hartong)
- Anreize im Wissenschaftssystem (Margit Osterloh, Bruno S. Frey) (2008)
- Sinnlose Wettbewerbe - Warum wir immer mehr Unsinn produzieren (Mathias Binswanger) (2010)
- 7. Beispiel Wissenschaft - Immer mehr unsinnige Publikationen
- Fragwürdige Selbstplagiate - Mehrfachpublikationen sind kein Kavaliersdelikt (Christian Speicher) (2011)
- Empfehlungen zur Bewertung und Steuerung von Forschungsleistung (Deutscher Wissenschaftsrat) (2011)
- Too Big to Know - Das Wissen neu denken, denn Fakten sind keine Fakten mehr, die Experten sitzen überall und die schlaueste Person im Raum ist der Raum (David Weinberger) (2012)
- 7. Zu viel Wissenschaft
- Internet - Segen oder Fluch (Kathrin Passig, Sascha Lobo) (2012)
- 15. Mark Zuckerbergs Brille - Filter und Empfehlungen
- Incentives and Performance - Governance of Research Organizations (Isabell M. Welpe, Jutta Wollersheim, Stefanie Ringelhan, Margit Osterloh) (2015)
- 2. How Nonsense Became Excellence - Forcing Professors to Publish (Mathias Binswanger) (2015)
- 8. The Use of Bibliometrics for Assessing Research - Possibilities, Limitations and Adverse Effects (Stefanie Haustein, Vincent Larivière)
- 19. Double-Blind Peer Review - How to Slaughter a Sacred Cow (Margit Osterloh, Alfred Kieser)
- Der Aufstieg der Empirischen Bildungsforschung - Ein Beitrag zur institutionalistischen Wissenschaftssoziologie (2015)
- Zur Sache des Buches (Michael Hagner) (2015)
- 2. Alles umsonst? - Open Access
- #Open_Access - Wie der akademische Kapitalismus die Wissenschaften verändert (Michael Hagner) (2016)
- Horizonte Nr. 111 (2016)
- Eine Flut von akademischem Spam (Edwin Cartlidge)
- Das metrische Wir - Über die Quantifizierung des Sozialen (Steffen Mau) (2017)
- 8. Risiken und Nebenwirkungen
- 9. Transparenz und Disziplinierung
- Horizonte 131 - Publizieren im Umbruch (2021)
- Junge zwischen Ideal und Wirklichkeit (Santina Russo) (2021)