The impact factor is a measure of the importance or rank of a journal. It is calculated by dividing the number of citations received in a given year by the articles the journal published in the previous two years by the number of articles published in those two years.
| Example calculation | Value |
|---|---|
| Citations received in 2018 for all articles published in 2016 and 2017 | 120 |
| Number of articles published in 2016 and 2017 | 50 |
| Impact factor for 2018 (published summer 2019) | 120/50 = 2.4 |
Thus, the articles published in this journal in 2016 and 2017 were cited in 2018, on average, 2.4 times.
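The calculation above can be sketched in a few lines of Python, using the example figures from the table (illustrative numbers, not real data for any journal):

```python
def impact_factor(citations_in_year: int, articles_prev_two_years: int) -> float:
    """Citations received in year Y to articles published in years Y-1 and Y-2,
    divided by the number of articles published in Y-1 and Y-2."""
    return citations_in_year / articles_prev_two_years

# 2018 impact factor for the example journal: 120 citations, 50 articles.
print(impact_factor(120, 50))  # 2.4
```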
It is generally true that journals with a high impact factor have greater visibility. Analyzing this impact factor can help authors decide in which journal to publish (although this should not be their only criterion). This measure was initially developed to help librarians choose which journals to subscribe to and not to determine the value of the contents published in these journals.
Some limitations of the impact factor:
The article The use and misuse of journal metrics presents appropriate and inappropriate uses of the impact factor.
To date, the Journal Citation Reports (JCR), developed by Clarivate Analytics (formerly Thomson Reuters), is the only resource for finding the impact factor of over 8,000 scientific journals. The scientific journals listed in JCR are those indexed in Web of Science.
Impact factors for a given year are usually published in the summer of the following year. For example, the 2019 impact factors will be available in the summer of 2020.
It is important to know that there is a North American bias in the selection of journals by Web of Science, which covers mostly English-language journals. A journal that is not indexed in this database, and therefore has no impact factor, is not necessarily a poor journal.
JCR also provides other information on journals, such as their impact factor over five years, the total number of citations, the Eigenfactor score, etc.
You can search by journal title or by subject. For more information, consult this tutorial from Clarivate.
SNIP measures the average citation impact of the publications of a journal, based on the citation data from Scopus in the last three years. This indicator takes into account the different citation practices in different disciplines. This allows better comparison of journals in different fields.
SNIP was developed by Leiden University's CWTS (Centre for Science and Technology Studies). The indicator is freely available for more than 20,000 journals on the CWTS website.
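The field-normalization idea behind SNIP can be illustrated with a simplified sketch: a journal's raw citations per paper are divided by the citation potential of its field, so journals in low-citation disciplines are not penalized. The formula and figures below are illustrative only; CWTS defines the official calculation.

```python
def snip_like(raw_impact_per_paper: float, field_citation_potential: float) -> float:
    """Simplified field normalization: raw impact divided by how heavily
    the journal's field tends to cite (hypothetical inputs)."""
    return raw_impact_per_paper / field_citation_potential

# Two journals with the same raw impact, in fields with different
# citation habits, receive different normalized scores:
math_journal = snip_like(2.0, field_citation_potential=1.0)  # 2.0
bio_journal = snip_like(2.0, field_citation_potential=4.0)   # 0.5
```

This is why SNIP allows fairer comparison across disciplines than a raw citation average.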
The Eigenfactor score ranks the importance of a journal by assessing its entire network of citations. This indicator is calculated using an algorithm similar to the one used by Google.
Eigenfactor.org also presents citation networks and information on the method used to calculate it, as well as science mapping projects.
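The PageRank-style idea behind such network-based scores can be sketched as a power iteration over a citation matrix: a journal ranks higher when it is cited by journals that themselves rank highly. The real Eigenfactor method differs in important details (five-year citation window, removal of self-citations, article-count weighting); this is only the core iteration, on a made-up three-journal matrix.

```python
def citation_rank(matrix, damping=0.85, iterations=100):
    """Toy PageRank-style ranking. matrix[i][j] = citations from journal j
    to journal i (hypothetical counts)."""
    n = len(matrix)
    # Column-normalize: share of journal j's outgoing citations going to i.
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    p = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    rank = [1.0 / n] * n
    for _ in range(iterations):
        rank = [(1 - damping) / n
                + damping * sum(p[i][j] * rank[j] for j in range(n))
                for i in range(n)]
    return rank

# Rows = cited journal, columns = citing journal (invented numbers).
citations = [
    [0, 4, 2],
    [3, 0, 1],
    [1, 2, 0],
]
print(citation_rank(citations))
```

The scores sum to 1 and reward journals cited by other well-cited journals, rather than simply counting citations.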
The SCImago Journal and Country Rank (SJR) provides an indicator showing the visibility of the journals indexed in the Scopus database. This indicator was developed by SCImago based on the Google PageRank algorithm. SJR also provides bibliometric data (number of citations, self-citations, citations per document, and h-index) for more than 28,000 journals and more than 230 countries.
You can search by the exact title of a journal or by subject.
Access to SJR is free, even without a Scopus subscription.
David Pendlebury, from Thomson Reuters, presents the impact factor.
| Journal | Impact Factor (2019) | SNIP (2019) | SCImago Journal Rank (2019) |
|---|---|---|---|
| European Journal of Operational Research | 4.2 | 2.9 | 2.4 |
| Applied Physics Letters | 3.6 | 1.3 | 1.3 |
| Journal of Biomechanics | 2.3 | 1.4 | 1.0 |
A tutorial created by Clarivate Analytics on using Journal Citation Reports.
The San Francisco Declaration on Research Assessment (DORA) was written by a group of researchers and scholarly journal publishers in 2012.
DORA puts forward several recommendations to improve the way in which research quality is evaluated, in particular by eliminating the use of journal-based indicators, like the impact factor, to evaluate individual researchers.
Some journals put forward indicators such as the "International Impact Factor" or the "Journal Quality Factor". On the websites of these indicators, you will often notice spelling mistakes, a vague or even plagiarized assessment method, as well as a lack of contact information.
Beware of journals using such indicators and check their credibility (peer review, other articles already published in this journal, other indicators, etc.) before publishing your work there.