This guide is designed to assist faculty members with the quantitative aspects of scholarly publishing, known as bibliometrics. Publishing metrics stand apart from the intellectual work of scholarship, yet they are increasingly important as measures used to gauge the weight of publications (usually journal articles). They are commonly a factor in evaluation for promotion, tenure, and research funding.
Quantitative research evaluation is common practice in the sciences and engineering, where journal articles are more frequently used for scholarly communication. In the social sciences and humanities, this kind of bibliometric analysis is more complicated. However, most scholars are interested in selecting venues for their scholarly communication carefully, and in ensuring that their published works are noted and included in any bibliometric analysis.
UNB Libraries can assist you with both of these goals, providing information on the various measures of journal impact factors (Scopus’ Journal Metrics, etc.), and how to use proprietary and open (ResearcherID, Google Scholar) software to ensure that your publications are included in any h-index or other calculation of scholarly impact.
Finally, this site includes information that will help you to identify good and not-so-good venues for publishing your work, using resources like UlrichsWeb, and a handy glossary of terms, to demystify bibliometrics.
If you want to:
- Learn which library-subscribed and freely-accessible resources include bibliometric information, use the Article Metrics tab.
- Find out how various journals are ranked, and what methods are used to assign relative, quantitative measures to individual titles, use the Journal Metrics tab.
- Ensure that your own publications are correctly attributed to you and that your profile as a researcher describes you accurately, use the databases listed under the Author Metrics tab.
- Read up on how bibliometrics are used to analyse scholarly publishing in various disciplines and using various methods, follow the links to reading lists under the Further Readings tab.
- Unravel the language of bibliometrics, see the brief glossary under the Terms and Definitions tab.
Citation analysis is the process of evaluating the research impact of an article, an author, or the research work of an institution.
UNB Libraries subscribe to the following databases that can aid faculty in analyzing their research impact. No single database can provide a complete citation count; therefore, it is recommended that you use a combination of databases to discover how many papers are citing your works.
Where available, links to tutorials are provided below. Automatic alerts are also an excellent way to keep current in your field of study. Steps for alerting are noted below and will vary depending on the database.
- Scopus is an Elsevier product that indexes/abstracts more than 19,500 peer-reviewed journals from approximately 5,000 publishers. While journal coverage in Scopus is somewhat broader than Web of Science, citation tracking is limited to articles published after 1996. Therefore, attempts to ascertain the impact of a researcher's work prior to 1996 may not be accurately reflected using Scopus.
Subject coverage includes life sciences (> 6,300 titles), health sciences (> 12,900 titles), physical sciences (> 11,700 titles) and social sciences/humanities (> 9,800 titles). Arts & Humanities coverage is weaker (titles indexed from 2002 onward).
Citation Alert: A document, author or affiliation alert can be created within a record by selecting "Set alert" (activity requires registration).
Coverage weaknesses (noted above) have caused some faculty to turn to Google Scholar to track individual citations. Google Scholar offers three metrics: the h-index, the i-10 index (the number of articles with at least ten citations), and the total number of citations to individual articles.
Coverage strength is science/medical literature, open access repositories, and some (but not all) major publishers. Google Scholar has its own coverage weaknesses in the social sciences/humanities.
Faculty should be aware that Google Scholar does not index all scholarly articles. As such, it is possible articles that cite the item under study may not be counted. It is also difficult to determine what sources and time spans are covered in Google Scholar.
Citation Help: https://scholar.google.com/intl/en/scholar/citations.html
- Microsoft Academic Search
Microsoft Academic Search is still in beta and is not particularly well known. However, it boasts over 38 million references in its database. No other scholarly resource puts bibliometrics so front and centre: rankings of individuals, departments, and institutions are central to the product.
Profiles and papers can be edited after logging in. Authentication is handled through the UNB site, so login is done using a UNB ID and PIN.
Subscribing to be notified of new citations is not explicitly a feature of this database. Instead, MAS offers the option of subscribing to an RSS feed for any article. Presumably a new citation to the article would generate a change in the RSS feed, but that has yet to be confirmed.
Certain disciplines (especially those within the Social Sciences or Arts/Humanities) may not be well-represented using the citation analysis tools above.
The following UNB-subscribed databases can be useful when attempting to discover how many papers are citing your works.
Tutorial (Word format): https://support.ebsco.com/uploads/kb/en_host_cited_refs_help_sheet.doc
Finds citations in non-traditional sources such as Web pages. This service bases the search on PubMed IDs, so it will not be useful for those whose publications are not listed in PubMed.
- EBSCOhost Databases
Cited reference searching can be undertaken in various discipline-specific databases within EBSCO (e.g., PsycINFO, CINAHL, Historical Abstracts, SocINDEX with Full Text and Business Source Premier). EBSCO databases all share similar search strategies when conducting citation analysis:
- Select "Cited References" from the navigation ribbon across the top of the screen (N.B. "Cited References" is sometimes an option under "More" on the navigation ribbon, e.g., in Business Source Premier)
- Search for the article/author under study
- Check the box adjacent to relevant results that has a "Times Cited in this Database" link
- Select the "Find Citing Articles" button at the top of your results to view the articles citing the original article/author
- ProQuest Databases
Cited reference searching can also be undertaken in the discipline-specific databases within ProQuest (e.g., ABI Inform Complete, MLA International Bibliography, PAIS, Worldwide Political Science Abstracts, Sociological Abstracts, GeoRef, Philosopher's Index and Social Services Abstracts).
- Search for article/author under study
- Within the results, select the "Cited by (#)" link to see the articles from ProQuest databases that cite the original article/author's work.
At present, MathSciNet only works with reference lists from a subset of the journals it indexes. This means that the citing lists for any article may appear to be small. Still, this may be valuable information for regular users of the database.
- SciFinder Scholar
This database tracks citations to articles and enables sorting of search results by the number of times articles have been cited. The product requires that every user create an account.
Hein Online offers citation analysis only for the law/criminology journals housed in the database, though its coverage is not at the level of Web of Science or Scopus.
For further assistance with citation searching, please contact your Liaison Librarian.
Journal metrics are used extensively by academic institutions and publishers for various reasons:
- As a means of evaluation for faculty promotion, tenure, and/or grant funding
- When ranking journals within a particular discipline
- As a tool for deciding where to publish for maximum impact
To date, no best method for measuring journal quality has been determined. Below are several methods currently in use for measuring journal quality.
Journal Impact Factor
Google Scholar Metrics // Google Scholar has recently introduced its ranking of journals based on the h-index and related metrics. Their information page states:
> Google Scholar Metrics provide an easy way for authors to quickly gauge the visibility and influence of recent articles in scholarly publications. While most researchers are familiar with the well-established journals in their field, that is often not the case with newer publications or publications in related fields - there are simply too many of them to keep track of! Scholar Metrics summarize recent citations to many publications, to help authors as they consider where to publish their new research.
Journal Metrics (Scopus) // Elsevier, the provider of Scopus, has set up a website to rival Journal Citation Reports. Their Journal Metrics page offers two metrics (Source-Normalized Impact per Paper (SNIP) and SCImago Journal Rank (SJR)) to measure the performance and/or impact of a journal. Explanations of these metrics can be found on the website.
Publish Or Perish // Publish or Perish is a free, non-profit software program (courtesy of www.harzing.com) that uses Google Scholar and Microsoft Academic Search to obtain the raw citations and can provide a variety of statistics that can be viewed and copied into other applications or saved for further analysis.
Eigenfactor Project (University of Washington) // The Eigenfactor Project is a non-commercial academic research project sponsored by the Bergstrom Lab at the University of Washington. It is powered by data from Microsoft Academic Search.
Journal Quality Assessment - Publishing Opportunities
When determining quality outlets for publishing your work, it can be beneficial to determine:
- Manuscript acceptance/rejection rates for articles in the publication
- The composition of the publication's editorial board
- Objectivity of the publication's review process (e.g. blind/editorial/peer review)
Additionally, bad-faith publishers (so-called "predatory publishers") have muddied the waters when it comes to choosing locations to publish. There are a number of steps you can take in evaluating a journal. You are encouraged to review the library's guide to evaluating publishers. Additionally, Think, Check, Submit is an excellent resource for helping to determine the legitimacy of a publisher. You may also wish to use any of the following resources to help you determine the quality of a publisher/publication:
UlrichsWeb: Global Serials Directory // Ulrichsweb is a UNB-subscribed database offering detailed, comprehensive and authoritative information about more than 300,000 periodical titles published around the world. Details include publication type/history, editions/formats, frequency, whether the title is scholarly/refereed, price, abstracting/indexing sources, online availability, and publisher contact information.
Cabell's Directory of Publishing Opportunities // Cabell's directories can assist in determining which journals might be the best fit for your manuscript. Currently, UNB Libraries subscribe to the Business Collection via Cabell's. The index within each directory attempts to match the characteristics of your manuscript to the topic areas the journal emphasizes. Cabell's also offers descriptions of the types of review process used by the editor(s), number of reviewers, acceptance rate, the time required for review, and availability of reviewers' comments.
DOAJ - Directory of Open Access Journals // The Directory aims to be comprehensive and cover all open access scientific and scholarly journals that use a quality control system to guarantee the content. DOAJ journals must exercise peer-review or editorial quality control to be included.
There are a wide variety of Web sites that build profiles of researchers. Some focus on social networking (connecting people working on similar topics, for example) while others allow the researcher to display bibliographic production.
Note: All of the databases in this section create metrics (such as the h-index for researchers). This data is publicly available.
Scopus // The researcher ID is a feature that Scopus has incorporated directly into their product. All faculty are urged to search for themselves in Scopus and closely examine the results. The profiles are created by an automatic process and many "imperfections" have been noticed. In many cases, the researcher's publications are split into multiple IDs. There is a "merge" function in Scopus that will allow anyone to fix this problem. For other inaccuracies, Scopus offers the Author Feedback Wizard.
Google Scholar // With a Google account, any researcher can create a Google profile. The profile allows users to make sure they are properly represented in the GS database. Unlike Web of Science or Scopus, missing references may be added to Google Scholar by typing them into a form.
Microsoft Academic Search // MAS has some of the most powerful bibliometrics options available. It enables the comparisons of institutions or departments within institutions based on their h-indexes. Individuals may create a profile by signing into MAS through the UNB login (yes, your regular old ID and PIN work). Once in, profile maintenance is a snap, as missing items may be uploaded in the form of a BibTex file (which you can create in RefWorks or, see above, EndNote Web).
ScholarUniverse // This service is mainly for researchers in the Social Sciences. It links names with all of our ProQuest databases (such as Sociological Abstracts). To claim and edit your profile, perform a name search and then register for the service.
Metrics in Promotion and Tenure
At UNB, there is no requirement that faculty provide Assessment Committees with bibliometric data when being considered for promotion and tenure. There is no place for this on the UNB CV form. However, there is nothing to stop any faculty member from including metrics when it is to their advantage to do so. Section 7 of the CV form asks for "Other Relevant Information" and knowing that a researcher's work is highly cited would certainly be relevant. Remember, though, that each of the resources listed above will yield a different figure for any metric, depending on the underlying data. The applicant is certainly entitled to use the source that best makes the case for advancement. When submitting such data, note the date and the service from which the metric was taken.
Other Sites with Profile Capability
These sites not only allow reference management but create profiles for the researcher. The profiles can lead to contact being made with others in your field or who have a particular expertise.
ORCID // An initiative to provide unique profiles to researchers across different content systems. Recent enhancements by Web of Science and Scopus have incorporated ORCID into these major systems.
Mendeley // Mendeley features stand-alone clients that can be downloaded to any desktop. These clients allow the manipulation of bibliographies and are synched with the data stored on the Mendeley Web site. Mendeley encourages all researchers to upload their publications' metadata and, where permitted, PDFs. This allows "crowd-sourced" searching and discovery. Individual accounts are free. However, some of the better features (such as group creation) are severely limited with free accounts.
Social Science Research Network (SSRN) // An open access repository for those working in the Social Sciences and Humanities. There's also a blog that will keep interested readers up-to-date on the system.
Figshare // Allows researchers to upload data, including negative data that might not be publishable elsewhere. All data is citable and discoverable.
CiteSeerX // "CiteSeerx is an evolving scientific literature digital library and search engine that focuses primarily on the literature in computer and information science". Researchers working in computer science and related areas may want to join and update their information.
ACM Digital Library // The ACM DL creates a Web profile. For those in CS and related fields only.
Reference Management Software
Please see the UNB Libraries guide for Zotero.
Definitions & Readings
This is perhaps the most widely used measure of a scholar's lifetime productivity. It was first proposed by Hirsch in this paper. This schematic, taken from the original paper's web site, illustrates the concept:
Assume the scales on both axes are the same. At position "1" on the x-axis, record the number of times the author's most-cited paper was cited. At "2", record the citation count of the 2nd most-cited paper, and so on. The 45-degree line from the origin will intersect the resulting curve at a whole number: this is the h-index. It can be read as saying that the researcher has had h papers cited at least h times each.
The concept has been expanded to rank departments and institutions as well as individuals.
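The definition can be expressed in a few lines of Python. This is a minimal illustrative sketch of the calculation, not the algorithm any particular database uses:

```python
def h_index(citations):
    """Largest h such that the author has h papers each cited at least h times."""
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Papers cited [10, 8, 5, 4, 3] give an h-index of 4: four papers
# have at least 4 citations each, but there are not five with 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Because different services index different sets of citing documents, the same researcher will get different h-index values from Scopus, Web of Science, and Google Scholar.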
The concept of using the h-index to measure the influence of a single paper was introduced by Schubert (2009). Egghe proposed that the "h" be capitalized to distinguish between two similar concepts. Egghe describes the single publication H-index as follows: "track all articles that cite this single publication and put them in descending order of citations they received. Then the Hirsch-index of this ranked set is called the single-publication H-index of this single publication."
This is not actually a metric but a set of articles. As defined by Google Scholar, if a researcher has an h-index of 10, the h-core is the set of their 10 most-cited articles, each cited at least 10 times.
Again, from Google Scholar, this is defined as "the median of the citation counts in [an] h-core ... The h-median is a measure of the distribution of citations to the h-core articles".
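These two definitions can be sketched together in Python; this is an illustration of the Google Scholar definitions above, not their implementation:

```python
import statistics

def h_core(citations):
    """The h most-cited articles, where h is the h-index of the list."""
    ranked = sorted(citations, reverse=True)
    h = sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)
    return ranked[:h]

def h_median(citations):
    """Median citation count within the h-core."""
    core = h_core(citations)
    return statistics.median(core) if core else 0

# For citation counts [10, 8, 5, 4, 3]: h = 4, so the h-core is
# [10, 8, 5, 4] and the h-median is (8 + 5) / 2 = 6.5.
print(h_core([10, 8, 5, 4, 3]), h_median([10, 8, 5, 4, 3]))
```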
h5-index, h5-core, and h5-median
The same metrics as above, computed over only the most recent five calendar years.
This is a simple index that "indicates how many items in the result list were cited at least ten times" (Jacsó, 134). Google Scholar uses a variant that calculates the i10-index for citations received in the last 5 years.
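The calculation is a straightforward count; a one-line sketch:

```python
def i10_index(citations):
    """Number of papers with at least ten citations."""
    return sum(1 for c in citations if c >= 10)

# Three of these five papers have ten or more citations.
print(i10_index([25, 12, 10, 9, 3]))  # 3
```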
This index was first introduced by Egghe. It is a variant on the h-index that gives more weight to very highly cited papers. From the article: "The g-index g is the largest rank (where papers are arranged in decreasing order of the number of citations they received) such that the first g papers have (together) at least g² citations" (p. 144).
Microsoft Academic Search calculates a g-index for researchers who have established a profile with that service.
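Egghe's definition can be sketched in Python; the hypothetical citation list in the example shows how a single very highly cited paper lifts g well above h:

```python
def g_index(citations):
    """Largest g such that the g most-cited papers together
    have at least g**2 citations (Egghe's definition)."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        total += cites
        if total >= rank ** 2:
            g = rank
    return g

# Citation counts [25, 4, 2, 1, 0]: the h-index is only 2, but the
# cumulative totals 25, 29, 31, 32, 32 satisfy g**2 up to g = 5.
print(g_index([25, 4, 2, 1, 0]))  # 5
```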
m-index or m-quotient
One basic criticism of the h-index is that it takes a lengthy career and many publications to establish a high rating. Can a younger researcher not demonstrate an impact quantitatively? This measure takes the researcher's h-index and divides it by the number of years since the publication of the first work. Younger, more active researchers will therefore score better on this measure, in comparison to their h-index scores against more established researchers. This measure was first proposed by Hirsch, and a further explanation has been contributed by Harzing.
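The arithmetic is simple; this sketch assumes you already have the researcher's citation counts and career length in years:

```python
def m_quotient(citations, years_since_first_publication):
    """h-index divided by career length in years (Hirsch's m)."""
    ranked = sorted(citations, reverse=True)
    h = sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)
    return h / years_since_first_publication

# An h-index of 4 earned over an 8-year career gives m = 0.5.
print(m_quotient([10, 8, 5, 4, 3], 8))  # 0.5
```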
A journal's IF is a rating produced by Thomson Reuters' Journal Citation Reports. Both widely used and controversial, it attempts to rate the overall importance of a journal. The standard two-year IF for a given year is computed by dividing the number of citations received that year by items the journal published in the previous two years, by the number of citable items published in those two years. As well, Google Scholar is using its own set of metrics to rank journals.
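As a worked example of the two-year calculation, with hypothetical figures:

```python
def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Two-year impact factor for year Y: citations received in Y to items
    published in Y-1 and Y-2, divided by citable items from Y-1 and Y-2."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# A journal whose 2011-2012 items drew 500 citations in 2013,
# from 200 citable items, has a 2013 impact factor of 2.5.
print(impact_factor(500, 200))  # 2.5
```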
SCImago Journal Rank (SJR)
This indicator was developed from the Google PageRank algorithm and uses data from Scopus. It is available for free at the SCImago Journal & Country Rank web site.