This site is designed to assist faculty members with the quantitative aspects of scholarly publishing, known as bibliometrics. Publishing metrics are removed from the intellectual exercise of scholarly productivity, and yet they are increasingly important as measures used to gauge the weight of publications (usually journal articles), and are often a factor in evaluation for promotion, tenure and research funding.
Quantitative research evaluation is common practice in the sciences and engineering, where journal articles are more frequently used for scholarly communication. In the social sciences and humanities, this kind of bibliometric analysis is more complicated. However, most scholars are interested in selecting venues for their scholarly communication carefully, and in ensuring that their published works are noted and included in any bibliometric analysis.
UNB Libraries can assist you with both of these goals, providing information on the various measures of journal impact factors (Scopus’ Journal Metrics, etc.), and how to use proprietary and open (ResearcherID, Google Scholar) software to ensure that your publications are included in any h-index or other calculation of scholarly impact.
Finally, this site includes information that will help you to identify good and not-so-good venues for publishing your work, using resources like UlrichsWeb, and a handy glossary of terms, to demystify bibliometrics.
If you want to:
- Learn which library-subscribed and freely-accessible resources include bibliometric information, use the Article Metrics tab.
- Find out how various journals are ranked, and what methods are used to assign relative, quantitative measures to individual titles, use the Journal Metrics tab.
- Ensure that your own publications are correctly attributed to you, and that your profile as a researcher describes you accurately, use the databases listed under the Author Metrics tab.
- Read up on how bibliometrics are used to analyse scholarly publishing in various disciplines and using various methods, follow the links to reading lists under the Further Readings tab.
- Unravel the language of bibliometrics, see the brief glossary under the Terms and Definitions tab.
Citation analysis is the process of evaluating the research impact of an article, an author, or the research work of an institution.
UNB Libraries subscribes to the following databases that can aid faculty in analyzing their research impact. No single database provides a complete citation count; it is therefore recommended that you use a combination of databases to discover how many papers cite your works.
Where available, links to tutorials are provided below. Automatic alerts are also an excellent way to keep current in your field of study. Steps for alerting are noted below and will vary depending on the database.
- Scopus is an Elsevier product that indexes/abstracts more than 19,500 peer-reviewed journals from approximately 5,000 publishers. While journal coverage in Scopus is somewhat broader than Web of Science, citation tracking is limited to articles published after 1996. Therefore, attempts to ascertain the impact of a researcher's work prior to 1996 may not be accurately reflected using Scopus.
Subject coverage includes life sciences (> 6,300 titles), health sciences (> 12,900 titles), physical sciences (> 11,700 titles) and social sciences/humanities (> 9,800 titles). Arts & Humanities coverage is not strong (most titles are covered only from 2002 onward).
Citation Alert: A document, author or affiliation alert can be created within a record by selecting "Set alert" (activity requires registration).
Coverage weaknesses (noted above) have caused some faculty to turn to Google Scholar to track individual citations. Google Scholar offers three metrics: the h-index, the i10-index (the number of articles with at least ten citations), and the total number of citations to individual articles.
Coverage strength is science/medical literature, open access repositories, and some (but not all) major publishers. Google Scholar has its own coverage weaknesses in the social sciences/humanities.
Faculty should be aware that Google Scholar does not index all scholarly articles. As such, it is possible articles that cite the item under study may not be counted. It is also difficult to determine what sources and time spans are covered in Google Scholar.
Citation Help: http://scholar.google.com/intl/en/scholar/citations.html
- Microsoft Academic Search
Microsoft Academic Search is still in beta and is not particularly well known. However, it boasts over 38 million references in its database. No other scholarly resource is as concerned with bibliometrics as MAS: rankings of individuals, departments, and institutions are front and centre with this product.
Profiles and papers can be edited by logging in. The login uses the UNB site for authentication, so you sign in with your UNB ID and PIN.
Subscribing to be notified of new citations is not explicitly a feature of this database. Instead, MAS offers the option of subscribing to an RSS feed for any article. It is presumed that a new citation to the article would generate a change in the RSS feed, but that has yet to be confirmed.
Certain disciplines (especially those within the Social Sciences or Arts/Humanities) may not be well-represented using the citation analysis tools above.
The following UNB-subscribed databases can be useful when attempting to discover how many papers are citing your works.
Tutorial (Word Format): http://support.ebsco.com/uploads/kb/en_host_cited_refs_help_sheet.doc
Finds citations in non-traditional sources such as Web pages. This service bases its search on PubMed IDs, so it will not be useful for those whose publications are not listed in PubMed.
- EBSCOhost Databases
Cited reference searching can be undertaken in various discipline-specific databases within EBSCO (e.g., PsycINFO, CINAHL, Historical Abstracts, SocINDEX with Full Text and Business Source Premier). EBSCO databases all share similar search strategies when conducting citation analysis:
- Select "Cited References" from the navigation ribbon across the top of the screen (N.B. "Cited References" is sometimes an option under "More" on the navigation ribbon, e.g., Business Source Premier)
- Search for the article/author under study
- Check the box adjacent to relevant results that have a "Times Cited in this Database" link
- Select the "Find Citing Articles" button at the top of your results to view the articles citing the original article/author
- ProQuest Databases
Cited reference searching can also be undertaken in the discipline-specific databases within ProQuest (e.g., ABI Inform Complete, MLA International Bibliography, PAIS, Worldwide Political Science Abstracts, Sociological Abstracts, GeoRef, Philosopher's Index and Social Services Abstracts).
- Search for article/author under study
- Within the results, select the "Cited by (#)" link to see the articles from ProQuest databases that cite the original article/author's work.
At present, MathSciNet only works with reference lists from a subset of the journals it indexes. This means that the citing lists for any article may appear to be small. Still, this may be valuable information for regular users of the database.
- SciFinder Scholar
This database tracks citations to articles and enables sorting of search results by the number of times articles have been cited. The product requires that every user create an account.
HeinOnline offers citation analysis only for the law/criminology journals housed in the database. It is not at the level of Web of Science or Scopus.
For further assistance with citation searching, please contact your Liaison Librarian.
Journal metrics are used extensively by academic institutions and publishers for various reasons:
- As a means of evaluation for faculty promotion, tenure, and/or grant funding
- When ranking journals within a particular discipline
- As a tool for deciding where to publish for maximum impact
To date, no best method for measuring journal quality has been determined. Below are several methods currently in use for measuring journal quality.
Journal Impact Factor
- Google Scholar has recently introduced its ranking of journals based on the h-index and related metrics. Its information page states:
Google Scholar Metrics provide an easy way for authors to quickly gauge the visibility and influence of recent articles in scholarly publications. While most researchers are familiar with the well-established journals in their field, that is often not the case with newer publications or publications in related fields - there're simply too many of them to keep track of! Scholar Metrics summarize recent citations to many publications, to help authors as they consider where to publish their new research.
- Journal Metrics (Scopus)
- Elsevier, provider of Scopus, has set up a website to rival Journal Citation Reports. Their Journal Metrics page offers two metrics (Source-Normalized Impact per Paper (SNIP) and SCImago Journal Rank (SJR)) to measure the performance and/or impact of a journal. Explanations of these metrics can be found on the website.
- Publish Or Perish
- Publish or Perish is a free, non-profit software program (courtesy of www.harzing.com) that uses Google Scholar to obtain the raw citations and can provide a variety of statistics that can be viewed and copied into other applications or saved for further analysis.
- Eigenfactor Project (University of Washington)
- The Eigenfactor Project is a non-commercial academic research project sponsored by the Bergstrom Lab at the University of Washington. It is powered by data from Microsoft Academic Search.
Journal Quality Assessment - Publishing Opportunities
When determining quality outlets for publishing your work, it can be beneficial to determine:
- Manuscript acceptance/rejection rates for articles in the publication
- Composition of the publication's editorial board
- Objectivity of the publication's review process (e.g. blind/editorial/peer review)
Publisher's websites can be useful in this determination in combination with the following resources:
- UlrichsWeb: Global Serials Directory
- Ulrichsweb is a quality UNB-subscribed database offering detailed, comprehensive and authoritative information about >300,000 periodical titles published around the world. Details include: publication type/history, editions/formats, frequency, whether the title is scholarly/refereed, price, abstracting/indexing sources, online availability, publisher contact information.
- Cabell's Directory of Publishing Opportunities
- Cabell's directories can assist in determining which journals might be the best fit for your manuscript. Currently UNB Libraries subscribe to the Business Collection via Cabell's. The index within each directory attempts to match the characteristics of your manuscript to the topic areas the journal emphasizes.
- Cabell's also offers descriptions of the types of review process used by the editor(s), number of reviewers, acceptance rate, time required for review, and availability of reviewers' comments.
- JANE:Journal/Author Name Estimator
- JANE is a database (courtesy of The Biosemantics Group) which can locate journals that publish within a specialized area by comparing journal title, abstract or keywords in MEDLINE (the U.S. National Library of Medicine's premier bibliographic database), and matching articles, journals or authors. Click this link to view Frequently Asked Questions about JANE.
- Journal Quality List - Prof. Anne-Wil Harzing
- The Journal Quality List compiled by Prof. Anne-Wil Harzing is a collation of journal rankings from a variety of sources. It is published primarily to assist academics to target papers to journals of an appropriate standard. The listing covers academic journals in the areas of Economics, Finance, Accounting, Management, and Marketing.
- DOAJ - Directory of Open Access Journals
- The Directory aims to be comprehensive and cover all open access scientific and scholarly journals that use a quality control system to guarantee the content. DOAJ journals must exercise peer-review or editorial quality control to be included.
For further assistance, please contact your Liaison Librarian.
Bibliometrics are coming. They have been widely used in Europe for some time now. Evaluation of individuals, research teams, departments, faculties, and universities is happening; look, for example, at what SciVal offers to those wanting to do bibliometric analyses. However, the metrics are only as good as the underlying data. It is important for our institution and its members that the bibliographic data in the major databases be as accurate as we can make it.
There are a wide variety of Web sites that build profiles of researchers. Some focus on social networking (connecting people working on similar topics, for example) while others are more like RefWorks and allow the researcher to display bibliographic production. With most of them, the researcher has the option of contributing to or maintaining the profile. Below, we outline some of the resources we have found and their potential for use.
Note: All of the databases in this section create metrics (such as the h-index for researchers). This data is publicly available.
- The author ID is a feature that Scopus has incorporated directly into its product. All faculty are urged to search for themselves in Scopus and closely examine the results. The profiles are created by an automatic process, and many "imperfections" have been noticed; in many cases, a researcher's publications are split across multiple IDs. There is a "merge" function in Scopus that allows anyone to fix this problem. For other inaccuracies, Scopus offers the Author Feedback Wizard.
- Google Scholar
- With a Google account, any researcher can create a Google profile. The profile allows the user to make sure she is properly represented in the GS database. Unlike Web of Science or Scopus, missing references may be added to Google Scholar by typing them into a form.
- Microsoft Academic Search
- MAS has some of the most powerful bibliometrics options available. It enables comparisons of institutions, or departments within institutions, based on their h-indexes. Individuals may create a profile by signing into MAS through the UNB login (yes, your regular old ID and PIN work). Once in, profile maintenance is a snap, as missing items may be uploaded in the form of a BibTeX file (which you can create in RefWorks or, see above, EndNote Web).
- This service is mainly for researchers in the Social Sciences. It links names with all of our ProQuest databases (such as Sociological Abstracts). To claim and edit your profile, perform a name search and then register for the service.
Metrics in Promotion and Tenure
At UNB, there is no requirement that faculty provide Assessment Committees with bibliometric data when being considered for promotion and tenure. There is no place for this on the UNB CV form. However, there is nothing to stop any faculty member from including metrics when it is to their advantage to do so. Section 7 of the CV form asks for "Other Relevant Information", and knowing that a researcher's work is highly cited would certainly be relevant. Remember, though, that each of the resources listed above will yield a different figure for any metric, depending on the underlying data. The applicant is certainly entitled to use the source that best makes the case for advancement. When submitting such data, note the date and the service from which the metric was taken.
Other Sites with Profile Capability
These sites not only allow reference management, but create profiles for the researcher. The profiles can lead to contact being made with others in your field or who have a particular expertise.
- ORCID is an initiative to provide unique profiles to researchers across different content systems. Web of Science and Scopus have both recently incorporated ORCID into their systems.
- Mendeley features stand-alone clients that can be downloaded to any desktop. These clients allow the manipulation of bibliographies and are synched with the data stored on the Mendeley Web site. Mendeley encourages all researchers to upload their publications' metadata and, where permitted, PDFs. This allows "crowd-sourced" searching and discovery. Individual accounts are free; however, some of the better features (such as group creation) are severely limited with free accounts.
- Social Science Research Network
- An open access repository for those working in the Social Sciences and Humanities. There is also a blog that will keep interested readers up-to-date on the system.
- Allows researchers to upload data, including negative data that might not be publishable elsewhere. All data is citable and discoverable.
- Shut down on January 1, 2015.
- The American Mathematical Society has always tried to keep track of individual authors and their publications. They have published a Web guide to linking to author profiles. Corrections to the database and author attributions may be made by emailing mathrev at ams dot org.
- This site came online in early 2015. It is sponsored by a number of major academic publishers. It could be worth watching as it evolves. Right now, an interesting feature is the ability to comment on your published article.
- Mostly a social networking site for the scientific community. The site allows you to upload a list of your own publications. These can then be discovered by other researchers.
- The main menu allows for three options:
- Share your papers
- See analytics on your profile and papers
- Follow other people in your field
- Get Cited
- The premise is that if you upload your publications, they will be found by others and cited in subsequent work.
- "Free online reference management for all researchers, clinicians and scientists". This Web service is run by the Nature Publishing Group. It's mainly promoted as a bibliographic utility but does have "Community" and Profile pages. A note from the publisher indicates that this service will close as of March 31, 2013.
- "CiteSeerx is an evolving scientific literature digital library and search engine that focuses primarily on the literature in computer and information science". Researchers working in computer science and related areas may want to join and update their information.
- For high-energy physics types.
- ACM Digital Library
- The ACM DL creates a Web profile. For those in CS and related fields only.
- Reader Meter
- This service is described as still in alpha development. From the "About" page: "Using readership-based metrics we can estimate impact on the basis of the consumption of scientific content by a population of readers. ReaderMeter adapts two popular impact metrics for authors (the H-Index and the G-Index) and redefines them using bookmarks instead of citations as an HR-Index and a GR-Index respectively... Analysing readership data can help discover areas of real-time impact that may not be visible to traditional citation-based measurements. Readership data is made available via the Mendeley API."
- From the About page: "ScienceCard is a web service that collects all scientific works published by an author and displays their aggregate work-level metrics. ScienceCard allows a researcher to create and maintain a researcher profile with minimal effort, and to export and reuse this information elsewhere. To make this as effortless as possible, ScienceCard relies on unique identifiers for authors (currently identifiers from Microsoft Academic Search and AuthorClaim) [see below] and works (currently digital object identifiers or DOIs)".
- Total Impact
- Another metric-building site that uses non-traditional measures. This site uses IDs from SlideShare, GitHub, and Dryad.
Reference Management Software
Compared to the sites above, these allow for less social networking.
- A free tool that allows you to collect, organize, cite and share your research sources.
- "citeulike is a free service for managing and discovering scholarly references". A new feature is automated article recommendations.
- Scholar Universe
- Data from this service will be used in ORCID. COS Scholar Universe is closely associated with RefWorks. The funny blue and white symbols next to author names in RefWorks link out to COS Scholar Universe profiles.
- According to their Web site, the name is pronounced "dog ear". In any case, this is a free, open source reference manager. It is downloadable software with versions for Windows, Linux, and Mac. The creators have put together a teaser video that gives a brief idea of what the software can do. The documentation states that Docear has "mind mapping capabilities"; if you are familiar with Freeplane, you may know what this is. In a similar fashion to RefWorks, Docear has a Microsoft Word plugin.
- For those working in economics and related fields only.
- A clone of the RePEc service, with wider applicability.
- DBLP Computer Science Bibliography
- From the FAQ: "The dblp computer science bibliography is an on-line reference for bibliographic information on major computer science publications. It has evolved from an early small experimental web server to a popular open-data service for the computer science community. Our mission at dblp is to support computer science researchers in their daily efforts by providing free access to high-quality bibliographic meta-data and links to the electronic editions of publications."
Definitions & Readings
This is perhaps the most widely used measure of a scholar's lifetime productivity. It was first proposed by Hirsch in this paper. This schematic, taken from the original paper's web site, illustrates the concept:
Assume the scales on both axes are the same. At position "1" on the x-axis, record the number of times the author's most-cited paper was cited. At "2", record the citations to the 2nd most-cited paper, and so on. The 45-degree line from the origin will intersect the resulting curve at a whole number: this is the h-index. It can be read as saying that the researcher has had h papers cited at least h times.
The concept has been expanded to rank departments and institutions as well as individuals.
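The graphical procedure above can be sketched in a few lines of code (a minimal illustration of the definition, not tied to any particular database's implementation):

```python
def h_index(citations):
    """Largest h such that the author has h papers with at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this paper still lies above the 45-degree line
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers each have at least 4 citations
```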
The concept of using the h-index to measure the influence of a single paper was introduced by Schubert (2009). Egghe proposed that the "h" be capitalized to distinguish between two similar concepts. Egghe describes the single publication H-index as follows: "track all articles that cite this single publication and put them in descending order of citations they received. Then the Hirsch-index of this ranked set is called the single-publication H-index of this single publication."
This is not actually a metric but a set of articles. As defined by Google Scholar, if a researcher has an h-index of 10, the h-core is the set of that researcher's 10 most-cited articles, each of which has been cited at least 10 times.
Again, from Google Scholar, this is defined as "the median of the citation counts in [an] h-core ... The h-median is a measure of the distribution of citations to the h-core articles".
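Both definitions can be illustrated together; this is a minimal sketch following the Google Scholar descriptions quoted above:

```python
from statistics import median

def h_core(citations):
    """The h-core: the author's h most-cited articles, where h is the h-index."""
    ranked = sorted(citations, reverse=True)
    h = sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)
    return ranked[:h]

def h_median(citations):
    """The median of the citation counts within the h-core."""
    return median(h_core(citations))

print(h_core([10, 8, 5, 4, 3]))    # [10, 8, 5, 4] (the h-index here is 4)
print(h_median([10, 8, 5, 4, 3]))  # 6.5
```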
h5-index, h5-core, and h5-median
The same metrics, confined to articles published in the latest 5 complete calendar years.
This is a simple index that "indicates how many items in the result list were cited at least ten times" (Jacsó, 134). Google Scholar uses a variant that calculates the i10-index for citations received in the last 5 years.
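The i10-index is the simplest of these measures to compute; a one-line sketch:

```python
def i10_index(citations):
    """Number of papers cited at least ten times."""
    return sum(1 for c in citations if c >= 10)

print(i10_index([25, 12, 10, 9, 3]))  # 3
```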
This index was first introduced by Egghe. It is a variant on the h-index that gives more weight to very highly cited papers. From the article: "The g-index g is the largest rank (where papers are arranged in decreasing order of the number of citations they received) such that the first g papers have (together) at least g² citations" (p. 144).
Microsoft Academic Search calculates a g-index for researchers who have established a profile with that service.
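Egghe's definition translates directly into code. The following sketch implements the simple variant in which g cannot exceed the number of papers:

```python
def g_index(citations):
    """Largest g such that the g most-cited papers together have at least g^2 citations."""
    ranked = sorted(citations, reverse=True)
    running_total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        running_total += cites
        if running_total >= rank * rank:
            g = rank
    return g

print(g_index([10, 8, 5, 4, 3]))  # 5: all five papers together have 30 >= 25 citations
```

Note that the g-index is always at least as large as the h-index (here 5 versus 4), since highly cited papers "carry" the cumulative total.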
m-index or m-quotient
One basic criticism of the h-index is that it takes a lengthy career and many publications to establish a high rating. Cannot a younger researcher demonstrate an impact quantitatively? This measure takes the researcher's h-index and divides it by the number of years since the publication of the first work. Younger, more active researchers will therefore score better on this measure than a direct comparison of h-index scores against more established researchers would suggest. This measure was first proposed by Hirsch, and a further explanation has been contributed by Harzing.
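The division is simple enough to show directly (a sketch; the h-index portion repeats the calculation given earlier):

```python
def m_quotient(citations, first_pub_year, current_year):
    """h-index divided by the number of years since the first publication."""
    ranked = sorted(citations, reverse=True)
    h = sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)
    years = max(current_year - first_pub_year, 1)  # avoid division by zero
    return h / years

print(m_quotient([10, 8, 5, 4, 3], 2005, 2013))  # 0.5: an h-index of 4 over 8 years
```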
A journal's IF is a rating produced by Thomson Reuters' Journal Citation Reports. Both widely used and controversial, it attempts to rate the overall importance of a journal. It is computed by counting the citations received in a given year by the items the journal published in the preceding two years, and dividing this by the number of citable items published in those two years.
UNB Libraries subscribes to Journal Citation Reports. Many journals also display their Impact Factors on their home pages. As well, Google Scholar is using their own set of metrics to rank journals.
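The arithmetic behind the two-year Impact Factor is straightforward; the figures below are purely illustrative:

```python
def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Two-year IF: citations received this year to items published in the
    previous two years, divided by the number of citable items from those years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

print(impact_factor(450, 180))  # 2.5
```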
SCImago Journal Rank (SJR)
This indicator was developed from the Google PageRank algorithm and uses data from Scopus. It is available for free at the SCImago Journal & Country Rank web site.
Reference Management Software
UNB subscribes to a reference manager called RefWorks. It allows users to store citations and organize them into bibliographies. Other reference managers are available that provide different features, including linking of articles through citations. Li, Thelwall & Giustini compared two of these products (CiteULike and Mendeley) to traditional citation counts in Web of Science.
"In 2010, the Minister of Industry, on behalf of the Natural Sciences and Engineering Research Council of Canada (NSERC), asked the Council of Canadian Academies to examine the international practices and supporting evidence used to assess performance of research in the natural sciences and engineering disciplines." This is their report.
This document by Paul Jarvey and Alex Usher of Higher Education Strategy Associates discusses an alternative method for measuring research strength at Canadian universities.
Courtesy of The Research Information Network (UK) - a policy unit funded by the UK higher education funding councils, seven research councils and three national libraries.
A bibliometric research project conducted at the Institute for Research Information and Quality Assurance (IFQ – Institut für Forschungsinformation und Qualitätssicherung), a scientific institution funded by the German Research Foundation (DFG – Deutsche Forschungsgemeinschaft).
The ERA Initiative assesses research quality within Australia's higher education institutions. It also provides a national stocktake, by research discipline areas, of research strength against international benchmarks.
Business, Economics, Fine Arts, Classical Archaeology
History, Political Science, Religious Studies, Human Rights, Great Ideas