Use Cases
Altmetrics can complement traditional bibliometrics in a number of scenarios:
• Metrics as a discovery tool
• Data-driven stories about the post-publication reception of research
• Business intelligence for a journal, university or funder
• Evaluation of the impact of research and researchers
Metrics as a Discovery Tool
Information overload has become a major problem, and it has become clear that relying on the journal as a filter is no longer an adequate strategy. Altmetrics have the potential to help in the discovery process, especially when combined with more traditional keyword-based search strategies and with the social network information of the person seeking information. Their advantage over citation-based metrics is that we do not have to wait years before we see meaningful numbers. The free Altmetric PLOS Impact Explorer[1] is an example of a discovery tool based on altmetrics: it highlights recently published PLOS papers with a lot of social media activity. Altmetric.com also provides a commercial service covering content from other publishers.
Data-Driven Stories About The Post-Publication Reception of Research
Altmetrics can help researchers demonstrate the impact of their research, in particular if the research outputs are not journal articles but datasets, software, etc., and if the impact is best demonstrated in metrics other than citations. ImpactStory[2] focuses on this use case. Creators of web-native scholarly products such as datasets, software, and blog posts are often hard pressed to demonstrate the impact of their work, given a reward system built for a paper-based scholarly publishing world. In these cases, ImpactStory helps provide data to establish the impact of these products, allowing forward-thinking researchers to make the case for their value. ImpactStory also gathers altmetrics for traditional products, tracking their impact through both traditional citations and novel altmetrics.
Business Intelligence for a Journal, University or Funder
Here the focus is not on the individual article, but rather on overall trends over time and/or across funding programs, disciplines, etc. This is an area that the typical researcher is usually less interested in, but it is important for strategic decisions by departments, universities, funding organizations, publishers, and others. The area has been dominated by large commercial bibliographic databases such as Web of Science or Scopus, using citation data. Plum Analytics[3] is a newer service that also provides altmetrics and focuses on universities. The publisher PLOS[4] makes a comprehensive set of citations, usage data, and altmetrics available for all articles it publishes.
Altmetrics as an Evaluation Tool
Traditional scholarly metrics are often used as an evaluation tool, including inappropriate uses such as using the Journal Impact Factor to evaluate publications of individual researchers. Before altmetrics can be used for evaluation, the following questions need to be addressed:
• Can numbers reflect the impact of research, across disciplines and over time?
• Does the use of metrics for evaluation create undesired incentives?
• Do the currently available altmetrics really measure impact or something else?
• How can we standardize altmetrics?
• How easily can altmetrics be changed by self-promotion and gaming?
The first two questions relate to more general aspects of using scientometrics for evaluation, whereas the last three are specific to altmetrics. All of these issues can be addressed, but it will probably take some time before altmetrics can reasonably be used for evaluation.
Author-level metrics can also include citations and usage stats. Citations are a more established metric for impact evaluation, and citation counts for individual articles are much more meaningful than metrics for the journal a researcher has published in. The Hirsch index (or h-index; Hirsch 2005) is a popular metric to quantify an individual's scientific research output. The h-index is defined as the largest number h such that h of the researcher's papers have each been cited at least h times; e.g., an h-index of 15 means a researcher has published at least 15 papers that have each been cited at least 15 times.
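The definition above translates directly into a short computation. The following sketch (the function name and the sample citation counts are illustrative, not from any real dataset) shows one common way to compute it:

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    h papers each cited at least h times (Hirsch 2005)."""
    # Sort citation counts in descending order, then count how many
    # papers have a citation count at least as large as their 1-based rank.
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

# A researcher with citation counts [10, 8, 5, 2, 1] has an h-index of 3:
# three papers have been cited at least 3 times each.
print(h_index([10, 8, 5, 2, 1]))  # 3
```

Note that the h-index is insensitive to a few very highly cited papers: the counts [100, 100, 1] and [3, 3, 3] both give an h-index of 3, which is one reason it is usually reported alongside other metrics.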