Example Metrics and Providers

A growing number of metrics are used by the altmetrics community; the most important metrics and providers are listed below. Not all metrics measure scholarly impact: some are indicators of attention, and in rare cases of self-promotion. Some metrics are good indicators of activity by scholars (e.g. citations or Mendeley bookmarks), whereas others reflect attention from the general public (e.g. Facebook or HTML views) (Table 1).

Metrics describe different activities: usage stats capture the initial activity of reading the abstract and downloading the paper, whereas citations are the result of much more work; they therefore account for less than 0.5 % of all HTML views. Altmetrics tries to capture the activities that happen between viewing a paper and citing it, from saving an article to informal online discussions.

Mendeley

Mendeley is one of the most widely used altmetrics services—the number of articles with Mendeley bookmarks is similar to the number of articles that have citations. Mendeley provides information about the number of readers and groups. In contrast to CiteULike, no usernames for readers are provided, but Mendeley supplies basic demographic information such as country and academic position. Mendeley is a social bookmarking tool used by scholars, and the metrics probably reflect an important scholarly activity: adding a downloaded article to a reference manager.
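To illustrate, the sketch below retrieves readership counts and demographics for a single article from Mendeley's catalog search API. This is a minimal sketch, not Mendeley's reference client: the `view=stats` parameter and the response field names follow the public catalog API as we understand it, the OAuth2 token handling is omitted, and the DOI is a placeholder.

    # Minimal sketch: look up Mendeley readership for one DOI via the
    # catalog search API (assumes a valid OAuth2 access token, not shown).
    import requests

    ACCESS_TOKEN = "..."  # obtained via Mendeley's OAuth2 flow
    DOI = "10.1371/journal.pone.0012345"  # placeholder DOI

    resp = requests.get(
        "https://api.mendeley.com/catalog",
        params={"doi": DOI, "view": "stats"},  # 'stats' view assumed to return counts
        headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    )
    resp.raise_for_status()

    for doc in resp.json():  # the catalog search returns a list of matches
        print("readers:", doc.get("reader_count"))
        print("by country:", doc.get("reader_count_by_country"))
        print("by position:", doc.get("reader_count_by_academic_status"))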

CiteULike

CiteULike is another social bookmarking tool, not as widely used as Mendeley and without reference manager functionality. One advantage over Mendeley is that usernames and dates for all sharing events are publicly available, making it easier to explore the bookmarking activity over time.

Table 1 Categorizing metrics into target audiences and depth of interaction (cf. ImpactStory 2012)

             Scholars                                    Public
Discussed    Science blogs, journal comments             Blogs, Twitter, Facebook, etc.
Recommended  Citations by editorials, Faculty of 1,000   Press release
Cited        Citations, full-text mentions               Wikipedia mentions
Saved        CiteULike, Mendeley                         Delicious, Facebook
Viewed       PDF downloads                               HTML views

Twitter

Collecting tweets linking to scholarly papers is challenging because they are only searchable for short periods of time (typically around 7 days). There is a lot of Twitter activity around papers, and only a small fraction of it comes from the authors and/or journal. For some journals up to 90 % of articles are tweeted; the number for new PLOS journal articles is currently about 50 %. Twitter activity typically peaks a few days after publication and probably reflects attention rather than impact.
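Because tweets expire from search so quickly, a collector has to harvest them continuously and then extract article links from the text. The sketch below shows only the extraction step, using a generic DOI pattern; how the tweets are gathered (streaming or periodic search) is left out, and the sample tweets are invented placeholders.

    # Minimal sketch: pull DOIs out of already-collected tweet texts. Because
    # tweets are only searchable for about 7 days, a real collector would have
    # to harvest continuously; `tweets` stands in for that collection step.
    import re
    from collections import Counter

    DOI_RE = re.compile(r"10\.\d{4,9}/[-._;()/:A-Za-z0-9]+")  # common DOI pattern

    tweets = [  # invented placeholder data
        "Great read on altmetrics https://doi.org/10.1371/journal.pone.0012345",
        "Discussing dx.doi.org/10.1371/journal.pone.0012345 with my lab today",
    ]

    counts = Counter()
    for text in tweets:
        for doi in DOI_RE.findall(text):
            counts[doi.rstrip(".,;")] += 1  # drop trailing punctuation

    for doi, n in counts.most_common():
        print(doi, "-", n, "tweet(s)")

In practice most links in tweets are shortened (e.g. t.co), so each URL would first have to be resolved before matching it against a DOI or a journal landing page.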

Facebook

Facebook is almost as popular as Twitter with regard to scholarly content, and it provides a wider variety of interactions (likes, shares and comments). Facebook activity is a good indicator of public interest in a scholarly article and correlates more with HTML views than with PDF downloads.

Wikipedia

Scholarly content is frequently linked from Wikipedia; in the case of PLOS, about 6 % of all journal articles are linked. The Wikipedia Cite-o-Meter[1] by Dario Taraborelli and Daniel Mietchen calculates the number of Wikipedia links per publisher. In the English Wikipedia the most frequently cited publisher is Elsevier, with close to 35,000 links. In addition to Wikipedia pages, links to scholarly articles are also found on user and file pages.
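A rough version of such a count can be reproduced with the public MediaWiki API, which can list all pages linking to a given external URL. The sketch below is a minimal approximation rather than the Cite-o-Meter's actual code: it searches the English Wikipedia for links containing the PLOS DOI prefix 10.1371 and tallies them by namespace, which also separates article pages from user and file pages.

    # Minimal sketch: count English Wikipedia pages linking to a publisher's
    # DOIs, using the public MediaWiki API (list=exturlusage).
    import requests

    API = "https://en.wikipedia.org/w/api.php"
    params = {
        "action": "query",
        "list": "exturlusage",
        "euquery": "doi.org/10.1371",  # 10.1371 is the PLOS DOI prefix
        "eulimit": "500",
        "format": "json",
    }

    links, namespaces = 0, {}
    while True:
        data = requests.get(API, params=params).json()
        for hit in data["query"]["exturlusage"]:
            links += 1
            namespaces[hit["ns"]] = namespaces.get(hit["ns"], 0) + 1
        if "continue" not in data:
            break
        params.update(data["continue"])  # follow the continuation cursor

    print(links, "links found")
    print("by namespace:", namespaces)  # ns 0 = articles, 2 = user pages, 6 = file pages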

Science Blogs

Blog posts talking about papers and other scholarly content are difficult to track. Many science bloggers use a blog aggregator, with Research Blogging, Nature Blogs and ScienceSeeker being the most popular. The number of scholarly articles discussed in blog posts is small (e.g. less than 5 % of all PLOS articles), but blog posts provide great background information and can sometimes generate a lot of secondary activity around the original paper (both social media activity and downloads).

Altmetrics Service Providers

Comprehensive altmetrics are currently only available from a small number of service providers. This will most likely change in the near future, as more organizations become interested both in analyzing altmetrics for their content (publishers, universities, funders) and in providing altmetrics as a service.

The Open Access publisher Public Library of Science (PLOS) was the first organization to routinely provide altmetrics for a large number of scholarly articles. The first version of its article-level metrics service launched in March 2009, and PLOS currently provides usage data, citations and social web activity from 13 different data sources. The article-level metrics data are available via an open API[2] and as a monthly public data dump.
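As a rough illustration of what a query against this API looks like, the sketch below requests summary metrics for one article. The endpoint and parameter names follow the v3 documentation linked in footnote [2], but the exact response fields should be treated as assumptions, and the API key and DOI are placeholders.

    # Minimal sketch: fetch article-level metrics for one DOI from the PLOS
    # ALM API (v3; parameter and field names assumed from the wiki in [2]).
    import requests

    resp = requests.get(
        "http://alm.plos.org/api/v3/articles",
        params={
            "api_key": "YOUR_KEY",                  # free key from PLOS
            "ids": "10.1371/journal.pone.0012345",  # placeholder DOI
            "info": "summary",                      # totals only, no per-source events
        },
    )
    resp.raise_for_status()

    for article in resp.json():
        # The summary response aggregates the 13 sources into broad buckets.
        print(article.get("title"))
        print("views:    ", article.get("views"))
        print("shares:   ", article.get("shares"))
        print("bookmarks:", article.get("bookmarks"))
        print("citations:", article.get("citations"))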

Altmetric.com is a commercial start-up launched in July 2011. They maintain a cluster of servers that watch social media sites, newspapers and magazines for mentions of scholarly articles. The data are available to individual users and as a service for publishers.
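Altmetric.com also exposes a simple public REST endpoint keyed by DOI, which is convenient for spot checks. The sketch below queries it for a single placeholder DOI; the response field names shown are assumptions based on the public v1 API and may change.

    # Minimal sketch: query Altmetric.com's public v1 API for one DOI.
    # (No key is needed for low-volume lookups; a 404 means the DOI
    # has no tracked attention data.)
    import requests

    doi = "10.1371/journal.pone.0012345"  # placeholder DOI
    resp = requests.get("https://api.altmetric.com/v1/doi/" + doi)

    if resp.status_code == 404:
        print("No attention data tracked for this DOI.")
    else:
        resp.raise_for_status()
        data = resp.json()
        print("Altmetric score:", data.get("score"))
        print("tweets:", data.get("cited_by_tweeters_count"))
        print("Facebook walls:", data.get("cited_by_fbwalls_count"))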

ImpactStory is a non-profit service that has provided altmetrics since late 2011. They provide both altmetrics and traditional citation metrics, for traditional as well as web-native scholarly products, and are designed to help researchers better share and be rewarded for their complete impacts.

Plum Analytics is a start-up providing altmetrics data to universities and libraries. They also provide usage stats and citation data, and track research outputs beyond journal articles, e.g. presentations, source code and datasets.

At this time it is unclear how the altmetrics community will develop over the next few years. It is possible that one or a few dominant commercial players will emerge, similar to the market for citations; that a non-profit organization will collect these numbers for all stakeholders; or that a more distributed system of data and service providers will develop, similar to how usage data for articles are distributed.

Challenges and Criticism

Many challenges remain before we can expect altmetrics to be more widely adopted. A big part of the challenge is the very nature of the Social Web, which is much more difficult to analyze than traditional scholarly citations. Three challenges stand out:

1. the constantly changing nature of the Social Web, including the lack of commonly used persistent identifiers;

2. self-promotion and gaming, inherent in all Social Web activities and aggravated by the difficulty of knowing who is talking;

3. the temptation to measure what can easily be measured rather than what is meaningful for scholarly impact, so that attention or self-promotion is captured instead of scholarly impact.

These challenges are less of a problem for discovery tools based on altmetrics, but are hard to solve for evaluation tools. Altmetrics is still a young discipline and the community is working hard on these and other questions, including standards, anti-gaming mechanisms, and ways to put metrics into context.

  • [1] Wikipedia Cite-o-Meter: toolserver.org/~dartar/cite-o-meter/
  • [2] GitHub: https://github.com/articlemetrics/alm/wiki/API
 