by Kris Turner
My colleagues and I are currently hip-deep in conversations about how best to measure and present the impact our faculty have on the larger academic community. This information is critical to tenure committees, libraries, the faculty themselves, and practically everybody else in law schools.
In the academic context, metrics measure the impact that a particular article, author, or institution has on other research. The data used to measure this ranges from social media impact (tweets, likes, etc.) to citations in highly respected law journals and reviews.
Many libraries focus on gathering data on citations, downloads, mentions (meaning mentions in media or blogs), and abstract views (the number of times an article or author’s page has been read but not necessarily downloaded). A number of databases, both paid and free, gather this information. The most relevant ones to the law community are SSRN and HeinOnline, though others such as ResearchGate and Google Scholar come up in the conversation as well. Westlaw, Lexis, and Bloomberg Law all track this data to a certain extent too, though not as thoroughly in my experience.
How does a librarian track down all these important stats across so many different platforms? If you had to choose one statistic to track, what would it be? Are there any tools out there to help sort out this conundrum? So many questions!
There are a handful of metrics tools out there, both commercial and free, that you can use to organize your institution’s research…but no perfect solutions. First, let’s cover a few handy definitions to help you decipher some of my metric-speak below, then I’ll discuss several metrics tools.
H-index: The Hirsch index is a numerical way of determining someone’s scholarly impact based on raw citation counts. The calculation sounds more intimidating than it is: an author has an h-index of h if h of their publications have each been cited at least h times. The end result is a number that you hope is large. The higher the number, the bigger the impact.
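For the curious (or skeptical), the h-index is simple enough to compute yourself from a list of citation counts. A minimal sketch, assuming you have already pulled per-paper citation counts from a source like Google Scholar:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    h = 0
    # Sort citation counts from highest to lowest, then walk down
    # the list until a paper's count falls below its rank.
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Five papers cited 25, 8, 5, 3, and 3 times:
# three papers have at least 3 citations, but only three have >= 4,
# so the h-index is 3.
print(h_index([25, 8, 5, 3, 3]))  # → 3
```

The hard part in practice isn’t the math; it’s getting a clean, deduplicated list of a faculty member’s papers and citation counts in the first place.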
Repository Badge, or Badge: Two of the tools below (PlumX and Altmetrics) can add a widget, or badge, to your repository pages. These give you a quick overview of an article’s impact and certainly grab a viewer’s attention when they are browsing your repository.
Author or article pages: A page in your repository (or in another database) that houses one specific article or the articles located in that database by a specific author.
Before I briefly discuss the tools, I would be remiss if I didn’t first mention Christine Anne George’s excellent and thorough post earlier this year on ORCID. In many of the tools below, ORCIDs are crucial to locating unique articles by specific authors. They are growing in usefulness and popularity, and using them to help you collect metrics is a smart move.
Altmetrics: Probably the best-known metrics tool, Altmetrics measures scholarly impact from an article level all the way up to an institutional one. On an article’s page, you can see its impact in the metric doughnut (with different colors for different sources of attention), which is summarized as a single attention score. Altmetrics can be seen on some very large journal platforms such as BioMed Central, but I have not seen it in use in many (if any) legal-centric databases. The graphic to the right is a snapshot of an Altmetric doughnut for a particularly robust article that Altmetrics uses as their example.
PlumX: Owned by Ebsco, PlumX is designed to give you an idea of how an article is doing in five distinct areas: Usage, Captures, Mentions, Social Media, and Citations. Each one is assigned its own color and is weighted by the amount of impact in that area. For example, an article that is heavily cited will have a large red (citation) ‘plumprint’ badge. PlumX pulls from a variety of free databases and is beginning to work with more commercial databases as well. No law school repositories currently use PlumX, but the University of Pittsburgh has implemented it.
Impactstory: Impactstory is aimed mostly at science researchers and faculty, but its design and layout are still interesting. Impactstory positions itself as a better “new CV”: with one glance, you can review someone’s publications, social media activity, impact, and more. Instead of a badge like PlumX or Altmetrics, Impactstory gives each faculty member a page where their articles are flagged as ‘highly cited’ or ‘cited,’ depending on the number of citations; hover over the ‘cited’ icon and you get the exact count. Here is an example of Impactstory, from their webpage.
Scholarometer: A free web extension from Indiana University, Scholarometer gives budget-conscious librarians a way around the more costly tools listed above. It pulls citation data from Google Scholar and then computes an H-index score from those metrics. Scholarometer produces some useful information, but I found that it worked only part of the time. It often produced duplicates, and if your faculty member has a common name…well, expect plenty of false positives. Below is a snapshot of Scholarometer’s web extension and some results, courtesy of Pitt’s iSchool.
Author pages on SCOPUS/HeinOnline/SSRN/BePress/Google Scholar/Etc.: Maybe just using one database that tracks this data is good enough? There are plenty to choose from that give you pretty solid data. HeinOnline recently introduced author pages that allow authors to see their citations and views and manage their profiles. Google Scholar offers a similar option for researchers. Of course, many are familiar with SSRN and BePress repositories, both of which give very valuable data to the law school community.
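If you do end up pulling stats by hand from several of these platforms, combining them is mostly bookkeeping. A minimal sketch, where the article names, metric names, and the `merge_stats` helper are all hypothetical (each platform exports its data differently, and the same citation may be counted by more than one service, so the sketch keeps each platform’s counts side by side rather than summing them):

```python
from collections import defaultdict

# Hypothetical exports from two platforms, keyed by article title.
ssrn = {"Article A": {"downloads": 120, "citations": 4}}
hein = {"Article A": {"citations": 9}, "Article B": {"citations": 2}}

def merge_stats(*sources):
    """Combine per-article metrics from several platforms into one
    table, prefixing each metric with the platform it came from."""
    merged = defaultdict(dict)
    for name, source in sources:
        for article, stats in source.items():
            for metric, value in stats.items():
                merged[article][f"{name}_{metric}"] = value
    return dict(merged)

combined = merge_stats(("ssrn", ssrn), ("hein", hein))
# combined["Article A"] now holds ssrn_downloads, ssrn_citations,
# and hein_citations in one place.
```

The real-world headache is matching articles across platforms when titles differ slightly, which is exactly where identifiers like ORCID (mentioned above) earn their keep.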
The trick is gathering all that data in one spot. While there is no magic potion that will be a perfect match for your requirements (yet), the options out there are well worth exploring. If you use or have heard of a tool I didn’t list, please let me know. Good luck on your metrics hunt!