Over the past few days I've set out to find a number of article-level metrics (ALMs) for a batch of articles. To keep things manageable, I focused on:
- number of citations per article
- number of views/downloads/reads
- geographic coverage (where in the world the people interested in that article were)
Initially, my thought was: 'How hard can it be?' Well... about that...
The first thing I found out is that there are multiple tools for these metrics, some affiliated with (or owned by) publishing houses. I found:
- Mendeley - based on Scopus. Gives you readers over time, citations, and reader types
- Scopus itself - gives you detailed citations. If you have a proper account, you can get a detailed analysis
- PlumX - aggregates data from Scopus and Mendeley, and can give more insights
- Dimensions.ai - gives you citations
- Altmetric - gives more data, aggregated from Mendeley and Dimensions
- ResearchGate - gives citations, references, reads
- Google Scholar - a last resort for information about obscure papers (e.g. old ones, or ones published in books)
In principle, only a few sites can be considered primary sources: Scopus, ResearchGate, and Google Scholar appear to have their own data. Tools like Mendeley, PlumX, Dimensions, and Altmetric do some very clever mashups of those sources to obtain their metrics. Quite a few of them provide APIs too (although you usually need to be a social-science researcher to get access).
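To give a feel for what querying one of these APIs looks like, here is a minimal sketch around Altmetric's free per-DOI endpoint. The URL format is the documented one, but the sample payload and its field names are illustrative assumptions, not an exact schema, and the parsing function is my own invention:

```python
import json

# Hypothetical sample of what the metrics endpoint might return for one
# DOI -- the field names here are illustrative, not an exact schema.
SAMPLE_RESPONSE = json.dumps({
    "doi": "10.1000/example.doi",
    "cited_by_posts_count": 17,
    "readers_count": 142,
})


def altmetric_url(doi: str) -> str:
    """Build the request URL for Altmetric's free per-DOI endpoint."""
    return f"https://api.altmetric.com/v1/doi/{doi}"


def parse_metrics(payload: str) -> dict:
    """Pull out the handful of counts we care about from a JSON payload."""
    data = json.loads(payload)
    return {
        "doi": data.get("doi"),
        "social_posts": data.get("cited_by_posts_count", 0),
        "readers": data.get("readers_count", 0),
    }


# In real use you would fetch altmetric_url(doi) over HTTP; here we just
# parse the canned sample so the sketch is self-contained.
print(parse_metrics(SAMPLE_RESPONSE))
```

The same shape (build URL, fetch, pick out a few counts) applies to most of the tools above, modulo authentication.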
I also found that Altmetric, for example, measures social media mentions too, which have become quite important these days for science popularisation.
I found a lot of limitations with the tools, mainly in these areas:
- Limited data on older articles - I found it very hard to get data on old articles. It makes sense: the whole digital-metrics trend is relatively young (try to get a PDF in the late 1990s and you'd receive a scan of a paper), and only big powerhouses can afford to index citations for old papers (e.g. Scopus does).
- Profile quality - Researchers can have a profile set up on e.g. Mendeley. Some manage their profiles, some don't. I guess well-maintained profiles will help with indexing later.
- YMMV - Even though the underlying data sources are limited (e.g. Scopus for citations), different tools report different values (e.g. citation counts). Metrics also vary between articles on the same site (e.g. some have geo-coverage, some don't).
Things could be better. Given this is a relatively new way of measuring research impact, I expect results to improve as more people use these tools (I think the likes of Altmetric run background tasks to re-index things they didn't get right the first time).
Social signals are coming in strong. New articles get mentions on social media, and this weighs quite heavily in their measured impact.
I'm waiting for an h-index-like value for a single article, one that takes all these factors (or their absence) into account.
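To make that wish concrete, here is a toy sketch of the kind of single "article score" I have in mind: a weighted combination of citations, reads, and social mentions. The weights and the log-scaling are entirely made up for illustration; no existing metric works this way as far as I know:

```python
import math


def article_score(citations: int, reads: int, social: int) -> float:
    """Toy composite ALM: weighted sum of log-damped counts.

    Weights (3.0 / 1.0 / 0.5) are arbitrary illustrative choices;
    log1p damps huge counts so one viral post doesn't dominate,
    and maps a count of 0 to a contribution of 0.
    """
    return round(
        3.0 * math.log1p(citations)
        + 1.0 * math.log1p(reads)
        + 0.5 * math.log1p(social),
        2,
    )


print(article_score(30, 500, 12))
```

The interesting design question is exactly the one the tools above struggle with: what weight should a tweet carry relative to a citation, and how do you score an old article for which the social factors simply don't exist?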