
0 is the magic number: Why small numbers matter just as much as large ones when we talk about altmetrics

A lot has been written in the last couple of years about altmetrics and the scores that come with them, whether that be the Altmetric.com, ResearchGate or Kudos score, to name but a few. The tools focus on different areas, with Altmetric.com being one that tries to capture a broad range of data from scholarly and public communications. With that comes its own Altmetric.com score, which is weighted according to the platform where the mention appeared. For example, a Tweet is worth one point, a blog post five and a news article eight. With so many of these metrics, including traditional ones like the impact factor, h-index and citation count, the assumption is that the bigger the number the better. With Altmetric.com a big number may be good, but it is not wholly useful on its own, as small numbers, especially 0, can tell us a lot too.
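
To make that weighting concrete, here is a minimal sketch (in Python) of how a simple weighted sum over mentions could be computed, using the per-platform points quoted above. The function and data structure are illustrative assumptions only; Altmetric.com's real algorithm also weighs factors such as who mentioned the work and where, so this is not a reimplementation of their score.

```python
# A minimal, illustrative sketch of a weighted attention score.
# Weights are the per-platform points quoted above (Tweet = 1, blog post = 5,
# news article = 8). Altmetric.com's actual algorithm is more nuanced, also
# weighing who mentioned the work, so treat this as an approximation only.

WEIGHTS = {"tweet": 1, "blog": 5, "news": 8}

def attention_score(mentions):
    """Return a rough weighted score from counts of mentions per source type."""
    return sum(WEIGHTS.get(source, 0) * count for source, count in mentions.items())

# Two Tweets and one blog post: 2*1 + 1*5 = 7.
print(attention_score({"tweet": 2, "blog": 1}))  # 7
# No mentions at all gives the flat zero the rest of this post is about.
print(attention_score({}))                       # 0
```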

The real value in altmetrics does not come from the score, but from the fact that they measure previously ignored research outputs, such as individual papers and datasets. They also show us where these outputs are being communicated, or, in the case of an altmetric score of 0, not communicated. The score of zero is to some extent more important than 50, 100 or higher. It tells us that this research has not been shared, discussed, saved or covered in the media. In a world increasingly governed by impact, scholarly communication and the dissemination of your research, a flat zero indicates a possible need to communicate your work. Of course, detractors of such systems will point to the likes of the Kardashian Index and argue that science is not about popularity, or the ability to communicate beyond a niche group; it is about the ability to complete rigorous, quality research, i.e. research that is captured in journals, repositories and data management systems and shared at conferences. Yet when so many systems for global scholarly communication exist, why not use them?

Given that many research papers are never cited, it should follow that they will never be Tweeted, shared, blogged or saved in Mendeley. Yet as the research world increasingly uses social media to communicate with the wider world, whether that be publishers, charities, funders or the general public, the ease with which academics can communicate their work is apparent. If a researcher’s altmetric score is 0 it may seem depressing to think that no one has shared or communicated their work, but it does offer them a starting place in this new world. Unlike citations, it is an instant feedback loop; whether you want to act upon it remains your choice.

Whilst critics may be wary of gaming altmetrics scores, and rightly so, the number 0 tells us something potentially important: either no one knows your research exists and is yet to discover it, or, sadly, no one is interested in it. Obviously we cannot say this for sure, as we are only talking about active participants on the web, whether that be a discussion forum, blog, news site or social media. There are many academics not engaged on the social web who will one day, with the aid of a literature search or a conference presentation, discover your work.

At least with an altmetric score of 0 you can only go up; no one can get a negative altmetric score. So the next step is to investigate who to share your research with, and where. The problem detractors have with altmetrics is that they are concerned we are focusing just on the numbers. It is a legitimate concern: how many Tweets your paper gets is no indication that it is a good piece of research. Yet that has always been the case; a high number of citations has not always indicated high-quality research. The chances are that it is a good piece of research, but we can take nothing for granted in academia. To put it into a sporting context: in cricket, having the highest batting average or scoring the most runs has never been proof of being the best player in the team. It does give us the insight that we are looking at one of the best of the bunch, but it is merely a useful indicator. That is what altmetrics are: indicators of communication and interest at varying levels. So the concern is that funders, managers and journals might start to pay too much attention to big numbers. This in turn might cause some to game the system to increase those numbers, but that was the case before altmetrics too. Journal editors have been known to ask authors to cite papers from their own publications, and it’s not unheard of for authors to self-cite.

Whilst some of this might sound like counter-arguments to altmetrics, they are not. We do need to have discussions about what we want from altmetrics. Many academics would be lying if they claimed they were not interested in where their research is being discussed on the web. The useful by-product of altmetrics is that we have a much better idea if our research is not being discussed at all. A score of 0 may be as significant as 1000 for some academics, as it tells them that no one is talking about their research.

The bigger problem is that it has become confusing when several platforms generate their own metrics: Altmetric.com, ResearchGate and more public tools such as Klout all have their own scores. It could start to feel like the early days of eBay, before commercial companies set up profiles and generating a 100% feedback score became paramount. These days it is not so important; big feedback scores on eBay mean nothing more than that someone has sold lots of stuff, and one negative makes no difference. Whilst academics will become increasingly aware of the newer metrics, some may be shocked by the succession of zeros next to their outputs, especially when so many could be highly cited. The solution is to explain why this happens, should they wish to build on that score. The scores do not convey the quality of their work or their standing, but for those wanting to reach out, or looking for feedback on how that is going, the number 0 is a sign that the only way is up.
