If you are a researcher, or plan to become one, going social with your research is not only an ethical stance but a real must for your career. With alternative metrics on the horizon, ranking papers by how broadly they are disseminated, every tweet, like and share counts.
A couple of weeks ago, we shared some tips on how to set up a profile, follow people and build an academic network around your research on social media. But why build it in the first place? In a word: Altmetrics.
Scientists around the world are using social media to share and disseminate their research and to engage with both peers and the wider public. Many do so as a way to give something back for the public funding spent on their projects. More or less skilfully, they do their best to sound appealing and approachable. Some even manage it.
On this front, however, voluntarism may no longer be enough. In 2010, four scholars led by Jason Priem, then at the University of North Carolina at Chapel Hill, published “Altmetrics: a manifesto”, a proposal to change how scientific output should be assessed. It has been gaining momentum ever since.
In their manifesto, the signatories declare traditional bibliometrics dead and hail the new alternative metrics as the rightful king. Before they can do so, however, they need a strong case, and they offer three arguments: the limitations of traditional peer review, the rigidity of citation counting and the misuse of the Journal Impact Factor.
The idea of an individualised impact measure for every single publication is not new; it has been around for a long time. Its implementation, however, was long hindered by the practical impossibility of counting individual citations scattered across hundreds of printed journals. But the world has kept turning, and new web-based tracking technologies have turned that old utopia into a real possibility on today's social Internet.
Will Twitter be the new gold standard?
In a sense, there is no revolution at all. The idea behind Altmetrics is quite simple: why count citations only in journals when we could be counting them everywhere? This is why the most widely used alternative indices (e.g. Altmetric.com, Plum Analytics and ImpactStory) track citations across the main social networks, the blogosphere, Wikipedia and reference platforms such as Mendeley and Scopus, as well as in the traditional way.
Since the new metrics are tied to your research's presence on the network, the more you go social, the better. But sceptics warn that we may not be counting what is relevant, as reviewed in 2016 by Stacy Konkiel, Director of Research at Altmetric.com. What if you bought a dozen bots and had them disseminate your latest paper on Twitter 24/7? Would that not be a kind of spam to manipulate your score?
An uncomfortable truth arises: any metric based on counting citations, anywhere, reflects the size of the audience, not relevance. Altmetrics supporters claim that relevance (and even impact, the holy grail of metrics) lies behind the aggregate of different metrics. From this point of view, a piece of research that is discussed on social networks, appears in Wikipedia, is acknowledged in science blogs and is referenced in other papers must, taken together, be relevant and could eventually make an impact.
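To make the aggregation idea concrete, here is a toy sketch in Python. The signal names and weights are purely illustrative assumptions of mine, not the actual formula used by Altmetric.com or any other real provider; the point is only that different sources are combined into a single indicator rather than counted in isolation.

```python
# Toy sketch: combine attention signals into one indicator.
# Weights are illustrative assumptions, NOT any provider's real formula.
SIGNAL_WEIGHTS = {
    "tweets": 0.25,          # social-network mentions
    "wikipedia": 3.0,        # encyclopedic references
    "blog_posts": 5.0,       # science-blog coverage
    "paper_citations": 8.0,  # traditional citations
}

def toy_attention_score(counts: dict) -> float:
    """Weighted sum of attention counts; unrecognised signals are ignored."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * n for name, n in counts.items())

paper = {"tweets": 40, "wikipedia": 1, "blog_posts": 2, "paper_citations": 5}
print(toy_attention_score(paper))  # 0.25*40 + 3*1 + 5*2 + 8*5 = 63.0
```

Note that a scheme like this inherits the problem the sceptics raise: a bot farm inflating the `tweets` count would inflate the score too, which is why real providers invest in spotting manipulated signals.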
Make no mistake: Altmetrics are here to stay. Nature and Science, the classic references, already include Altmetric.com scores with their published papers. Needless to say, more research is needed to refine the method and make it more resistant to manipulation, more meaningful and more realistic, but in the meantime I would be sharing that latest paper across my networks, if I were you (and begging my colleagues to cite it in their papers too, just in case).