The counting house: measuring those who count. Presence of Bibliometrics, Scientometrics, Informetrics, Webometrics and Altmetrics in Google Scholar Citations, ResearcherID, ResearchGate, Mendeley & Twitter

Following in the footsteps of the model of scientific communication, which has recently gone through a metamorphosis (from the Gutenberg galaxy to the Web galaxy), a change in the model and methods of scientific evaluation is also taking place. A set of new scientific tools is now providing a variety of indicators which measure all actions and interactions among scientists in the digital space, making new aspects of scientific communication emerge. In this work we present a method for capturing the structure of an entire scientific community (the Bibliometrics, Scientometrics, Informetrics, Webometrics, and Altmetrics community) and the main agents that are part of it (scientists, documents, and sources) through the lens of Google Scholar Citations.
Additionally, we compare these author portraits to the ones offered by other profile or social platforms currently used by academics (ResearcherID, ResearchGate, Mendeley, and Twitter), in order to test their degree of use, completeness, reliability, and the validity of the information they provide. A sample of 814 authors (researchers in Bibliometrics with a public profile created in Google Scholar Citations) was subsequently searched in the other platforms, collecting the main indicators computed by each of them. The data collection was carried out in September 2015. The Spearman correlation was applied to these indicators (a total of 31), and a Principal Component Analysis was carried out in order to reveal the relationships among metrics and platforms, as well as the possible existence of metric clusters.
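
As an illustration of this type of analysis (a minimal sketch, not the study's actual scripts), the following Python snippet computes a Spearman correlation matrix and a two-component PCA over an author-by-indicator table. The file name author_indicators.csv, its column layout, and the zero-filling of missing values are assumptions made for the example.

```python
# Minimal sketch of the analysis pipeline described above, assuming a
# hypothetical file "author_indicators.csv" with one row per author and
# one numeric column per indicator (e.g. GSC citations, h-index, RG Score,
# Mendeley readers, Twitter followers, ...).
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Load the author-by-indicator matrix (file name and layout are assumptions).
indicators = pd.read_csv("author_indicators.csv", index_col="author")

# Spearman rank correlation between every pair of indicators; rank-based
# correlation is robust to the skewed distributions typical of citation
# and usage counts.
spearman_matrix = indicators.corr(method="spearman")
print(spearman_matrix.round(2))

# Principal Component Analysis on standardized indicators, looking for
# groups of metrics that load on the same components.
scaled = StandardScaler().fit_transform(indicators.fillna(0))
pca = PCA(n_components=2)
pca.fit(scaled)

print("Explained variance ratio:", pca.explained_variance_ratio_)
loadings = pd.DataFrame(
    pca.components_.T,
    index=indicators.columns,
    columns=["PC1", "PC2"],
)
print(loadings.round(2))  # indicators that load together hint at a metric cluster
```

Indicators with high loadings on the same component would point to a cluster of metrics behaving similarly across platforms, which is the kind of pattern the study looks for.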

Access the full-text

The next bibliometrics: ALMetrics (Author Level Metrics) and the multiple faces of author impact

The main goal of this work is to set out the purpose and content of a new branch of bibliometrics, which we call ALMetrics (Author-level Metrics), focused on the quantitative analysis of scientific authors’ performance through the measurement of all dimensions of their intellectual activity by means of a wide variety of metric indicators. This work specifically aims to list, define, and classify the different metrics offered today by the new information portals created to showcase the scientific activity of authors. These metrics are grouped into six sets: bibliometrics (publication and citation), usage, participation, rating, social connectivity, and composite indicators. The birth of this new bibliometric specialty is justified by the new trends that can be foreseen in scientific assessment, which transport us from an old bibliometrics (based on journal analysis and the application of the Impact Factor as its flagship indicator) towards a new bibliometrics based directly on the analysis of both documents and authors through a mix of indicators, responding not only to researchers’ desire for knowledge, but also for acknowledgement.

Access the full-text

ResearchGate como fuente de evaluación científica: desvelando sus aplicaciones bibliométricas

ResearchGate is one of the most important academic social networks, with more than 9 million users and 80 million documents. In addition to substantial academic networking tools and job listings, it provides a broad catalogue of bibliometric indicators, among them the ResearchGate Score. The main goal of this work is to reveal the principal advantages and drawbacks of these indicators, paying special attention to the aforementioned RG Score, ResearchGate’s flagship indicator. Although ResearchGate offers features and bibliometric indicators with enormous potential for obtaining complementary data on the impact of an author’s scientific and academic output, both its communication policies and some recent decisions regarding the design, calculation, and dissemination of its indicators raise serious doubts about their use for evaluative purposes. Finally, regarding the RG Score, we conclude that this indicator does not measure researchers’ prestige but rather their level of participation in the platform.

Access the full-text

Back to the past: on the shoulders of an academic search engine giant

A study released by the Google Scholar team found an apparently increasing fraction of citations to old articles from studies published in the last 24 years (1990–2013). To corroborate this finding we conducted a complementary study using a different data source (Journal Citation Reports), metric (aggregate cited half-life), time span (2003–2013), and set of categories (53 Social Science subject categories and 167 Science subject categories). Although the results obtained confirm and reinforce the previous findings, the possible causes of this phenomenon remain unclear. We finally hypothesize that the “first page results syndrome”, in conjunction with the fact that Google Scholar favours the most cited documents, suggests that the growing trend of citing old documents is partly caused by Google Scholar.
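
As a rough sketch of how such a trend check could be reproduced (under assumptions: a hypothetical file jcr_cited_half_life.csv with columns category, year, and aggregate_cited_half_life, one row per JCR subject category and year), the following Python snippet fits a linear trend to each category's aggregate cited half-life over 2003–2013 and counts how many categories show an increase.

```python
# Minimal sketch, not the authors' code: estimate, per subject category,
# the trend in aggregate cited half-life over 2003-2013.
# Assumed input: "jcr_cited_half_life.csv" with columns
# category, year, aggregate_cited_half_life.
import pandas as pd
from scipy.stats import linregress

data = pd.read_csv("jcr_cited_half_life.csv")

rows = []
for category, group in data.groupby("category"):
    group = group.sort_values("year")
    # A positive slope means the category cites, on average,
    # increasingly old literature as time goes by.
    fit = linregress(group["year"], group["aggregate_cited_half_life"])
    rows.append({"category": category, "slope": fit.slope, "p_value": fit.pvalue})

trends = pd.DataFrame(rows)
rising = (trends["slope"] > 0).sum()
print(f"{rising} of {len(trends)} categories show a rising aggregate cited half-life")
```

A predominance of positive slopes across categories would be consistent with the pattern reported above: the literature being cited is, on average, getting older.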

Access the full-text