Henk Moed, Elsevier, Amsterdam, The Netherlands
Speech at the conference Technologies Transforming Research Assessment on March 19th, 2014 in Vilnius, Lithuania.
Since research performance is increasingly regarded as a key factor in economic performance and societal welfare, research assessment has become a major issue for a wide range of stakeholders, and there is growing concern about research quality and excellence, and about transparency, accountability, comparability and competition.
The primary purpose of the lecture is to provide an introduction to the use of bibliometric indicators in research assessment, aiming to show the boundaries of the playing field and to highlight important rules of the game. It underlines the potential value of bibliometrics in consolidating academic freedom. It stresses the relevance of assessing the societal impact of research, but emphasizes at the same time that one must be cautious in the actual application of such indicators in a policy context.
The lecture shows the multi-dimensionality of the concept of research performance. It presents the notion of the multi-dimensional research assessment matrix, which was introduced in a report published in 2010 by the Expert Group on the Assessment of University-Based Research (AUBR), established by the European Commission. Anyone engaged in a research assessment process has to decide which methodology to use, which indicators to calculate, and which data to collect. To that end, one should address a series of questions whose answers determine which methodology and which types of indicators should be used. Each question relates to a particular dimension of the research assessment process.
Global university rankings have attracted strong interest from managers, researchers and the general public alike. Although such rankings are marketing tools rather than research management tools, their underlying data constitute a rich source for secondary analyses of policy-relevant issues that help test policy assumptions. The lecture aims to illustrate this.
The lecture also identifies major trends in the field, such as the emergence of bibliometrics as a “big data” science, and focuses on the creation of large, compound databases by combining different datasets. Typical examples are the integration of citation indexes with patent databases, and with “usage” data on the number of times articles are downloaded in full-text format from publication archives; the analysis of full texts to characterize the context of citations; and the combination of bibliometric indicators with statistics obtained from national surveys. Significant outcomes of studies based on such compound databases are presented, and their technical and conceptual difficulties and limitations are discussed.
Henk F. Moed has been Senior Scientific Advisor at Elsevier in Amsterdam since 1 February 2010. From 1986 he worked at the Centre for Science and Technology Studies (CWTS) in the Faculty of Social Sciences at Leiden University, first as a senior staff member and, during the last few months before his departure, as a full professor of research assessment methodologies. He obtained a Ph.D. degree in Science Studies at Leiden University in 1989.
He has published over 50 research articles and is an editor of several journals in his field. He won the Derek de Solla Price Award in 1999. In 2005 he published a monograph, Citation Analysis in Research Evaluation (Springer, 346 pp.), which is one of the very few books in the field.
He has been active in numerous research topics, including:
the creation of bibliometric databases from raw data from Thomson Scientific’s Web of Science and Elsevier’s Scopus;
analysis of inaccuracies in citation matching;
assessment of the potentialities and pitfalls of journal impact factors;
the development and application of science indicators for the measurement of research performance in the basic natural- and life sciences;
the use of bibliometric indicators as a tool to assess peer review procedures;
the development and application of performance indicators in social sciences and humanities;
studies of the effects of ‘Open Access’ upon research impact, and of patterns in the ‘usage’ (downloading) behaviour of users of electronic scientific publication warehouses;
studies of the effects of the use of bibliometric indicators upon scientific authors and journal publishers.