Reviewer Contribution Index (RCI). How it works and the benefits for the Peer Review Ecosystem

 

Reviewer Contribution Index (RCI) is a metric based on the F3-index, developed by ReviewerCredits in collaboration with the University of Valencia. It considers three objective parameters to measure the contribution of peer reviewers: review report length, review report delivery time, and alignment of the reviewer's recommendation with the editorial decision.
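As a rough illustration, the three raw parameters can be read off ordinary review metadata. The sketch below uses made-up field names and values, not ReviewerCredits' actual data model, and it deliberately stops at the raw quantities; how these are scaled into the 0-100 score is not specified here.

```python
from datetime import date

# Hypothetical review record; fields and values are illustrative only.
report_text = (
    "The manuscript is methodologically sound, but the sampling strategy "
    "in section 3 needs a clearer justification before publication."
)
invited_on = date(2024, 3, 1)
delivered_on = date(2024, 3, 15)
reviewer_recommendation = "minor revision"
editorial_decision = "minor revision"

# The three objective parameters the RCI is built on:
report_length = len(report_text.split())                 # 1. report length (words)
delivery_days = (delivered_on - invited_on).days         # 2. delivery time (days)
aligned = reviewer_recommendation == editorial_decision  # 3. alignment

print(report_length, delivery_days, aligned)
```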

How RCI works on ReviewerCredits

 

The Reviewer Contribution Index ranges from 0 to 100 (where 100 corresponds to the maximum contribution) and is calculated once a week. A reviewer for a journal that has subscribed to the RCI feature can show or hide the RCI in their public profile on our platform.

The RCI is also shown in the reviewer certificate if the reviewer makes it public.

Journals that have subscribed to the RCI feature can see their reviewers' indexes in their profile, along with each reviewer's individual strengths on the three parameters: review report length, review report delivery time, and alignment of the reviewer's recommendation with the editorial decision. The RCI feature subscription is shown in the journal certificate.

The Benefits of RCI that we see for Scientists, Editors and Journals

 

  • Scientists get concrete feedback on their peer review activity, based on the measurement of objective data
  • Editors and Journals have a reference metric for:

✓ monitoring peer reviewer performance more systematically, driven by data
✓ identifying and recognizing the most active Peer Reviewers
✓ managing reviewer pool turnover, by co-opting new members based on performance

Reviewer Contribution Index is included in the Journal Premium Plan and can be subscribed to as a standalone feature by our Free and Plus journals.

Journal Editors or Managers and Publishers interested in the Reviewer Contribution Index can ask for a demo or a quotation.

How F3-index works

The F3-index was presented by Federico Bianchi (University of Milan, Italy), Francisco Grimaldo (University of Valencia, Spain) and Flaminio Squazzoni (University of Milan, Italy) in their Open Access paper published in the Journal of Informetrics: "The F3-index. Valuing reviewers for scholarly journals".

It is the result of comparing the performance of peer reviewers assigned to the same manuscript, given that every submission differs in length, depth and required effort. The F3-index can thus change over time, regardless of the general activity of the individual peer reviewer. It is a relative, context-specific index that considers three quantifiable dimensions.


 

The choice of dimensions and parameters reflected previous attempts at quantifying reviewer performance (Casnici, Grimaldo, Gilbert, Dondio, et al., 2017; Hartonen & Alava, 2013; Laband, 1990). (…) It does not mean that other factors or dimensions are irrelevant. We have proposed a tool that could be adapted to the context-specific interests of editors and journals, as well as to the quality of available data. Furthermore, we did not aim to identify a “one-size-fits-all” methodology that could reflect context-specific journal characteristics and include varying dimensions [1].

 

The F3-index treats peer reviewers assigned to the same manuscript as players in a match, competing to deliver a pertinent, informative and timely review.

 

We adapted Keener’s method to develop a rating and ranking algorithm for reviewers of scholarly journals. We considered reviewers as participants of a tournament in which each reviewer was matched over time with others as they were assigned to evaluate the same manuscripts. Similarly to participants in a tournament, reviewers’ strength can be measured by calculating a rating based on their reviewing behaviour, by measuring it together with the behaviour of other reviewers assigned to the same manuscripts and their strength. (…) We calculate attribute measurements by standardizing raw statistics. Then, following a recommendation by Keener (1993), we transform standardized statistics by applying Laplace’s Rule of Succession (1995 [1825]) in order to avoid ‘winner-takes-all’ effects [1].
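The quoted procedure can be sketched in a few lines. This is an illustrative reconstruction under simplifying assumptions (made-up raw statistics, head-to-head "wins" derived from an aggregate z-score, and a plain power iteration for the Keener-style rating), not the published F3-index implementation:

```python
import numpy as np

# Hypothetical raw attribute statistics for four reviewers matched on the
# same manuscripts: one row per reviewer; columns stand for report length,
# timeliness, and recommendation alignment (values are made up).
raw = np.array([
    [1200.0, 0.9, 1.0],
    [ 800.0, 0.7, 0.0],
    [ 400.0, 0.4, 1.0],
    [1500.0, 1.0, 1.0],
])

# Step 1: standardize each raw statistic (z-scores per column), then take
# a simple aggregate per reviewer for illustration.
z = (raw - raw.mean(axis=0)) / raw.std(axis=0)
strength = z.mean(axis=1)

# Step 2: build a pairwise match matrix: s[i, j] = 1 when reviewer i
# outperformed reviewer j on their shared manuscript (faked here from the
# aggregate strengths).
n = len(strength)
s = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j and strength[i] > strength[j]:
            s[i, j] = 1.0

# Step 3: Laplace's Rule of Succession pulls each head-to-head proportion
# toward 1/2, so no reviewer gets a perfect score from a handful of
# matchups (the 'winner-takes-all' effect the authors mention).
a = (s + 1.0) / (s + s.T + 2.0)
np.fill_diagonal(a, 0.0)

# Step 4: Keener-style rating = dominant eigenvector of the smoothed
# matrix, found by power iteration and rescaled to 0-100 like the RCI.
r = np.ones(n) / n
for _ in range(100):
    r = a @ r
    r = r / r.sum()
rci_like = 100.0 * r / r.max()
print(np.round(rci_like, 1))
```

Because the rating comes from comparisons against co-reviewers on the same manuscripts, a reviewer's score depends on who they were matched with, which is what makes the index relative and context-specific.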

As Federico Bianchi, Francisco Grimaldo and Flaminio Squazzoni clarify in their article, the F3-index doesn’t estimate the quality of Peer Reviewers:

First, although our index is flexible and adaptable, there are intrinsic limits in quantifying certain qualitative dimensions of peer review (Cowley, 2015). For example, our index should not be used to examine the quality of peer review comprehensively, as this would require discussing what quality is and for what purpose it is relevant for. As already mentioned, the index is only useful to rank performance on quantifiable factors and identify outstanding cases, while careful attention must be paid on conceptual frameworks to inform parameter choices (Subochev, Aleskerov, & Pislyakov, 2018) and communication means to avoid any misuse either by editors or reviewers [1].

 

[1] Federico Bianchi, Francisco Grimaldo, Flaminio Squazzoni, The F3-index. Valuing reviewers for scholarly journals, Journal of Informetrics, Volume 13, Issue 1, 2019, Pages 78-86, ISSN 1751-1577. https://doi.org/10.1016/j.joi.2018.11.007 (https://www.sciencedirect.com/science/article/pii/S1751157718301275)

From F3-index to Reviewer Contribution Index (RCI)

ReviewerCredits has implemented the F3-index in collaboration with Francisco Grimaldo and Daniel Garcia Costa (members of Project PREWAIT: Advanced information tools about peer review of scientific manuscripts – University of Valencia), calling it Reviewer Contribution Index (RCI) to highlight:

  • the relevance of the voluntary activity performed by peer reviewers
  • the need to measure it, in order to properly recognize and reward it


At the moment, the calculation and assignment of the RCI covers peer reviewers of registered journals using our OJS Plugin version 2.1.
We thank the journals that have helped us in the development and testing activities.

We are open to any possible integration of the Reviewer Contribution Index with manuscript management platforms. Request a demo.