
First posted on the ORIGINal Thoughts Blog

By:
Melecia Miller, MPH
Scholarly Support and Engagement Coordinator, Origin Editorial
ORCID: 0009-0005-7747-9747

Jason Roberts, PhD
Senior Partner, Origin Editorial
LinkedIn: https://www.linkedin.com/in/jasonrobertsorigin
ORCID: 0000-0001-9054-4180

Take Home Points:

  • Although willing and able peer reviewers are increasingly hard to find, journals can engage with the reviewers that they already have by offering rewards for participating in reviewer training programs.
  • The best-resourced journals are likely to take the lead in developing training initiatives.
  • This blog post presents a brief summary of publicly advertised reviewer training programs, the majority of which operate under a mentor-mentee model and are designed for early-career researchers.
  • Editorial offices, journals, and societies can use the lists presented here to help develop or refine their own reviewer training programs.

In this second of three posts on the theme of reviewer training as a form of engagement, intended both to increase reviewer invitation acceptance rates and to elevate review standards, we look at a variety of training programs that have already been implemented. This endeavor was not a methodologically driven academic exercise. Instead, the purpose of this post is simply to collate some potentially inspiring programs that journals and societies may consider emulating. The final post in this series will reflect upon the issues associated with implementing a peer reviewer training program.

As many of the training programs listed in this blog post note, willing and able peer reviewers are increasingly hard to find. As posited in the initial post on training as a form of reviewer engagement, there is a sense that offering a tangible educational benefit, especially one that involves direct interaction with trainees, such as a mentorship program, can build loyalty while simultaneously inculcating the values of effective peer review. After reviewing multiple peer reviewer training programs, it is fair to say this concept is not especially radical or original. For instance, while seeking a course provider for its peer reviewer training course, the European Geosciences Union explicitly linked training to boosting the number of qualified reviewers.

As journals express frustration over the difficulties of securing reviewers, and as publishers and industry service providers scramble to find better, algorithmically driven ways to identify reviewers, it is fair to say many journals do not do a good job of connecting and engaging with the reviewers they already have. An annual shout-out is always good, and should be done, but probably only marginally fosters goodwill or boosts a sense of collaboration and brand loyalty between a journal and its constituents.

Naturally, there is a tendency for the best-resourced journals to take the lead in developing training initiatives. It is also likely, in a good number of cases, that they are the journals least in need of remedial efforts. Smaller or newer journals will be confronted with either a lack of resources to develop training or no obvious community to call upon and mold into a pool of engaged potential reviewers, especially if those journals are not backed by a supporting society. Nevertheless, that does not mean such journals should exclude themselves from training efforts. As some of the examples presented in this post illustrate, there are multiple ways to train reviewers. Many present similarly themed training materials that can be adapted and redeployed in a variety of journal contexts. Some organizations, such as publishers and industry service providers, offer training that could be linked to or co-opted into some sort of supporting arrangement. In other cases, societies support their journals by providing training programs as a member benefit rather than leaving the journal to fend for itself. Furthermore, there is nothing stopping journals from banding together and jointly offering resources or training programs. In short, the wheel does not need to be reinvented journal by journal.

One problem in devising any manner of training program, however, is that little to no research has been undertaken to determine collective reviewer core competencies, making this systematic review preprint by Willis et al. a rare source of information. Furthermore, there is no definitive evidence that peer review training works, though, of course, there is a vast general corpus of literature on the effectiveness of training, especially with regard to mentorship, beyond the narrow confines of peer review training. That reviewers need training at all has still never been scientifically confirmed, though, anecdotally, most journal editors would agree that being a subject expert does not make a researcher a skilled peer reviewer. Indeed, though researchers often learn by doing and reflecting upon what they learned, what is really needed is a structured training program that provides feedback and, to some degree, modeling of good practice. It is also likely impossible to train a general subject matter expert, such as a clinician, to also be a strong methodological reviewer. This may suggest any training program will work best if it trains reviewers to be skilled in some tasks and self-aware in others: to know when they are at the limits of their knowledge base and to raise the flag for specialist reviewers, such as statisticians or patient reviewers, to take over with a closer look. In a similar vein, Jigisha Patel’s opinion piece on providing specific training to review randomized controlled trials is spot on with its assertion that specialized training, backed with skills appraisals and revalidation, could elevate standards. One could also contend that such specialization in training could instill a clear sense of purpose in reviewers, who would understand both the task in front of them and why they were being asked in the first place.

Ultimately, as this is a blog post, our intent is to provide a shortcut to several peer reviewer training resources for editorial offices, journals, and societies that are looking to develop their own programs or refine what they are already doing. There are likely many more journals undertaking semi-structured or informal training behind closed doors. Some programs focus on highly technical matters or journal-specific needs. Others, plainly, are generic and may provide useful foundations upon which to build more specific resources. To aid comprehension, the resources have been grouped into several different categories. Category membership is not exclusive, and a given program may well be the result of collaborative efforts by several different actors.

What we now attempt to do is review some of the most well-known, readily accessible, and discoverable reviewer training programs, broken down by the type of program provider. Within each section, the programs are listed roughly in order of comprehensiveness: the top of each list prioritizes programs that provide general information, evidence-based training, a tangible reward, multiple methods of training, mentee recruitment through invited-reviewer recommendation, and participation as a mentor; each list ends with programs for physician residents and other student- or trainer-led programs.

Examples of Journal-Specific Training

The following list contains training programs that are typically resource driven, built around materials such as lecture slides, reviewer guidelines, and reading lists. Some mentor-driven programs appear to have evolved from the unofficial practice whereby assigned reviewers pass the job of completing a review on to a student, often without credit, into a more formalized program of mentor-mentee instruction carried out while reviewing a manuscript under consideration.

A detailed tabular summary of these training resources is available here; the table can be downloaded from that link.

Examples of Society-Generated Training

During our search we found some instances where the society, rather than a specific journal, appeared to offer training. The level at which this training was pitched ranged from a clear member benefit to a general offering to the entire field of study, whether members of the society or not. In some instances (e.g., the American Speech-Language-Hearing Association), the society appears to provide the content to serve its portfolio of journals.

A detailed tabular summary of these training resources is available here; the table can be downloaded from that link.

Publisher-based Training

Most of the major publishers offer training in some form or another. Much of it focuses on the core principles of good peer review that stretch across every field of study (be it art history or computational physics). The courses listed below appear to be free of charge unless otherwise stated:

Funder-led Programs

  • Canadian Institutes of Health Research – participation, by application, in a mentor-driven program supplemented with online resources; recent Canadian federal grant recipients are eligible, and each CIHR Project committee can host two mentees at a time. The program is designed with Project grant review in mind, though it has clear crossover appeal for general journal reviewer training – https://cihr-irsc.gc.ca/e/52323.html

Other Examples

Common Characteristics of Reviewer Training Programs

Some of the programs, such as those developed by The BMJ, are designed to appeal to anyone interested in improving their peer review skills. Overwhelmingly, however, the programs are designed with early career researchers in mind, often under a mentor-mentee, experiential learning model, sometimes including an evaluation of what editors or course directors have determined to be a good review. For example, The BMJ also provides examples of editor feedback on peer review comments and a sample manuscript for any interested individual to review. The courses may conclude with a self-assessment test.

The need for early career training is obvious: with no standardized training, many researchers may be disadvantaged by not having access to learning opportunities. As the American Psychological Association notes, training at the graduate or postgraduate level is not always provided. Some courses seek to formalize the existing informal practice of reviewers passing the job of reviewing off to their students by suggesting the originally assigned reviewer offer some form of mentorship while the review is completed, with the student then receiving recognition for completing the review. Many of these programs also stress that participating in a course can open up new career advancement possibilities.

While some journal programs do not mention a reward, several programs, such as the one devised by Elsevier, offer participants a certificate or letter of completion, whereas others offer recognition in the journal, integration into a journal reviewer pool, or various combinations of these rewards. Uniquely, the American Journal of Pharmaceutical Education conducts a formal graduation from its reviewer mentorship program, and the Journal for ImmunoTherapy of Cancer offers recognition at its society’s annual meeting as well as addition to its preferred reviewer database. Some societies, such as the American College of Clinical Pharmacy and the American Society for Radiation Oncology, offer continuing education credits, and the latter also offers the opportunity to become a fellow of the society. In addition to a certificate of completion, the American Chemical Society offers training in English and in Chinese and gives reviewer trainees who pass the final assessment an identifying badge in its submission system. As mentioned in the previous post, journals can choose to collaborate to provide reviewer training to researchers. The American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America journals have done just that by uniting to provide mentorship to early career researchers, using the Web of Science Academy curriculum as the basis for their program. In return, participants receive credit on their Web of Science profile, in addition to a certificate of completion and enrollment as a peer reviewer by the appropriate journal.

What is less clear is whether any of the courses were evidence-based in their design and, crucially, who trains the trainers. Some programs, such as the ACS Reviewer Lab and the course offered by The BMJ, seem to have been constructed upon published research that has studied the impact of peer review. Most programs, alternatively, seem to be led by subject thought leaders and/or known high-quality reviewers. While mentorship and experiential learning under the guidance of a mentor can be successful, there is the potential for it to accentuate bias or perpetuate entrenched hierarchical views (“listen to me/trust me, I am an expert”). The art of peer review, of course, is particularly susceptible to this, as there are no agreed-upon core competencies, no definitive training resources, and little research into what constitutes a good review. A pertinent question to ask, therefore, is who trains the trainers? The Web of Science Academy was one of the few resources that appeared to address the mentor, rather than just the mentee, with a suite of learning modules. Willis et al. recently wrote an article about a risk of bias tool they developed to assess the extent to which reviewer training programs are evidence-based. The tool assesses factors such as whether more than one stakeholder group was involved in developing the reviewer training program, whether data was gathered for training development, whether pilot testing was done, whether learning objectives were presented, and whether learning was evaluated, as well as the method of feedback used in evaluation. In the absence of a training program for trainers, such a tool may be helpful in the interim for journals seeking a training program for their reviewers.

Conclusion

Training courses are somewhat commonplace and range in scale, knowledge delivery mechanism, and intended audience. Many aim to combine didactic, self-paced learning modules with mentorship. But, as this review of the courses shows, there are no uniform standards and no validation of the accuracy or appropriateness of what is taught. Nevertheless, many of these efforts offer learning opportunities to researchers who would otherwise have none; missing out on such opportunities, and the key analytical skills they impart, may affect a researcher’s overall success. Interestingly, most of the programs tie training to both boosting the pool of available reviewers and raising collective skill levels, concepts discussed in the first part of this blog series.

Nevertheless, many journals will struggle to develop their own resources, though setting up a network of mentors may be a more attainable goal. The potential exists for collaboration. The most successful approach to collaboration may be at a cross-journal or subject matter level. Anything at a higher level than that, such as publisher-led training, though useful, may lack the specificity needed for particular domains of study because of the need to serve a general population. Perhaps the future lies in defining reviewer core competencies at a general level, establishing a code of ethics, and then adding layers of subject specificity on top of those competencies. Journals can then add a further layer on top of that if they have very specific questions they wish their reviewers to address. Collaboration, of course, can collectively elevate a field while also allowing smaller journals to participate in training programs.

Acknowledgements:
Thanks to Stephanie Kinnan, Senior Managing Editor of Publications at the American Society for Gastrointestinal Endoscopy, for her thoughts on the Gastrointestinal Endoscopy peer review training program.

Origin Editorial is now part of KnowledgeWorks Global Ltd., the industry leader in editorial, production, online hosting, and transformative services for every stage of the content lifecycle. We are your source for society services, market analysis, intelligent automation, digital delivery, and more. Email us at info@kwglobal.com.
