

Research Integrity and Peer Review

Open Access
Open Peer Review

This article has Open Peer Review reports available.


Designing integrated research integrity training: authorship, publication, and peer review

Research Integrity and Peer Review 2018 3:2

Received: 14 September 2017

Accepted: 15 February 2018

Published: 26 February 2018


Abstract

This paper describes the experience of an academic institution, the Queensland University of Technology (QUT), in developing training courses about research integrity practices in authorship, publication, and journal peer review. The importance of providing research integrity training in these areas is now widely accepted; however, how best to conduct this training remains an open question. For this reason, it is vital for institutions, journals, and peak bodies to share learnings.

We describe how we have collaborated across our institution to develop training that supports QUT’s principles and which is in line with insights from contemporary research on best practices in learning design, universal design, and faculty involvement. We also discuss how we have refined these courses iteratively over time, and consider potential mechanisms for evaluating the effectiveness of the courses more formally.


Background

The idea that institutions ought to provide researchers with formal training in research integrity is now generally accepted. How best to conduct research integrity training, however, is a contested issue [1–5].

One option is to provide research integrity training by way of “standalone” courses or units, covering a broad range of responsible research practices. The United States Office of Research Integrity (ORI) recommends that training should encompass nine core instructional areas: (1) research misconduct, (2) protection of human subjects, (3) welfare of laboratory animals, (4) conflicts of interest, (5) data management practices, (6) responsibilities of mentors and trainees, (7) collaborative research, (8) authorship and publication, and (9) peer review [6]. One approach is to cover all of these areas together in a single standalone course. Our institution has a comprehensive online course that does just that, and there are some advantages to this approach. It ensures that all researchers who have undertaken the course are familiar with their responsibilities across all of the core areas. Moreover, training in this way mirrors the structure of national codes governing responsible research practices, allows institutions and researchers to demonstrate their commitment to comprehensive research integrity training, and satisfies the requirements of certain funding bodies.

However, it is not clear that the “standalone” training method is sufficient to teach research integrity effectively or to promote an institutional culture that truly values the responsible conduct of research. Research integrity training is not a “vaccine” to be administered just once [7]. A single training experience is insufficient to instil research ethics in a lasting way [1], and “outsourcing” ethics training to a single course risks sending a negative message that “education is developed… with an eye to expedience rather than excellence” and is therefore not valued by the institution [8].

This paper describes the first 2 years of our institution’s experience attempting to move beyond the single standalone training method to an “integrated training method” wherein various topics in research integrity are addressed separately, framed in the context of researchers’ goals and QUT’s goals to promote responsible research practices, and integrated with other forms of research training. To date, we have developed two such courses. In the first, we have integrated content about a researcher’s responsibilities with respect to Authorship and Publication into a more general course about publishing academic papers. In the second, we have integrated content about research integrity into a course about how to conduct and respond to peer review. We use these integrated courses to complement, rather than to replace, our standalone online research integrity training. This hybrid approach is in line with findings from a 2017 meta-analysis that concluded that hybrid courses are the most effective method of training [2].

The idea of adopting an integrated approach to research integrity training is not new. For one thing, integrating research integrity training with other forms of research training is something that good research supervisors have always done and should continue to do. As Faden et al. put it, “No type of research ethics training will be more effective, ultimately, than mentoring… watching senior faculty grapple with a question of research ethics, as it emerges, in real time, in the lab, the field, or the clinic” [9, 10]. We agree. Indeed, we have a network of Research Integrity Advisors in each faculty whose role includes promoting research integrity, and our institution’s Research Students Centre offers a training course called Effective Supervisory Practices that promotes good mentorship among faculty who supervise graduate students. But not all mentors or supervisors are equal in this respect, and at any rate, the supervisory mentorship model may be insufficient, by itself, for the complexities of the contemporary postgraduate research environment wherein “postgraduate research is no longer viewed according to narrow conceptions of supervision, but as the whole environment and culture in which HDR research is undertaken” [11]. For institutions, like ours, that want to ensure all researchers have access to quality research integrity training, a broader coordinated approach is necessary to complement the day-to-day interactions between supervisors and their trainees.

In developing our training approach, we have drawn on the work of others. We have attempted to design our training in line with contemporary literature on best practices in learning design, universal design, and faculty involvement, as we will describe in this paper. We know that other institutions—locally and globally—are also working to develop effective research integrity training, and believe it is important to share methods and learnings.

Our context

Our office—the Office of Research Ethics and Integrity—was established in 2014 and provides education, guidance, support, and advice about research ethics and integrity within the Queensland University of Technology (QUT). QUT is a major Australian university in Brisbane, Queensland, that positions itself as a “university for the real world”, with some 50,000 students, a global outlook, and a strong research focus. QUT places a high value on promoting a culture of research integrity and offers a range of complementary training in this area, including an online research integrity course and workshops in human and animal ethics. Research integrity also features in other courses, notably Advanced Information Research Skills, which is mandatory for higher degree research students.

In 2016–2017, we launched the two new research integrity training courses described in this paper: Authorship and Publication and Journal Peer Review. The former was launched in April 2016 and has since been attended by 607 participants; the latter was launched in May 2017 and has been attended by 116 participants.

Crucial to the development of our training has been the collaboration across our institution. It is a joint initiative between the Office of Research Ethics and Integrity and the QUT Library Research Support Team. This is a diverse core team encompassing expertise in scholarly communication, data management, research ethics, journal editing, law, learning design, and philosophy. We have also sought the input and participation of senior researchers across all faculties of our institution.

Authorship and Publication

Authorship and Publication is a two-and-a-half-hour face-to-face course that aims to provide higher degree research students and early career researchers with a conceptual map of the world of academic publication. It covers a variety of interrelated topics in a series of lightning talks and multimedia and aims to equip researchers with tools, resources, and information to help them achieve one of their goals: publishing research.

We aim to weave a researcher’s primary ethical responsibilities in authorship and publication into the context of achieving this goal. For example, the course considers how to determine authorship, how to ensure originality and avoid plagiarism, how to declare conflicts of interest, and how to manage ongoing responsibilities for published works. These topics are interspersed with other practical research skills: choosing the right journal, managing data, academic writing tips, registering and maintaining an ORCID iD, and writing a cover letter to the editor. Some topics sit at the intersection of integrity training and other forms of research training. For example, the section about choosing the right journal involves understanding open access and open data, and these latter topics are simultaneously practical issues involving consideration of funding and visibility and ethical issues about transparency and reproducibility. QUT is a strong supporter of open access, being the first university in the world to adopt an institution-wide open-access policy, and it is something that we promote in our courses.

The structure of Authorship and Publication follows a subway diagram [12] that was carefully designed to tie all of the individual topics together into a coherent narrative (Fig. 1). The session begins in the top left corner of the map, and the agenda follows a path towards publication, crossing various lines, summarising each step of the process. The map is crucial to the training because its structure gives participants a conceptual framework to understand how all the pieces fit together and to think of their own current circumstances within the bigger picture.
Fig. 1

Handout (A3) provided to participants of Authorship and Publication

The topics are delivered using a variety of different methods. Some topics are covered by lightning talks of 3–5 min; others are covered by animated videos; others are covered by short interview clips of senior academics expressing their views and demonstrating different disciplinary perspectives. We make the multimedia content available to participants immediately following the sessions, and some course videos are openly available to anyone under a Creative Commons CC BY 4.0 Licence [13, 14]. The agenda for our most recent session (Oct 2017) is provided in Table 1.
Table 1

Authorship and Publication agenda



Topic | Method of delivery
Welcome, overview, etc. | Narrator introduction
Develop a data management plan | Lightning talk
Get an ORCID iD | Lightning talk
Agree authorship | Lightning talk
Academics discussing authorship | Video (short interview clips)
Shortlisting journals | Lightning talk
Open access 101 | Lightning talk
Writing tips | Video (short interview clips)
Originality and plagiarism | Lightning talk
Report COIs and acknowledge grants | Video (animated)
Do you still need a cover letter? | Lightning talk
Respond to peer review | Video (animated)
Review the publishing agreement | Lightning talk
Deposit manuscript at QUT ePrints | Lightning talk
What happens after publication? | Lightning talk
Promote your work | Lightning talk
Open discussion

All of these topics are covered within two-and-a-half hours, and so the course is fast paced. Indeed, it is too fast paced for participants to retain all the information about every topic. But within the context of our training goals, this is not a critical issue. The goal of these training sessions is for participants to gain a broad understanding of the whole publication process—to understand how all the complex parts fit together—and to discover resources and tools so that they can pursue individual topics further according to their various needs or interests. By providing course materials immediately after the sessions, we encourage the audience to follow the presentations rather than scramble to take contemporaneous notes. This is an appropriate training method for our participants who, as academic researchers, are quite capable of self-directed learning and of grasping complex topics quickly, but are often unaware of all the tools, resources, and support services available to help them.

In terms of research integrity, we aim to frame all of the topics covered in the context of good and responsible practices, and a number of the modules reinforce each other. In this way, we strive to make good authorship and publication practices the norm within our institution. For example, in discussing research data management, we discuss how it enables reproducibility and data sharing. Similarly, in discussing cover letters, we explain competing interests, how declaring one’s interests promotes transparency, and so on. These are the normative dimensions of publication, presented in the context of practical tools and skills. By blending these, our training aims to promote an institutional research culture in which authorship and publication are always understood through the lens of research integrity.
This is important, since as Langlais and Bent note, “Research… suggests that graduate students have very little awareness of the normative bases of research” [15]. Moreover, our goal is in line with what Antes et al. describe as “the implicit aim of ethics instruction – to foster a community of social responsibility” [4]. We think of our integrated approach as something like “research integrity training by stealth”, and we use this method to complement the more explicit research integrity training that researchers at our institution undertake as part of a standalone online course.

We have refined the courses over time in response to participants’ feedback. Originally, we offered Authorship and Publication over two sessions—“Fundamentals” and “Strategies”—the second of which was a slower-paced panel discussion that explored specific topics in greater detail. We discontinued the slower-paced Strategies session as feedback suggested that participants preferred the faster-paced Fundamentals version. We preserved some discipline-specific elements of the Strategies session in the form of videos that have been included in the faster-paced course and are available on the course website. These are discussed further below.

Throughout the training, we give participants many opportunities to ask questions and express their own views. Following each agenda item, we allow a participant to ask the presenter “one burning question”. We also have an extended question and discussion time at the end of the session.

All participants are provided with links to more information and opportunities to learn further—by watching additional videos available on the course website, by joining one of the university’s academic writing circles, or by connecting with their Faculty-based Research Integrity Advisor or Liaison Librarian. We set up information tables outside the seminar room where participants can explore these options, take home some promotional materials, and chat about any of the topics covered.

Journal Peer Review

Journal Peer Review is our second course in this series. It is a two-and-a-half-hour face-to-face course and aims to provide early career researchers with a broad understanding of the Journal Peer Review system and introductory training in conducting and responding to peer review.

Just as Authorship and Publication was structured around a subway diagram, our Journal Peer Review course is structured around a diagram that draws the individual topics together into a coherent conceptual framework. In this case, the diagram depicts a great industrial machine, composed of many parts that have been added or changed over time [16] (Fig. 2). We will discuss this machine further below. The reverse side of the handout contains a variety of “quick tips”, sourced from academics, about conducting and responding to peer review (Fig. 3).
Fig. 2

Handout (A3) provided to participants of Journal Peer Review (front)

Fig. 3

Handout (A3) provided to participants of Journal Peer Review (back)

As with the previous course, we intertwine the principles of research integrity throughout the training, including fairness, competence, transparency, and confidentiality.

Specific research integrity issues also arise in understanding different forms of peer review. For example, the course covers how to encounter and respond to signed and anonymous/unsigned models of post-publication peer review. In another section of the course, participants are asked to consider and discuss three case studies involving peer review from the Committee on Publication Ethics (COPE).

The agenda for our most recent Journal Peer Review session (Oct 2017) is provided in Table 2.
Table 2

Journal Peer Review agenda



Topic | Method of delivery
Welcome, overview, etc. | Narrator introduction
Introduction to peer review | Lightning talk
History of Journal Peer Review | Video (animated)
Peer review as quality control: informal and formal
Being an editor | Presentation (guest speaker)
Conducting peer review | Presentation (guest speaker)
Responding to peer review | Video (short interview clips)
Pitfalls in peer review | Presentation (case studies)
Emerging trends in peer review | Presentation (including an activity)
Open discussion

The topics are covered using a combination of live talks and multimedia. Participants are encouraged to discuss their own views, and all participants are connected with tools, resources, and links to more information.

Design of the courses

Learning design can significantly impact learning outcomes, and we have aimed to design our courses in view of contemporary research in this area.


One basic principle of learning design is that images are often more successful than blocks of text [17]: “People can learn more deeply from words and pictures than from words alone” [18]. To that end, we attempted to reinforce all major concepts with suitable imagery. We illustrated most concepts using minimalistic graphics. For example, when discussing different types of peer review, we use the graphic shown in Fig. 4 to explain “Double Blind”, “Single Blind”, and “Open” peer review. In all cases, we have tried to use imagery to reinforce, and not to distract from, the spoken words. This complementary method, according to Mayer’s cognitive theory of multimedia learning, enhances learning in terms of attention and memory and reduces extraneous cognitive processing [18].
Fig. 4

Minimalistic representation of kinds of peer review in terms of anonymity


The effectiveness of animation in training is still open to debate. There is an argument to be made that animated resources have the potential to both attract participants (through humor, novelty, etc.) and enable more effective learning, especially in the case of dynamic concepts that are not easily conveyed by a static image. However, empirical research has so far failed to provide convincing evidence for the superiority of animation over other teaching methods in terms of memory retention [19].

Nonetheless, with the jury still out, we decided to produce short animated videos [20, 21] for a number of reasons. First, short animations are undoubtedly attractive to higher degree students and early career researchers, our target audiences. Animations provoke interest, and can be used as online advertising materials to promote the courses, as we have done with some success. Second, animated clips can easily be shared online after the sessions or on social media. In this way, the courses can become blended learning experiences, transcending the face-to-face time in the workshops themselves, and provoking ongoing discussion. Third, animations have a higher capital cost but a lower operating cost than other forms of presentation. In practice, a senior professor can write and record an expert voice-over, which is then set to animation by a less expensive human resource; thereafter, the video can be used at any future session, even if the senior professor is unable to attend.

Our goal was to provide engaging representations of the key concepts and principles. In one of our animated videos, a narrator describes the history of peer review as being like a great industrial machine that has been built up and has evolved over time [20]. As the narrator recounts the history, parts of the machine are added or changed. The history begins by presenting the simple model of editorial review adopted by Philosophical Transactions in 1665 (Fig. 5); it goes on to explain various innovations and additions one by one (e.g., preprints, post-publication peer review, and mechanisms for getting credit for peer review; Fig. 6); and the animation ultimately concludes by zooming out to reveal the vast, complex system of peer review we have today—a large, ever-changing machine in perpetual motion, composed of both old parts and new (Fig. 7).
Fig. 5

Screenshot from animated history of peer review—simple system circa 1665

Fig. 6

Screenshot from animated history of peer review—credit for peer review 2013

Fig. 7

Screenshot from animated history of peer review—peer review as we know it 2017

This animated information is then reinforced by the following presentation, in which a senior academic elaborates on various elements of the animation and relates it all to the A3 handout depicting the whole industrial machine [9] (Figs. 2 and 3).

Within our animations, we have aimed to keep the text to a minimum, and almost all of the informative content is presented by the narrated voice-over. This is in keeping with Reed’s principles of modality and redundancy:

“Modality principle: Students learn better from animation and narration than from animation and on-screen text.”

“Redundancy principle: Students learn better from animation and narration than from animation, narration, and on-screen text [22].”

Universal design

We want to ensure that our courses are accessible to all researchers at our institution. To that end, we have designed these courses in view of the principles of universal design—“the design of products and environments to be usable by all people to the greatest extent possible, without the need for adaptation or specialized design” [23].

For example, our slides were produced in view of the universal design features of multiple means of representation, larger fonts, and higher contrasts.

We encountered some challenges designing for a universal audience. Following our first session of Authorship and Publication, one participant fairly complained that our animations were difficult to follow for participants with varying sensory abilities, and that they could not follow the material themselves. Our response to this dilemma was to make the content available in multiple parallel formats to suit a range of learners. We created a course book that condenses all of the animated information into static text and images, allowing participants to read rather than watch the moving screens. Participants can now choose the learning materials that are best for them. The complainant was happy with this response. Moreover, this solution has proved helpful for some other participants—notably international students—for whom the courses were initially too fast paced.

Involving senior faculty

For research integrity training to be successful, there needs to be buy-in from academics from a range of disciplines. Participants must see that research integrity is not merely an “add-on” or something promoted by administrators or management, but something deeply valued by the professors they see as mentors, and with whom they interact on a regular basis [1]. Involving faculty in research integrity training is crucial, since their expertise adds legitimacy, and they provide disciplinary context to the principles [24].

In developing our courses, we sought the involvement of senior academics from all faculties at our institution: Health, Science and Engineering, Education, Creative Industries, Law, and the QUT Business School. In particular, we involved Faculty-based Research Integrity Advisors, senior academics at our institution, whose role includes providing advice about research integrity.

For the first iteration of Authorship and Publication, we asked a number of our Research Integrity Advisors to present as part of a panel at each session. Our participant feedback surveys showed that these sessions were well received, but, as mentioned previously, they were not as well received as the faster-paced multimedia sessions. Moreover, a training review concluded that, for groups of up to 100 participants at a time, inviting three senior academics to present in person at each session, in addition to our core presentation team, gave the course a high operating cost and was not a sustainable long-term solution. We understand that these sorts of challenges—involving faculty in centralised research integrity training—are common [24]. However, we have since found one potential solution. Since the beginning of 2017, we have conducted video interviews with senior academics from different disciplines and edited them into short video clips to show as part of our courses (Fig. 8). This solution retains many of the benefits of involving senior faculty voices in the workshops and can be used, sustainably, for our ongoing training in the longer term. We have also made the full versions of the interviews available to our participants online to promote extended learning [25–31]. Participants get a small taste of the different videos in the session itself, and can then watch the longer videos about various topics online after the session, depending on their needs or interests.
Fig. 8

Sample set of interview clips with Research Integrity Advisors shown at sessions

The video interviews covered the topics of authorship, publication, peer review, and other innovations in publishing. We asked 7 senior academics a set of 13 questions each [Appendix]. We then edited the interviews together into thematic clips. The videos complement the core materials by encouraging participants to think about research integrity through the lens of real-world skills they will need as researchers—for example, communication and conflict resolution. These videos help meet our goal of embedding the principles of research integrity into research discussions more generally.

Another benefit of these videos is that academics typically present research integrity more flexibly than professional staff, who sometimes feel compelled to present the principles of policies and codes of practice to the letter. The flexible approach may well come across as more realistic in the evolving research climate [32]. We are keen not to perpetuate a common perception among researchers “that ethics and integrity is all about bureaucracy, which is an unhealthy thing for really achieving research integrity” [33]. Our videos of researchers presenting real-world anecdotes bring the principles of research integrity down to earth and help avoid the potential pitfall of promoting unrealistic bureaucratic principles.

One further benefit of having the video clips is that they can be used to represent every faculty at every session. Prior to the videos, in our 2016 sessions when three academics attended each Strategies session in person, some faculties were inevitably not represented—a fact that trainees from underrepresented faculties sometimes pointed out on our feedback surveys. The videos helped us mitigate that problem of representation.

Course development collaboration

Large institutions tend to have several professional divisions, and each of these may provide training somewhat independently of the others. For that reason, it sometimes happens that research integrity training is conducted independently of other research training initiatives within the institution. This is not ideal, since, as we have discussed, integrating research integrity training with other kinds of research training can benefit learners.

Our collaboration between the Office of Research Ethics and Integrity and the Library has benefited not only the academic participants of the workshop but also the professional staff who developed the course materials by fostering a broader and more integrated picture of research training. In short, our experience is that much is to be gained and nothing to be lost by collaborating in the development of research training programs.

Maximising participation

Neither of the courses described in this paper is compulsory. As with most non-compulsory courses, we faced issues of participation.

The first time we offered Authorship and Publication in 2016, we were encouraged to find that our maximum registration allowance of 200 registrants was filled overnight, as soon as the course was first advertised. However, only 99 of these 200 registrants actually attended the sessions, and we were quite discouraged by this rate of attrition. We have since discovered, through conversations with training providers within our institution and more broadly, that this is a common problem. We have therefore employed a number of strategies to improve the rate of attendance: an invitation to all academic staff from the Deputy Vice Chancellor (Research and Commercialisation), banner advertisements on the university’s intranet, invitations to Heads of Schools, and a very short video “teaser trailer” of the course that we included in the invitations and advertising materials. As shown in Table 3, these interventions appear to have improved the rate of attendance.
Table 3

Registration and participation for Authorship and Publication and Journal Peer Review (excluding school roadshow sessions), 2016 and 2017


# registrants | # participants | % participation
To provide context for these attendance figures, the number of enrolled higher degree research (HDR) students (the main target for our sessions) at QUT was in the range of 2100–2700 at any given time between 2016 and 2017.
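The participation percentages reported in Table 3 follow from straightforward arithmetic on registrant and attendee counts. As a minimal sketch, the snippet below computes the rate for the first offering using the figures quoted above (200 registrants, 99 attendees); the `participation_rate` helper is our illustrative name, not part of the paper.

```python
# Illustrative sketch of the participation-rate calculation behind Table 3.
# The figures are the ones quoted in the text for the first offering of
# Authorship and Publication; the function name is a hypothetical helper.
def participation_rate(registrants: int, participants: int) -> float:
    """Return attendance as a percentage of registrations."""
    return 100.0 * participants / registrants

rate = participation_rate(registrants=200, participants=99)
print(f"{rate:.1f}% participation")  # prints: 49.5% participation
```
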

Taking the course to those who need it

Because our courses are not compulsory, we are sure that some of those who most need the training do not come. To mitigate this problem, we have taken the training on “school roadshows”, where it is offered at disciplinary schools within faculties at QUT where certain research integrity issues have arisen and where there is an expectation from the Heads of Schools that researchers should attend. So far, we have had 66 participants at these sessions. These sessions have also given us a chance to customise elements of our presentation to particular audiences. For example, instances of plagiarism can be quite different between, say, fine arts, mathematics, and information technology.

Next steps

Peer-to-peer initiatives and participation

Just as important as including senior faculty as role models is encouraging peer-to-peer discussion about research integrity. To promote cultural change, researchers need to see research integrity as a core value of their own cohort. Peer-to-peer learning may be particularly important for higher degree research students [34]. This is something we are aiming to facilitate in future iterations of the course.

To date, we have provided some opportunities for interaction between peers in our courses—for example, by asking participants to discuss case studies and to vote (using sticky notes) on the emerging practices in peer review that they thought were most interesting or most important—an idea we borrowed from a voting activity by Bosman and Kramer [35]. But we believe we could do more to promote peer-to-peer learning. One option we will consider is to adopt elements of the “ResBaz” [36] and “Software Carpentry” [37] teaching styles, which are modelled on peer-to-peer skills teaching. Others have already incorporated elements of research integrity training into programs of this style—for example, some sections of the successful “Library Carpentry” course [38], and a newer offering, developed partly by QUT staff involved in the courses described here, called “The 21st Century Academic: Smart, Savvy and Social” [39].

Measuring the effectiveness of these courses

We have conducted pre- and post-session participant feedback surveys that have helped us refine our courses over time, but these do not constitute a formal evaluation of our training.

Measuring the effectiveness of training in the responsible conduct of research is an emerging field. Historically, systematic reviews and meta-analyses in this area have been limited by the fact that although a large number of studies have discussed ethics training, far fewer studies have included explicit evaluations [4]. This situation is improving; a 2017 meta-analysis by Watts et al. included “66 empirical studies consisting of 106 ethics courses” [5].

Evaluating the effectiveness of research integrity training is challenging: although it is reasonably straightforward to survey participants about the delivery of a course, it is more difficult to assess whether learning has occurred [40], and more difficult still to determine whether training has changed behaviours and improved an institution’s research culture in the longer term.

Studies show the value of “conducting broad, systematic evaluation efforts in regular intervals to allow for benchmarking the effects of ethics education programs over time” [4] and the importance of employing rigorous evaluative methods [41].

We will draw on the findings of this previous research as we implement formal evaluations of our own courses.

Expanding our course offerings

The Authorship and Publication and Journal Peer Review courses are now a core part of QUT’s research integrity training, and we will continue to offer them on an ongoing basis. We are also expanding our training offerings to include research data management, a course we are currently developing in collaboration with QUT Library, High Performance Computing, and all faculties across our institution.


Conclusions

Research integrity training at academic institutions is important, but there is little consensus about how it should be designed or conducted [15]. It is vital that institutions share learnings, and the experience we have outlined in this paper may serve as a starting point for institutions looking to develop, or to expand, their own research integrity training.

We have described the development of integrated research integrity courses that complement the more general research integrity training delivered through a compulsory online course. Our new courses are “integrated” in the sense that although they promote research integrity, they are not explicitly about research integrity: they are framed around researchers’ own goals in publication and peer review, and the principles of research integrity have been carefully integrated, as if by stealth, within that context. By designing research integrity training in this way, we have strived to make good practices in authorship, publication, and peer review the norm and to promote cultural change in research, consistent with the goals of our institution.

To give our courses a coherent structure, we have designed diagrams—“conceptual maps”—to tie all the various aspects of the topics together. The subway diagram for Authorship and Publication [13], and the industrial machine for Journal Peer Review [14], are available under a Creative Commons CC BY 4.0 Licence, and we hope that other institutions may use and improve upon them.

We have found that the integrated training model has broad appeal among our researchers. In the past two years, these new courses, although voluntary, have attracted 723 participants, and demand continues.

We attribute the interest in these training sessions not only to the integrated method but also to the careful design of the course materials and the involvement of academic staff in selected aspects of the training. We have developed fast-paced multimedia training courses that give a holistic overview of academic publishing and Journal Peer Review and connect participants with tools and resources to find out more about any of the topics covered, according to their needs or interests. This training method suits our researchers, who are typically quite capable of self-directed learning but sometimes lack a bird’s-eye view of the academic publishing environment and are often unaware of the tools and resources available to support them.

There is a need for further work to understand how best to develop and deliver research integrity training. We have not formally assessed the effectiveness of these courses, and we recognise the importance of employing rigorous evaluation methods to assess the outcomes of training courses over time; that will be our next step.



Acknowledgements

The following individuals comprised the core training development and presentation team: Virginia Barbour, Stephanie Bradbury, Paula Callan, Philippa Frame, Mark Hooper, Jane Jacobs, Ashley Steele, Melissa Tate, and Anne Walsh. The following individuals were the guest presenters (live or on video): Adrian Barnett, Ilana Bolingford, Bianca Capra, Kerry Carrington, Michael Collins, Stuart Cunningham, Kathryn Fairfull-Smith, Marcus Foth, Nicholas Graves, Kristiann Heesch, Katya Henry, Shane Mathews, Paula McDonald, Kathy Mills, Nerida Quatermass, Jonathan Roberts, Robert Schweitzer, Kirsten Spann, Sharron Stapleton, Jessica Stevens, Jennifer Thomas, Ellen Thompson, Kerryann Walsh, and Michael Wolfe. The following individuals provided their feedback on the draft manuscript: Ashley Steele and Melissa Tate.


Funding

Not applicable.

Availability of data and materials

Research data sharing is not applicable to this article as no research datasets were generated or analysed during the current study.

Authors’ contributions

MH drafted the manuscript, was part of the team that designed the training, designed the handouts and some videos, and analysed feedback data. VB critically revised the manuscript, was part of the team that designed the training, and presented at the sessions. AW critically revised the manuscript, was part of the team that designed the training, and presented at the sessions. JJ critically revised the manuscript and provided leadership over the development of the training. SB critically revised the manuscript, was part of the team that designed the training, and presented at the sessions. All authors read and approved the final manuscript.

Ethics approval and consent to participate

Not applicable.

Consent for publication

Written consent was obtained from all individuals whose photos appear in this manuscript.

Competing interests

Virginia Barbour is on the Editorial Board of Research Integrity and Peer Review. She was the Chair of the Committee on Publication Ethics (COPE) until May 2017 and was a Trustee of COPE until November 2017. All authors are employed by the Queensland University of Technology.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

Queensland University of Technology, Brisbane, Australia


  1. Anderson MA. Pedagogical support for responsible conduct of research training. Hastings Cent Rep. 2016;46(1):20.
  2. Todd EM, Watts LL, Mulhearn TJ, et al. A meta-analytic comparison of face-to-face and online delivery in ethics instruction: the case for a hybrid approach. Sci Eng Ethics. 2017.
  3. Hyytinen H, Löfström E. Reactively, proactively, implicitly, explicitly? Academics’ pedagogical conceptions of how to promote research ethics and integrity. J Acad Ethics. 2017.
  4. Antes AL, Murphy ST, Waples EP, Mumford MD, Brown RP, Connelly S, Devenport LD. A meta-analysis of ethics instruction effectiveness in the sciences. Ethics & Behavior. 2009;19(5):379–402.
  5. Watts LL, Medeiros KE, Mulhearn TJ, Steele LM, Connelly S, Mumford MD. Are ethics training programs improving? A meta-analytic review of past and present ethics instruction in the sciences. Ethics & Behavior. 2017;27(5):351–84.
  6. Steneck NH. ORI: introduction to the responsible conduct of research. Washington, DC: Government Printing Office; 2007.
  7. Hollander RD, Arenberg CR, the National Academy of Engineering (NAE). Ethics education and scientific and engineering research: what’s been learned? What should be done? Summary of a workshop. Washington, DC: National Academies Press; 2009.
  8. Pimple KD. Online integrity training falls short. Nature. 2013;495:449.
  9. Faden RR, Kass NE, Klag MJ, Krag SS. On the importance of research ethics and mentoring. Am J Bioeth. 2002;2(4):50–1.
  10. Plemmons DK, Kalichman MW. Mentoring for responsible research: the creation of a curriculum for faculty to teach RCR in the research environment. Sci Eng Ethics. 2017.
  11. Lee A, Boud D. Framing doctoral education as practice. In: Boud D, Lee A, editors. Changing practices of doctoral education. New York: Routledge; 2009. p. 10–26.
  12. Bradbury S, Hooper M, Barbour V, Tate M, Walsh A, Steele A, Callan P. Authorship and Publication. figshare. 2018.
  13. Hooper M, Barbour V, Steele A, Walsh A, Tate M, Bradbury S, Callan P. Authorship and Publication. figshare. 2018.
  14. Hooper M, Steele A, Tate M, Barbour V, Walsh A, Bradbury S, Callan P. Journal Peer Review. figshare. 2018.
  15. Langlais PJ, Bent BJ. Individual and organizational predictors of the ethicality of graduate students’ responses to research integrity issues. Sci Eng Ethics. 2013.
  16. Bradbury S, Callan P, Hooper M, Steele A, Tate M, Walsh A, Barbour V. Journal Peer Review. figshare. 2018.
  17. Prensky M. Digital natives, digital immigrants. On the Horizon. 2001;9(5):1–6.
  18. Mayer RE. Introduction to multimedia learning. In: Mayer RE, editor. The Cambridge handbook of multimedia learning. Cambridge: Cambridge University Press; 2014. p. 43–71.
  19. Lowe RK. Animation and learning: value for money? In: Atkinson R, Mcbeath C, Jonas-Dwyer D, Phillips R, editors. Beyond the comfort zone: proceedings of the 21st ASCILITE Conference. Perth: ASCILITE; 2004. p. 558–61.
  20. Hooper M. Journal Peer Review: a history. figshare. 2018.
  21. Hooper M. Authorship and Publication: agreeing on authorship. figshare. 2018.
  22. Reed SK. Cognitive architectures for multimedia learning. Educ Psychol. 2010;41(2):90–1.
  23. The Center for Universal Design, NC State University. Accessed 20 July 2017.
  24. Bird SJ. Involving faculty in the teaching of the responsible conduct of research. Teaching Ethics. 2012;12(2):65–75.
  25. Hooper M, Bradbury S, Barbour V, Callan P, Walsh A, Steele A, Tate M. Authorship: interviews with QUT researchers (version 1). figshare. 2018.
  26. Hooper M, Walsh A, Barbour V, Bradbury S, Steele A, Callan P. Authorship and Publication: tips for submitting papers, interviews with QUT researchers (version 1). figshare. 2018.
  27. Hooper M, Barbour V, Bradbury S, Walsh A, Tate M, Callan P, Steele A. Authorship and Publication: deciding where to publish, interviews with QUT researchers (version 2). figshare. 2018.
  28. Hooper M, Steele A, Tate M, Walsh A, Barbour V, Bradbury S, Callan P. Journal Peer Review: advice to early career researchers (version 1). figshare. 2018.
  29. Hooper M, Barbour V, Walsh A, Tate M, Steele A, Callan P, Bradbury S. Responding to peer review: interviews with QUT researchers (version 1). figshare. 2018.
  30. Hooper M, Walsh A, Tate M, Steele A, Barbour V, Bradbury S, Callan P. Journal Peer Review: starting out, interviews with QUT researchers (version 1). figshare. 2018.
  31. Hooper M, Barbour V, Steele A, Walsh A, Tate M, Bradbury S, Callan P. Journal Peer Review: thoughts about the system, interviews with QUT researchers (version 1). figshare. 2018.
  32. Zigmond MJ, Fisher. Teaching responsible conduct responsibly. J Microbiol Biol Educ. 2014;15(2):83–7.
  33. Rawnsley A. Quoted in: Epigeum Ltd, “Research Integrity” online training course, module 1.06. July 2012.
  34. Batty C, Sinclair J. Peer-to-peer learning in the higher degree by research context: a creative writing case study. New Writing. 2014;11(3):335–46.
  35. Bosman J, Kramer B. How open is your peer review? figshare. 2016. Retrieved 12 June 2017.
  36. Research Bazaar 2017. Accessed 20 July 2017.
  37. Software Carpentry: teaching basic lab skills for research computing. Accessed 20 July 2017.
  38. Library Carpentry: software skills for library professionals. Accessed 20 July 2017.
  39. Barbour V, Simons N. The 21st century academic: smart, savvy and social. Accessed 20 July 2017.
  40. McGee R. Evaluation in RCR training—are you achieving what you hope for? J Microbiol Biol Educ. 2014;15(2):117.
  41. Mumford MD, Steele L, Watts LL. Evaluating ethics education programs: a multilevel approach. Ethics & Behavior. 2015;25(1):37–60.


© The Author(s) 2018