We are preparing to launch a new online resource for research integrity, which should be available later in the summer. Aimed at our research students and their supervisors, it will complement the extensive support and guidance researchers already receive in their schools. During the consultation process I’ve spoken to a range of university staff about integrity and deepened my understanding of regulations, policies and systems across the University and its disciplines.
One of the most interesting of these conversations happened last week when I met Dr Willem Halffman from the University of Nijmegen who was on a brief research visit to Edinburgh. We talked about a wide range of topics in our short meeting, with particular focus on the circumstances which lead to misconduct. My interest in integrity is both old and new. Old, in that I’ve spent close to twenty years training and developing research students and staff, and fostering good practice has been part of this. New, in that it was only last year that I attended the UK Research Integrity Office conference and became fascinated by wider discussions which went far beyond policies and looked at the behaviours and tendencies which lead to misconduct.
One speaker, Dr Maura Hiney, spoke about these and referenced David Kornfeld’s paper on the categories of people who violate the rules of research. Kornfeld’s paper is an interesting read, so I won’t give away the headlines, but he summarises that:
These acts of research misconduct seemed to be the result of the interaction of psychological traits and/or states and the circumstances in which these individuals found themselves.
This prompted Willem to point me to a model from financial misconduct – the fraud triangle. This originated in the work of Donald Cressey (Donald R. Cressey, Other People’s Money (Montclair: Patterson Smith, 1973), p. 30), who sought to explain the circumstances under which people commit fraud. The three factors which make up the triangle – opportunity, pressure and rationalisation – are described in a simple animation by the Association of Certified Fraud Examiners. Although the examples used relate to financial fraud, it isn’t difficult to extend the model to research.
I find this model useful because it points to the role that pressure plays in misconduct, something that cannot be ignored by any institution wishing to develop a high-integrity culture. It isn’t enough to pay lip service to regulations and training whilst the pressures on researchers continue to build.
This connection between integrity and resilience is something I hope to explore further, and it has been a significant driving force behind my initial focus at Edinburgh on wellbeing and resilience for researchers. As we tailor and embed the integrity module, I’ll be looking at how we ensure that our training plays a part in minimising the pressure in the environment, as well as being clear about good practice and honest cultures.
Willem’s research has resulted in a number of publications on scientific integrity. (Whilst you are looking at his publications, The Academic Manifesto [Halffman, W. & Radder, H. (2015), ‘The Academic Manifesto’, Minerva, Vol. 53, no. 3, pp. 165–187. doi: 10.1007/s11024-015-9270-9] makes a number of other suggestions for releasing the pressure in the system!)