Interview with Peter Neyroud

This article was originally posted on the Meta-evidence website on 14 March 2018.

Peter Neyroud CBE QPM

This week we had the pleasure of speaking to Dr Peter Neyroud, CBE, QPM, about his stellar and fascinating career as a policymaker, funder, researcher, and world-leading advocate for evidence-based approaches to policing.

Dr Peter Neyroud, CBE, QPM, was the Chief Constable of Thames Valley Police and then the Chief Executive of the National Policing Improvement Agency (NPIA) in the UK until he retired in 2010. He then began a PhD at the Institute of Criminology, University of Cambridge, carrying out a major randomised field trial on police-led diversion from prosecution.

In 2010 he conducted the “Review of Police Leadership and Training”, which led to the establishment of the College of Policing in 2012.

He is currently Lecturer in Evidence-Based Policing and Deputy Director of the Police Executive Programme at the Institute of Criminology, University of Cambridge; Co-Chair of the Campbell Collaboration Crime and Justice Coordinating Group; Vice-Chair of the Internet Watch Foundation; General Editor of Policing: A Journal of Policy and Practice; and Editor of the European Police Science and Research Bulletin.

What role can systematic reviews play in helping policymakers and practitioners make evidence-informed decisions?

In my time at the National Policing Improvement Agency (NPIA) I was both a practitioner and a policymaker for policing in the UK. One of the first things I wanted to do was to focus on what we already knew about what was effective in policing. I found that systematic reviews provided the perfect means to find out what does and does not work, and I spent $1 million on policing studies, resulting in 12 reviews published with the Campbell Collaboration.

Managing the stress of the job is a major issue for police forces.

This work set the foundation from which to build a research strategy for policing in the UK. Concentrating on systematic review findings also allowed us to focus on implementing initiatives that we then knew did work, such as hot spot policing, and to work to discontinue practices that didn’t work.

A good example is stress in the police force. Managing the stress of the job is a major issue for police forces. The review NPIA funded was able to demonstrate that individual stress management interventions did not have sufficient evidence to justify their widespread use, and it went on to recommend that future implementations of these interventions evaluate their effectiveness more thoroughly.

What are the barriers to the better use of evidence in policymaking?

There are a few barriers, but for me the first is this: who really reads a full review? Policymakers certainly don’t. Researchers will ‘top and tail’ a review, reading the abstract and findings and skimming the rest. It’s important to realise that most people in the policy field don’t have PhDs, probably won’t have a research-focused master’s, and many working in crime and justice won’t ever have studied criminology. These days they frequently rely on Google to find information. Many will not know what systematic reviews are, let alone where to find them. A good meta-analysis should be the first port of call for evidence for policymakers, but we, as researchers, need to get better at making our reviews easy to find and easy to understand.

Who really reads a full review? Policymakers certainly don’t.

If I wanted someone to use my evidence, I wouldn’t hit them with a forest plot as the first piece of information. Much as I like forest plots, they aren’t widely understood. So the first thing we should see is a plain language summary. This is what you find in most other evidence portals: you get a short summary of a report before a link to the longer executive summary and then the full report. I would also like to see a halfway house too: a summary that has more meat than a plain language summary, with some details of the primary studies. This can help non-experts to understand the individual studies in the context of the review findings and make the results more concrete in the reader’s mind.

I just happened to be in the Home Secretary’s Office on the day the Campbell review on the effect of CCTV on crime was published.

Another thing that is useful to understand is that the use of evidence in policy isn’t systematic. For example, when I was a senior policy advisor to the government, questions were being asked about the effectiveness of CCTV in light of the high per capita spend on CCTV in the UK. I just happened to be in the Home Secretary’s Office on the day the Campbell review on the effect of CCTV on crime was published. I highlighted the review to the Secretary of State at the time, and it was then immediately cited in the parliamentary debate and informed future policy on CCTV use. The point is that if policymakers don’t know the research exists, they can’t use it. So we as researchers need to work harder at making evidence accessible and getting it to the people who can use it.

“Rapid reviews” are appealing to policymakers even though we know they are not as reliable as well-conducted systematic reviews.

Finally, there is a mismatch between the pace of research and the timeframe within which policymakers need the evidence. As a result, narrative reviews or “rapid reviews” are appealing to policymakers even though we know they are not as reliable as well-conducted systematic reviews. So, in part, policymakers need to know that good-quality reviews are worth waiting for, and reviewers need to get better at delivering systematic reviews on a policy-friendly timescale. Developing reliable databases of empirical work, such as the Global Policing Database, is one way to reduce the time taken to conduct systematic reviews.

Which meta-analysis has most historical significance for you?

It has to be the hot spot policing review. I worked on a study of hot spot policing early in my career, before mobile phones and handheld digital mapping, when I literally had dots stuck on paper maps to see where the hotspots were. This review is personally meaningful to me, but it was also symbolic in terms of the use of evidence in policing: the findings were clear that targeted, place-based policing works, and the approach has now been adopted all over the world.

If money and resources were unlimited, what would be the next question you would answer using a Campbell systematic review?

This program of work is aiming to tackle a hugely important, global issue through evidence synthesis.

Very recently, the Campbell Crime and Justice Group was notified that we have been awarded a substantial grant to undertake reviews on terrorism and the prevention of radicalisation. We are aiming to fund four reviews per year for five years. That is about as close to unlimited resources as you get in this area of research, and I can’t think of many other topics in crime and policing that are more socially significant than tackling terrorism and radicalisation. This program of work is aiming to tackle a hugely important, global issue through evidence synthesis.

Under what conditions do body-worn cameras work most effectively?

Another review in the pipeline that I’m really interested in is body-worn cameras in policing. I was initially sceptical about the utility of body-worn cameras. My Institute worked on one of the first studies on the topic, which was an RCT. This set the standard for work in this area, and now there are more than 20 RCTs. Unfortunately, there happens to have been one study, in Washington DC, which had some methodological flaws and found a null effect from camera deployment. Yet it was this study that was picked up in the press and has influenced a push against their use. I’m hoping that the new review will be able to provide clear guidance not only on the extent to which body-worn cameras work, but also on the conditions under which they work most effectively.

What keeps you motivated to continue to push for the better use of evidence?

There are so many interventions and policy initiatives that are driven by intuition or individual interests. While intuition is one way to generate hypotheses about what might work, it doesn’t count as evidence. Questions about the effects of interventions are important and should be answered with real evidence. There are still too many resources wasted on ineffective interventions, too many lives negatively impacted by poor policy, and too much time lost by not implementing effective interventions.

If you are really serious about evidence and understanding what works then put your money where your mouth is.

It strikes me that medicine is one area where people are serious about evidence-based practice: huge amounts of funding go into generating and using robust evidence in medicine and healthcare. So, if you are really serious about evidence and understanding what works, then put your money where your mouth is. If you really want answers to what works in policing or education or nutrition or international development or business and management, then fund systematic reviews with Campbell.

How should research teams prioritise which questions to answer through systematic reviews? What would you recommend research teams do to engage with policy and practice?

Collaborative research is better for all of us.

I think these two questions go together. Engaging the funder, or potential funder, at the earliest possible opportunity is good for both researchers and the end users of evidence. Investing in those relationships means that the research questions can be developed in collaboration: funders get answers to the questions that are important to them, and researchers gain a clearer understanding of why those questions are being asked. Researchers can help to guide the process to ensure that the questions asked are answerable in the context of a systematic review. Making question setting an interactive process gives everyone involved ownership of the science. This sense of ownership means that funders and research partners are much more receptive to the research findings and more likely to use the evidence that is produced.

Meta-evidence is a blog for interviews and tips on evidence synthesis brought to you by Campbell UK & Ireland.
