Written by Professor Gloria Laycock, UCL Jill Dando Institute of Crime Science.
There is, in case anyone missed it, a current vogue for evidence-based everything – medicine, teaching, early interventions, dealing with the elderly – and of course crime reduction. The question ‘is there any evidence that X works?’ is becoming popular amongst the (many) inspectorates across vast areas of public policy. This is a welcome move. Continuing with the same old practices, perhaps in the face of evidence that they are simply not effective, makes little sense to anyone, and this is particularly so when funds are tight.
So what can we say about what works in reducing crime? First, volume crime has fallen in almost all advanced western democracies over roughly the past 20 years. Yes, cyber crime has increased; yes, we are more aware of child abuse, sexual violence and organised crime; but increases in these offences come nowhere near compensating for the reductions in volume offending. So what worked? The answer from research seems to be that increased security delivered the reductions, and much of this was not down to police activity: in the case of property crime it was deadlocks and immobilisers on cars and better household security, and in the case of violence possibly the recession, when alcohol consumption typically falls.
This is a general point about the overall importance of security, but what about the specifics? In 2013 the Economic and Social Research Council and the College of Policing for England and Wales funded a consortium of universities in Scotland, Wales and England to carry out a major review of what works in crime reduction. It is a complex project with nine work streams. The first required the identification of all the existing systematic reviews of what works in crime reduction published in the English-language research literature up to 2013. A systematic review involves searching the literature for studies that evaluate the effect of a specific initiative on an outcome – in this case crime reduction – and then summarising the results of these studies to produce something approaching a definitive statement of effect. The project then required that the results of the identified systematic reviews be presented in an online format useful to busy practitioners.
In translating this array of evidence into a practical tool, the research team homed in on five criteria which they regarded as necessary if practitioners and policy makers were to make good use of the emerging information. These are: the effect size (by how much did the initiative reduce crime?); the mechanism(s) (how was this achieved?); the moderator(s) (in what circumstances did the initiative work best?); implementation (what are the necessary conditions for successful implementation?); and finally economics (what can be said about the costs?). These five criteria form the acronym EMMIE (effect, mechanism, moderator, implementation and economics). All appear to be necessary if practitioners are to transpose an initiative that appears to have worked in one area at one time to another area at another time.
The devil, as always, is in the detail. Much, if not most, existing research is not conducted to inform policy or practice. It may be designed to develop theory or to investigate outcomes other than crime reduction (such as attitudinal change or improvements in mental health). So there is less genuinely useful information out there to be synthesised, and what there is does not necessarily address the five criteria regarded as necessary for practitioner decision making. Furthermore, ‘evidence’ can be interpreted in a variety of ways. Researchers clearly think in terms of research-based evidence – that which is empirical, measurable, statistically reliable and replicable. Practitioners, on the other hand, tend to think in terms of experience, precedent or cost. They talk to their colleagues about what worked for them and try to reproduce it; they do not have the time to carry out a literature search on what works before doing something to deal with the day-to-day imperatives of policing.
Does this mean that we are wasting our time in trying to develop a research base on crime reduction? No, it does not, but note the change of label from evidence base to research base. We do, however, need to acknowledge that research-based evidence needs to be better integrated into the decision processes of practitioners. Senior officers, for example, need to know the research evidence and be able to take it into account when making decisions, in much the same way as a hospital consultant takes account of the medical research when deciding on an individual treatment programme. The consultant may know that on average treatment X works well for condition Y, but that in relation to patient A, who has a multitude of additional conditions and symptoms, treatment X is not appropriate – and, crucially, the consultant also knows why that might be the case.
The crime reduction toolkit, which presents the research-based evidence in relation to specific initiatives, can be found at http://whatworks.college.police.uk/toolkit/Pages/Toolkit.aspx. Details on the development of EMMIE are available in Shane D. Johnson, Nick Tilley & Kate J. Bowers (2015) ‘Introducing EMMIE: an evidence rating scale to encourage mixed-method crime prevention synthesis reviews’, Journal of Experimental Criminology, DOI 10.1007/s11292-015-9238-7.