In government, there’s a frequent disconnect between policy intent and what people experience on the ground. This is partly because our current approach to policy and service design, delivery and evaluation has “been too fragmented and not built on an understanding of the complex social systems they must work in” (pdf).
One way for us to work more effectively with that complexity is by exploring the evidence and ‘measures’ that we use to inform our decision-making and approach to innovation.
Where intentions meet complex reality
There is a bias in government towards certain kinds of data and evidence. This bias limits our understanding of what is going on for people, and what might actually make a difference to their wellbeing. As the Brookfield Institute says (pdf):
“There are ‘phantom rules’ or orthodoxies in government around what is allowable and qualifies as valid evidence that may inhibit policy professionals from innovating.”
You see this in evidence-based practice. It gives us confidence, and a sense of certainty, because it suggests we are making decisions and basing policy or investment on things that are known to work. It’s a sure bet! But in evaluating what works, ‘success’ is often defined in narrow terms.
Many of our current social issues - like inequity and child wellbeing - are compounded by a policy and service system that takes a simplistic view of them. Responding to those issues requires us to expand our understanding of what is or isn’t evidence, to question whose perspectives have been included, and to ask what kinds of evidence count.
It’s not about disregarding evidence-based policy, but rather balancing approaches like randomised controlled trials (RCTs) with evidence that has been developed and tested with people in place; evidence that recognises different kinds of experiences and engages with (rather than reduces!) complexity.
A few ideas for exploring this in your work:
- Surface and examine your relationship to different forms of evidence. Where are you confident, comfortable and capable (and where are you not)?
- Get out of old habits to understand what kinds of evidence you could or should be drawing upon (pdf).
- Challenge your team around the ‘phantom rules’ that might be holding you back from policy innovation.
- Look at approaches researchers are taking to translate evidence-based programmes into effective practice (pdf).
Are we paying attention to the right things?
We often see a gap between what we “measure” and the outcomes that make the difference. Families we talk with value things you probably value too: strong friendships, places to be without judgement, feeling safe, and seeing their culture reflected and valued.
At a high level, we know these things matter. Yet the data we report still tends to focus on things like the number of visits, sign-ups to a service, or attendance in a government programme. It reflects a service-level view of “success” that falls short of what matters to families. And worse, we find the data that providers so dutifully collect and report on often isn’t even used in meaningful ways!
We’re experimenting with developing local indicators of child wellbeing with families. We want to pay attention to what they value, and use that to shape the system that’s designed to support them. This is quite a shift from the default for us in government. Four ideas to get you started:
- Use the “What we track” (pdf) worksheet to reflect on your process for identifying ‘success criteria’ and the values that underpin them. Find those blind spots!
- Co-design (pdf) and developmental evaluation lend themselves to defining outcomes in a localised and participatory way. We expand on this in a blog on place-based approaches.
- The Developmental Evaluation Institute outlines the basics of developmental evaluation, including Jamie Gamble’s early primer on developmental evaluation (pdf).
- Mark Cabaj’s resources on evaluation and systems change are really useful for thinking about outcomes at different levels, so you can zoom in and out of an issue to see signs of change (and avoid getting bogged down in the weeds).