Feedback received, “You know you can use Survey Monkey for free right? No wonder arts companies in this country are in the red.”
To rephrase, “Why would an arts organisation spend money on evaluation when it should be spending every available dollar on producing work?”
Let’s unpack the question. The key term is AVAILABLE. If we don’t measure the impact of what we do with the dollars we receive, pretty soon we will have no dollars at all.
This has largely been the story of arts and cultural funding, not only in Australia but worldwide. While we instinctively know that arts and culture are good for us, arts organisations struggle to demonstrate their value in measurable terms. When funding is tight, it is the programs that cannot prove their value that are cut, and invariably the Arts are first on the chopping block. If we cannot measure the value of what we deliver, we cannot demonstrate to those who fund us the value of their investment.
A question that often comes up in my discussions with potential clients is: “If we do commit to a culture of evaluation, how much should we spend?”
If you google the paraphrased question from the feedback above – why spend anything on evaluation? – it brings up a very interesting research study carried out by the William and Flora Hewlett Foundation in California. Prompted by a Board member asking exactly that question in 2010, the Foundation tracked what it spent on evaluation over three years and discovered it was spending between 1 and 1.7% of program dollars. It concluded that a serious commitment to evaluation warrants an annual spend of 5-10% of program dollars, given its enormous potential for leverage. In the Foundation’s words,
“We invest a little to learn a lot, and in learning we make our grant dollars more effective and more efficient. We gain information that facilitates superior grant allocations going forward, helps us adapt current grant making, and reveals promising new directions for work. Evaluation helps our grant dollars go further and do more, and it does so in a way that we believe makes the net benefits well worth the expenditure.”
My googling also uncovered an article by the Corporation for National and Community Service (CNCS), the US agency that mobilises five million Americans to dedicate their time and resources to delivering much-needed services to their communities. CNCS concluded that between 13 and 15% of program costs should be devoted to evaluation.
Research shows that the USA, Canada and the UK have led the field in evaluation, seeking to measure and understand the outcomes of government and corporate cultural and social spending. New technologies now mean that more sophisticated evaluation can be carried out quickly, cheaply and consistently.
‘Consistent’ is the key word here, because it is certainly possible for the arts sector to use free survey tools such as Survey Monkey. While these are useful for resource-challenged organisations, the surveys they produce are mostly ad hoc, customised to the individual organisation and lacking a consistent, sector-developed metric framework. Culture Counts is a tool developed over five years in conjunction with leading academics in the field of Cultural Value to provide a common language, definitions, and a framework for placing value on the benefits that arts and culture provide. By investing a small percentage of funding in such a tool, organisations gain access to metrics that link directly to funder objectives, can benchmark globally to learn from others in the sector, and contribute to a growing database of insights that benefits the cultural sector as a whole.
For emerging unfunded organisations, for whom any investment in evaluation is prohibitive, Culture Counts is in the process of establishing a not-for-profit trust. The Foundation will support small and innovative cultural organisations to access evaluative tools, providing them with the insights and evidence needed to strengthen, grow, leverage and diversify.
I will leave you with a thought-provoking quote from a recent Nesta article examining government spend on evaluation in the UK, which concluded:
“Spending on evaluation is hard to single out, but the available evidence suggests it represents a fraction of the overall budget. The most recent figure (from 2010-2011), from the National Audit Office, indicates that across all government departments £44 million went into finding the impact of programmes. That was 0.006% of the total. For reference, a non-profit organisation such as the Big Lottery Fund recommends devoting “up to 10%” of total spending to determine whether a programme works. In the context of budget cuts, it is not a luxury to evaluate a programme, but rather the best way to ensure maximum impact.”
Main image source: http://www.bbc.com