Example Cultural Event

Example Cultural Organisation

Data and Insights - Culture Counts™

  • 3,500 Attendees
  • 280 Respondents
  • 4 Peer Reviewers
  • 7 Quality Metrics

Future Music Festival 2013
Photo by Eva Rinaldi

1.0

Evaluating an example cultural event

The organisation used Culture Counts to survey people attending the event over two weeks in April. A total of 280 members of the public and four peer assessors were surveyed to find out what they thought of the event.

Each survey contained seven ‘dimension’ questions, asking the public about their experience of the event. Peer and self assessors were asked three additional questions. These artistic quality dimensions have been developed with the arts sector to measure the impact and value of arts and cultural events.

Public Assessment

  1. Captivation

    It held my interest and attention

  2. Rigour

    It was well thought through and put together

  3. Meaning

    It moved and inspired me

  4. Relevance

    It had something to say about today's world

  5. Connection

    It helped me to feel connected to my local community

  6. Excellence

    It is amongst the best of its type I have seen

  7. Local Impact

    It’s important that it’s happening here

Peer and Self Assessment Only

  1. Innovation

    It was introduced to the audience in a new way

  2. Growth

    It could appeal to new audiences

  3. Leverage

    It could attract a variety of investors

Teatro Broadway
Photo by Flodigrip's world

2.0

Who completed the survey?

Survey respondents were asked to provide their age, gender and postcode at the start of the survey. This enables data to be matched to the wider population and responses to be filtered to understand differences in demographics. The charts show the proportion of survey responses captured for each of the age and gender demographics.

How did respondents describe their gender?

What was the age of respondents?

Insights:

The majority of respondents were female (59%) and the largest proportions were in the 30-39 age cohort (24%) and 20-29 age cohort (23%).

Performing Arts Center, San Luis Obispo
Photo by Performing Arts Center, San Luis Obispo

3.0

Who has attended before?

Respondents were asked whether they had attended an event of this art form before, and whether they had been to an event by this organisation before. This informs expectations and helps organisations to better understand the background and prior knowledge levels of their audiences.

Insights:

A large proportion of respondents had attended an event of this art form before (89%), and 70% had also attended an event by this organisation before. Almost a third of respondents were attending an event by this organisation for the first time.

Venice Carnival
Photo by Stefano Montagner

4.0

What did the public think of this cultural event?

Survey respondents moved a slider along a Likert scale to indicate how strongly they agreed or disagreed with each dimension statement. The chart contains data for all public responses, showing the average score and the percentage of people that agreed or disagreed with each of the statements.
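
As a rough sketch of how such a summary could be produced in Python, assuming each slider response is stored as a value out of 100 and using illustrative agree/disagree thresholds (the thresholds and scores below are assumptions, not the platform's actual cut-offs or data):

    # Summarise one dimension from slider responses scored out of 100.
    # The agree/disagree thresholds are assumptions for illustration only.
    def summarise_dimension(scores, agree_at=60, disagree_at=40):
        average = sum(scores) / len(scores)
        pct_agree = 100 * sum(1 for s in scores if s >= agree_at) / len(scores)
        pct_disagree = 100 * sum(1 for s in scores if s <= disagree_at) / len(scores)
        return round(average, 1), round(pct_agree), round(pct_disagree)

    # Example with invented scores: prints (75.0, 86, 14)
    print(summarise_dimension([92, 88, 74, 61, 35, 95, 80]))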

Insights:

A large majority of people agreed or strongly agreed with all seven dimensions. High average scores (76% and above) were received for all dimensions, and 99% of people agreed or strongly agreed that the event was well thought through and put together, captivating, of importance to the local area, and moving. Over 90% of respondents also agreed or strongly agreed that it was amongst the best of its type they had seen and had something to say about today’s world.

While the average scores were lower for Connection, 84% of respondents still agreed or strongly agreed that the event helped them to feel connected to people in the community.

Holland Dance Festival
Photo by Maurice

5.0

How did the Self, Peer and Public experience align?

The Culture Counts platform has a three-pronged evaluation process – with participation from self, peer and public assessors. Average scores from each group can be compared to see whether the public and peers understood the creative intentions of the artists or curators.

This chart compares the scores given by self assessors, peer assessors and the public after experiencing the event.

Insights:

Four self assessors and four peer assessors completed the evaluation. Self assessors gave the lowest post-event scores of the three groups for four dimensions – Captivation, Rigour, Meaning and Relevance – and the highest scores for three dimensions – Connection, Excellence and Local Impact.

Peer and public scores were more closely aligned for five of the seven dimensions, with strong agreement for Captivation, Rigour and Excellence. The public was more closely aligned with self assessors for Local Impact.

Sani Festival, Zanskar valley, India
Photo by sandeepachetan.com travel photography

6.0

Did this event meet self and peer expectations?

Self assessors (who can be artists, arts organisations, curators or festival organisers) and peer assessors (who are usually experts in the field) complete before-event and after-event surveys to measure both their expectations and actual experience of the event.

Four self assessors and four peer assessors completed prior and post event evaluations. These charts contain their average scores for each dimension before and after the event.

Self Assessor Scores

Peer Assessor Scores

Insights:

The event exceeded the expectations of self assessors for seven of the ten dimensions. It greatly exceeded expectations for Connection and Excellence, but fell short of expectations for Captivation, Rigour and Relevance.

The experience of the event exceeded peer expectations for six of the ten dimensions. As with self assessors, the event fell slightly short of expectations for Captivation and Rigour. In contrast to self assessors, peers found it to be more relevant than predicted but less innovative.

Festival of Colors
Photo by Thomas Hawk

7.0

Artistic Intention Realisation

Because self assessors complete a prior survey recording their expectations of the event, it is possible to assess whether they have met their goals in the eyes of the public. Culture Counts refers to the ability to achieve an organisation's goals as "Artistic Intention Realisation", or AIR for short.

Culture Counts scores Artistic Intention Realisation out of 100, based on the average difference between self expectations and the actual experience of the public. When a dimension score falls below expectations, the difference is weighted more heavily to reflect that this is a worse outcome than exceeding expectations.
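
As a rough Python sketch of this kind of scoring (the exact weighting Culture Counts applies to below-expectation dimensions is not described here, so the example simply assumes shortfalls count twice as heavily; all values are invented):

    SHORTFALL_WEIGHT = 2  # assumed weighting, for illustration only

    def air_score(self_expectations, public_scores):
        # Both arguments map dimension name -> average score out of 100.
        penalties = []
        for dimension, expected in self_expectations.items():
            diff = public_scores[dimension] - expected
            penalty = abs(diff) * (SHORTFALL_WEIGHT if diff < 0 else 1)
            penalties.append(penalty)
        return round(100 - sum(penalties) / len(penalties))

    expectations = {"Captivation": 90, "Meaning": 88, "Connection": 75}  # invented
    public = {"Captivation": 91, "Meaning": 81, "Connection": 80}        # invented
    print(air_score(expectations, public))  # prints 93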

This chart contains the average difference between self expectations and the public experience for each dimension, followed by the AIR score for this event.

AIR Score: 92

Insights:

Overall, the public experience was very close to self expectations, with the public scoring only 2% lower on average. The biggest difference between self expectations and the public experience was for Meaning (7%), indicating that the public felt that the event was slightly less moving and inspiring than predicted by self assessors.

A higher score (closer to 100) means that the organisation was very good at predicting and meeting expectations, while a lower score (closer to 0) indicates that the organisation was poor at predicting the public experience, or that they performed significantly below expectations. The overall AIR score of 92 indicates that self assessors had a very good understanding of their audiences.

Art Gallery of NSW
Photo by Bentley Smith

8.0

Does gender influence scores?

Every respondent was asked to provide their age, gender and postcode at the start of the survey. This enables scores to be filtered to understand differences in demographics.

This chart shows average scores for each of the dimensions based on gender.
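
As a minimal Python sketch of this kind of demographic filtering, assuming each response is stored as a simple record with a gender field and a score per dimension (the records below are invented):

    # Average one dimension's scores for each value of a demographic field.
    responses = [
        {"gender": "Female", "Relevance": 88},
        {"gender": "Male", "Relevance": 81},
        {"gender": "Female", "Relevance": 92},
    ]

    def average_by(records, group_field, dimension):
        groups = {}
        for record in records:
            groups.setdefault(record[group_field], []).append(record[dimension])
        return {group: sum(scores) / len(scores) for group, scores in groups.items()}

    print(average_by(responses, "gender", "Relevance"))  # {'Female': 90.0, 'Male': 81.0}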

Insights:

While there wasn’t a significant difference in scoring by gender, female respondents gave higher average scores than male respondents for all dimensions. The largest gaps between female and male respondents were for Relevance and Connection.

There were no obvious trends or differences in scoring by respondents of each age category. Interestingly, respondents that had not attended an event by the organisation before gave slightly higher average scores than prior attendees for all dimensions except Local Impact.

Art Antidote To Hate Art Serve
Photo by Elvert Barnes

9.0

How did people rate their experience overall?

At the end of the survey, respondents were asked to rate their experience overall, with a choice of five options – Excellent, Good, Average, Poor, Very Poor.

This chart shows the percentage of respondents that rated the event as Excellent, Good and Average.

Insights:

97% of respondents found the event to be better than average, with 85% having an excellent experience and 12% having a good experience. Just 1% of people thought it was average (equating to one respondent), and no-one surveyed found the experience to be poor or very poor. Four people did not answer this question.

London, Natural Geological Museum
Photo by Luc Mercelis

10.0

Would respondents recommend this organisation to a friend?

Respondents were asked whether they would recommend the organisation to a friend or colleague. Respondents could choose a number from 0 to 10 from a pulldown menu, with 0 meaning not likely at all, and 10 meaning highly likely.

These scores can be used to calculate a Net Promoter Score (NPS). NPS measures loyalty between the organisation and its audience. People giving a score of 9 or 10 are considered Promoters. Detractors are those who respond with a score of 0 to 6. Scores of 7 and 8 are Passives. NPS is calculated by subtracting the percentage of customers who are Detractors from the percentage of customers who are Promoters.
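
The calculation itself is simple; a minimal Python sketch, assuming a list of 0 to 10 scores with None for unanswered questions (the example scores are invented):

    def net_promoter_score(scores):
        answered = [s for s in scores if s is not None]
        promoters = sum(1 for s in answered if s >= 9)   # scores of 9 or 10
        detractors = sum(1 for s in answered if s <= 6)  # scores of 0 to 6
        pct_promoters = 100 * promoters / len(answered)
        pct_detractors = 100 * detractors / len(answered)
        return round(pct_promoters - pct_detractors)

    print(net_promoter_score([10, 9, 9, 8, 10, 7, 6, 10, None, 9]))  # prints 56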

This chart shows the proportion of respondents that would or would not recommend the organisation, followed by the calculated NPS.

Net Promoter Score: 66

Insights:

91% of respondents gave a score of 6 or more, indicating that they would recommend the organisation to a friend or colleague. Of these, more than half (56%) are highly likely to recommend the organisation. Just 3% were neutral or unlikely to recommend the organisation, while 6% of respondents did not answer this question.

A positive NPS (i.e., higher than zero) is generally considered good, and an NPS of +50 or more is excellent. The score of 66 indicates that audiences have developed a strong level of loyalty toward this organisation.

untitled
by dangerismycat

11.0

Changing Perceptions

Respondents were asked two additional questions:

  • Prior to this event, what three words would you have used to describe this organisation?
  • Following this event, what three words would you use to describe this organisation?

A selection of responses is contained below:

Prior

Elite, stuffy and high-brow

Not my thing

Old person productions

High quality but conventional

Talented, classical, powerful

Traditional, stale, boring

Classical, traditional, same-same

Classic quality entertainment

Had no idea

Prestigious, classic, cultured

Post

Engaging, fun, responsive

Diverse, talented, engaging

Awesome clever performances

Experimental, brave, virtuosic

Human, universal, professional

Versatile, talented, fun

Innovating, contemporary, fresh

Unbelievable, Clever, Talented

Beautiful, Moving, Awe-Inspiring

Innovative, relevant, cultural

Ballet on the Buses 2
Photo by Pete Ashton

12.0

Changing Perceptions

Two word clouds have also been created from the list of words describing the organisation before and after the event. The clouds enlarge words that were repeated more frequently in survey responses.

Prior

Post
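
The clouds are driven by simple word counts. A minimal Python sketch, assuming each response is a short free-text phrase (the two phrases below are taken from the selection in the previous section):

    from collections import Counter
    import re

    prior_responses = ["Traditional, stale, boring", "Classical, traditional, same-same"]

    # Count lower-cased words; more frequent words are drawn larger in the cloud.
    word_counts = Counter(
        word
        for response in prior_responses
        for word in re.findall(r"[a-z\-]+", response.lower())
    )
    print(word_counts.most_common(3))  # e.g. [('traditional', 2), ('stale', 1), ('boring', 1)]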

Ballet Shoes
Photo by Kryziz Bonny

13.0

Was the survey sample representative of the audience population?

The Culture Counts digital platform aims to capture survey responses via various methods at minimal marginal cost. Achieving larger samples enables organisations to be confident that the average scores and opinions of the survey group are representative of the total audience.

This chart shows the margin of error for each dimension from the sample.

Insights:

At a 95% confidence level, the margin of error for the dimensions ranged from 1.8% to 3.4%. Margins of error under 5% are generally considered to give a good representation of the opinion of the population. For example, we can be 95% confident that if we surveyed the entire audience, the average score for Captivation would fall within 1.8% of the average score given by the sample group (that is, between 89 and 93 out of 100).
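
A minimal Python sketch of this kind of calculation, assuming dimension scores are values out of 100 and using the normal approximation (z = 1.96); whether Culture Counts applies a finite population correction for the 3,500-person audience is not stated, so none is applied here, and the scores are invented:

    import statistics

    def margin_of_error(scores, z=1.96):
        std_dev = statistics.stdev(scores)         # sample standard deviation
        return z * std_dev / (len(scores) ** 0.5)  # half-width of the 95% interval

    captivation = [91, 88, 95, 84, 97, 90, 86, 93]  # invented scores
    print(f"{statistics.mean(captivation):.1f} ± {margin_of_error(captivation):.1f}")
    # prints "90.5 ± 3.1"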