The Configure page in the platform is the first stage of building your survey. This is where you input your main survey settings: Survey Name, Start and Close Dates, Survey Introduction, Custom Branding, Survey Types and Delivery Types.
The Design page in the platform is the next stage of building your survey. This is where you add and edit your survey questions and content.
A standardised statement that respondents can agree or disagree with, rated along a slider. This is how we measure the specific outcome identified, by asking the respondent to indicate their level of agreement with the Dimension Statement according to their experience of the program, event, or activity being evaluated.
Domains are the five key overarching areas of impact we measure – these are Cultural, Social, Civic, Economic and Place. Alongside these, we also investigate areas of Quality.
An evaluation refers to the grouping of a number of surveys relating to one particular program, event, or activity. You can have multiple surveys within one evaluation that each gather feedback from different areas. For example, a Public Programs evaluation can have a survey for each public program held within one calendar year. By grouping these surveys together you can view data either individually for each program, or in aggregate and explore the overall impact of your public programs for that year.
The Example Evaluation is the example folder within the platform which is shared with new clients as a resource to help guide them in their survey creation. It contains one prior, one post and one public survey which can be easily copied to create surveys with pre-determined conditions.
Outcomes are the things that change for an audience member or participant as a result of a program, event or activity. Outcomes can refer to new skills learnt, new social connections made, being inspired or feeling a sense of achievement – the intangible elements of an individual’s experience.
A number of outcome areas sit within each of the domains. For example, if we are looking at Cultural impact, a number of outcome areas within this domain could be Insight, Appreciation, or Enrichment.
Refers to the in-built sets of metrics within The Culture Counts evaluation platform. These metrics were designed to measure outcomes associated with cultural and community experiences across Cultural, Social, Civic, Economic, Environmental and Quality impact areas. These questions were developed through extensive consultation with the cultural sector, and have been internationally tested and academically validated. The use of standardised language and consistent collection methodologies enables data to be aggregated, with opportunities for sector benchmarking and big data insights.
The Invite page in the platform is where you can invite peer and self assessors to take part in your surveys.
Peer Assessors are people who possess expert knowledge in relation to your activity and who can be surveyed to gain deeper insights. Peer assessors may include staff from similar organisations, artists, academics or funders (depending on what you’re evaluating).
Required questions are questions which are made compulsory within your survey. This means that a respondent cannot skip the question or complete the survey without answering it. We highly recommend against using this feature unless absolutely necessary.
Self Assessors are respondents of a survey who have contributed to or worked on producing the activity being evaluated. Self Assessors may include artists, producers, curators, managers or board members.
The Summary page in the platform is where you can view a real-time overview of your survey, including the number of questions, a list of nominated self and peer assessors and a summary of the results so far.
Surveys refer to the individual sets of questions used to evaluate your activities, which are stored within your evaluation folders.
Survey Logic is a feature that you can use when designing your survey, which will either hide or show a particular question based on a respondent’s answer to another question. Asking targeted questions helps keep surveys short and increases response rates.
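In the platform this is configured visually, but the underlying idea can be sketched in a few lines of Python. The field names below (`show_if`, `question_id`) are illustrative only, not the platform's actual data model:

```python
def is_visible(question, answers):
    """Show a question unless it carries a show_if rule
    whose trigger answer has not been given."""
    rule = question.get("show_if")
    if rule is None:
        return True  # no logic attached: always shown
    return answers.get(rule["question_id"]) == rule["answer"]

# A follow-up question that only appears if the respondent
# answered "Yes" to an earlier attendance question.
follow_up = {
    "text": "Which workshop did you attend?",
    "show_if": {"question_id": "attended", "answer": "Yes"},
}

print(is_visible(follow_up, {"attended": "Yes"}))  # → True (shown)
print(is_visible(follow_up, {"attended": "No"}))   # → False (hidden)
```

Respondents who answer "No" never see the follow-up, which is how targeted logic keeps surveys short.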
The Reporting Dashboard is a built-in feature within Culture Counts’ platform. The reporting dashboard is where you can view and compare your results in real-time, accompanied by a range of auto-generated graphs. This is also where you can export all of your collected data from your evaluations.
We calculate sample variance to determine how confident you can be that your results accurately represent your audience.
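The platform's exact confidence calculation isn't described here, but the general relationship between variance, sample size and confidence can be sketched in Python. The figures and function names below are illustrative assumptions, not the platform's implementation:

```python
import math

def sample_variance(values):
    """Unbiased sample variance: divides by n - 1, not n."""
    n = len(values)
    mean = sum(values) / n
    return sum((x - mean) ** 2 for x in values) / (n - 1)

def standard_error(values):
    """Standard error of the sample mean.
    Lower variance or more responses -> smaller error ->
    more confidence that the sample mean reflects the true value."""
    return math.sqrt(sample_variance(values) / len(values))

# Hypothetical dimension scores on a 0-1 slider scale
scores = [0.8, 0.7, 0.9, 0.6, 0.85]
print(sample_variance(scores))
print(standard_error(scores))
```

In short: the more your respondents agree (low variance) and the more of them there are, the tighter the confidence you can place on the average score.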