
Example Place Report

Data and Insights - Culture Counts™

  • 8.8m Visits

  • 10,000+ Survey Responses

  • 8 Quality Metrics





Evaluating the place

Culture Counts was used to collect feedback from people who visited the area. The organisation wanted to measure whether they were achieving the four key objectives they had defined for the place:

  1. Connectivity: Supporting walking, cycling, and public transit access
  2. Sociability: Bringing people together through places and programming
  3. Proximity: Promoting urban efficiency through infrastructure and buildings to facilitate a critical mass of population and employment
  4. Identity: Supporting high quality urban design, protecting heritage and encouraging public events and cultural activities

Culture Counts has worked with the sector to develop specific metrics to measure perception of place. The alignment between the organisation’s strategic place objectives and the core Culture Counts dimensions can be seen in Figure 1.

Figure 1. Strategic Alignment

Objective | Description | Dimension | Statement
Connectivity | Support walking, cycling and public transit access | Mobility | I find it convenient to get around this area
Sociability | Bring people together through places and programming | Safety | I feel safe here
| | Community | I feel a sense of community here
| | Comfort | I feel comfortable here in all weather conditions
Proximity | Promote urban efficiency through infrastructure and buildings to facilitate a critical mass of population and employment | Choice | There is a wide range of goods and services here
Identity | Support high quality urban design, protect heritage and encourage public events and cultural activities | Facilities | I am satisfied with the public facilities here
| | Vibrancy | I enjoy the vibrancy and activity here
| | Authenticity | The place reflects the unique character of the area and its people

Dimensions are assessed on a Likert scale: respondents move a slider to a point that indicates how strongly they agree or disagree with the dimension statement. Figure 2 shows an example of a dimension question in the Culture Counts survey tool.

Figure 2. Dimension Question
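To make the scoring concrete, the following is a minimal sketch (not the actual Culture Counts implementation) of how slider responses could be summarised into the "average" and "agree" figures reported later. The 0-100 scale, the 60% agreement threshold, and the sample responses are all assumptions for illustration.

```python
# Illustrative sketch only (not the Culture Counts implementation).
# Assumptions: each slider response is stored as a value from 0 (strongly
# disagree) to 100 (strongly agree), and "agree" means a response above 60.
AGREE_THRESHOLD = 60

def summarise_dimension(responses):
    """Return (average score, percentage of respondents who agree)."""
    average = sum(responses) / len(responses)
    agree_pct = 100 * sum(1 for r in responses if r > AGREE_THRESHOLD) / len(responses)
    return round(average), round(agree_pct)

# Hypothetical slider responses for the "I feel safe here" statement.
safety_responses = [85, 90, 70, 40, 65, 95, 50, 80]
avg, agree = summarise_dimension(safety_responses)
print(f"Safety: {avg}% average; {agree}% agree")  # Safety: 72% average; 75% agree
```

Reporting both figures side by side, as the outcome sections below do, distinguishes overall sentiment strength (the average) from breadth of support (the share who agree).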



Visitor Demographics

Survey respondents were asked to provide their age, gender and postcode at the end of the survey. This enables data to be matched to the wider population and responses to be filtered to understand differences in demographics. Figure 3 shows the breakdown of respondents by age and gender, and Figure 4 shows the proportions of respondents who live within the state, elsewhere in Australia, and overseas.
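As a sketch of the filtering step described above, the snippet below segments hypothetical responses by age band; the record fields and values are assumptions, not the actual Culture Counts export format.

```python
# Hypothetical respondent records; the field names are assumptions, not the
# actual Culture Counts export format.
respondents = [
    {"age": 34, "gender": "Female", "postcode": "6000", "score": 80},
    {"age": 25, "gender": "Male",   "postcode": "6152", "score": 62},
    {"age": 31, "gender": "Male",   "postcode": "6000", "score": 74},
]

# Filter responses to one demographic segment (here, ages 30-39) so that
# scores can be compared between groups.
thirties = [r for r in respondents if 30 <= r["age"] <= 39]
avg_30s = sum(r["score"] for r in thirties) / len(thirties)
print(f"{len(thirties)} respondents aged 30-39, average score {avg_30s:.0f}%")
```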

Figure 3. Age and Gender

Note: While Culture Counts does provide respondents the option to describe their gender in another way, only 26 respondents selected this option across all precincts. As this is unlikely to constitute a statistically meaningful sample, it has been excluded from the graph above.


There was a broad distribution of respondents across age groups and genders. The lowest representation was among those aged 20-29, who accounted for 13% of total responses, with males aged 20-29 accounting for just 6%. The largest portion of respondents were aged between 30 and 39, accounting for one quarter of all responses (25%); the largest individual segment was males aged 30-39. These results highlight the broad range and role of the areas that the organisation is involved with.

Figure 4. Place of residence


Each of the precincts experienced very high levels of international tourism, with approximately one in eight visitors living overseas. By contrast, very few visitors were from elsewhere in Australia, accounting for just 3% of responses.



Outcome Scores

Eight standard place metrics were used to help assess how successful each precinct was at contributing to the overall placemaking strategy. Respondents were also asked how satisfied they were with each precinct. Figure 5 shows average outcome scores across all redevelopment areas, while Figures 6 and 7 show outcome and satisfaction scores for each precinct.

Figure 5. Overall outcome scores

Figure 6. Outcome scores by project area

Figure 7. Percentage of respondents who agreed they were satisfied by project area


All precincts scored well across the board, with no average scores below 60%. This is notable given that precincts are unlikely to excel across all dimension areas due to their specific purpose (e.g. a residential area could be expected to score lower for Choice than a commercial area). Highlighted success areas are Safety (73% average; 77% agree), Comfort (71% average; 74% agree), and Mobility (71% average; 77% agree), with high scores indicating that safe, welcoming, comfortable, and easily navigable precincts have been created. These high scores are likely to contribute to the similarly high satisfaction rating (73% average; 82% agree). The lowest average score was recorded for Vibrancy (61% average; 60% agree); however, this is a metric that not all precincts would be expected to excel in, particularly those still in development.



Outcome Comparison

Metrics are likely to contribute to the overall strategic objectives to varying degrees depending on a range of factors, including the context of the place (i.e. area, local demographics), purpose (i.e. commercial, residential, industrial, mixed), and other external factors. Individual metric scores can be compared against satisfaction and the average across all metrics to get a better idea of what elements are contributing most to the success of each project area. Figure 8 shows the percentage of the public who agree or strongly agree with each outcome area, the average proportion of respondents who agreed with that metric, and how they compare to satisfaction. Projects have been ranked highest to lowest based on the average proportion of respondents who agreed with each metric.
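The ranking step can be sketched as follows; the precinct names match the report, but the agreement percentages are hypothetical stand-ins, not the report's underlying data.

```python
# Hypothetical agreement percentages; the report's underlying data is not
# reproduced here.
agreement = {
    "Precinct 1": {"Safety": 78, "Mobility": 75, "Comfort": 72, "Choice": 60},
    "Precinct 2": {"Safety": 65, "Mobility": 62, "Comfort": 60, "Choice": 55},
    "Precinct 3": {"Safety": 82, "Mobility": 80, "Comfort": 79, "Choice": 70},
}

# Rank precincts highest to lowest by the average proportion of respondents
# who agreed across all metrics.
ranked = sorted(
    agreement,
    key=lambda p: sum(agreement[p].values()) / len(agreement[p]),
    reverse=True,
)
print(ranked)  # ['Precinct 3', 'Precinct 1', 'Precinct 2']
```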

Figure 8. Outcome scores by percentage agreement


Precinct 3 received the highest levels of agreement on average, particularly for Safety, Mobility, and Comfort, while Precinct 2 received the lowest. The context of these scores is important: many of those interviewed in Precinct 3 were residents of the area and therefore more likely to rate it positively, as most of the time they spend in the area is at their residence. Conversely, Precinct 2's overall score spans residential, industrial, commercial, and recreational areas.

The average level of agreement for Satisfaction correlated highly with the overall average for each area, indicating that the metrics were broadly appropriate to be considered contributors to the sense of place at each area. Perhaps most notable was the correlation between the three areas scored highly by respondents (Safety, Mobility, Comfort) and Satisfaction, which may imply that respondents consider these elements to be more important. The exception to this pattern was Precinct 3, where Choice was instead rated higher while Safety was rated lower. Given that Precinct 3 is primarily commercial, this may indicate that Choice is weighted more heavily in commercial districts.




Visitation

A WiFi analytics platform was in place so that visitation statistics could be captured in the area throughout the year. The platform picks up WiFi impressions (signals) from mobile devices searching for WiFi networks. To calculate visitation, a scaling multiplier is applied to the WiFi signals received, accounting for people who are not carrying WiFi-enabled devices or who have WiFi turned off.
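The scaling step can be sketched in a few lines; the multiplier value and signal count below are hypothetical, as the report does not state the actual figures used.

```python
# Minimal sketch of the scaling step. The multiplier value and signal count
# are hypothetical; the report does not state the actual figures used.
SCALING_MULTIPLIER = 1.25  # accounts for people without WiFi-enabled devices

def estimate_visits(wifi_signals):
    """Estimate visitation from raw WiFi impressions."""
    return round(wifi_signals * SCALING_MULTIPLIER)

print(estimate_visits(20_000))  # 25000
```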

The graph below shows annual visitation to the area captured via the WiFi platform.

Figure 9. Visitation over time

Figure 10. Visitation by day

Day | Average Visitation
Monday | 23,301
Tuesday | 23,767
Wednesday | 24,121
Thursday | 24,969
Friday | 28,727
Saturday | 28,331
Sunday | 27,144
Average | 26,766


The data collected from the WiFi analytics platform shows that the area received approximately 8.8 million visits throughout the year, with the platform recording approximately 26,700 WiFi signals per day. This significant visitation highlights the impact of the redevelopment: an area that was previously underutilised or unavailable now attracts substantial pedestrian footfall. The most highly visited days were Friday, Saturday, and Sunday, likely attributable to the majority of events occurring on these days. This could indicate an opportunity for further utilisation during the week by holding more regular events on weekdays.



How did visitor opinion change over time?

Periodic visitor surveys are carried out in the area to assess its changing nature and capture the impact of different stages of development. This enables the visitor experience in the area to be compared over time.

Delivering the survey via free public WiFi in the area helped collect over 10,000 responses throughout the year (approximately 200 responses per week), allowing detailed trends over time to be examined. The sections below show the weekly average for each outcome area, the monthly trend, and how they compare to key performance indicators (KPIs) set by the organisation.

Figure 11. Outcome scores over time
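The weekly roll-up behind a chart like Figure 11 can be sketched as below. The daily scores are hypothetical, and the 70% KPI is borrowed from the Community baseline, the only KPI value stated numerically in this report.

```python
# Hypothetical daily scores for one outcome area over one week; only the
# 70% Community baseline is stated in the report, so it is used as the KPI here.
KPI_BASELINE = 70  # percent

week_scores = [72, 68, 75, 71, 74, 77, 73]

# Roll daily scores up to a weekly average and compare it to the KPI.
weekly_average = sum(week_scores) / len(week_scores)
status = "above" if weekly_average >= KPI_BASELINE else "below"
print(f"Weekly average: {weekly_average:.1f}% ({status} the {KPI_BASELINE}% KPI)")
```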


Community

Scores for Community were high from late January/early February, before declining into Autumn and declining further into Winter. In contrast, an increase in average scores was observed during the three-week pop-up bar installation that began on the 20th of June. Scores rose late in Spring and increased swiftly leading into Summer. This pattern is a fairly good indication that these movements are seasonal, with scores trending upward in the warmer months and downward in the colder months of the year. Place activations such as the pop-up bar proved successful in increasing average scores during the colder months.

Overall, the scores remained above the KPI baseline for the majority of the year, with slight dips below the 70% baseline in the month of August.


Authenticity

Authenticity saw a slight decline leading into Autumn before rebounding in May, reaching its highest monthly average (84%) since opening. This coincides with the State Day Festival and the Contemporary Art Festival, which were held on the 6th of May and had significant outcomes relating to the State. Authenticity subsequently saw a 6% drop in August, notably its lowest point in the last year.

Overall, monthly average scores stayed above the KPI baseline throughout the year, with only a few dips below the baseline in the weekly averages, and one month in which the monthly average fell to 69%. As of January 2017, Authenticity was rated 8% above baseline, and averaged 6% above baseline across the year.


Vibrancy

Vibrancy scores remained high throughout the year. Beyond the initial peak following opening, scores remained around 75-80%, with a slight decline in Winter before increasing in Spring and Summer. The low point in Winter and rebound in Spring indicate some influence of seasonality. This is unsurprising for an outcome area assessing vibrancy and activity, which are intuitively linked to warm, comfortable weather and clear skies.

Across the year, Vibrancy consistently outperformed the KPI baseline, showing an average increase of 8% across the year and ending 7% above baseline as of January 2017.


Safety

Safety scores remained high across the year, with a slight downtrend since opening. This downtrend is primarily due to the high initial scores received in the first few months after the precinct opened. The lowest point across the year was in August 2016, when the monthly average fell to 79%. Conversely, the highest weekly score following the opening was recorded in May (89%).

Safety was assigned the highest KPI of the eight dimension questions. On average, Safety scored 2% above the KPI baseline across the annual period, with a peak difference of 7% in May and -1% at the lowest point in August.


Comfort

Comfort levels varied across the year, with highs in Spring, Autumn and Summer, and the lowest scores to date in June 2016. This corresponds with anecdotal evidence and comments complaining about the lack of rain and wind cover.

In the warmer months, scores remained well above the KPI assigned for Comfort. However, at some points in Winter, when temperature, wind, and rain were an issue, scores dipped below the KPI baseline. These scores are expected to remain high leading into Autumn, with further improvement once the walkway is completed in June, providing protection from the elements in the following winter season.


Choice

Choice has seen promising growth since opening, with only slight downward deviations in August and November 2016. Scores are expected to improve further as more shops become available leading into 2018.

This increase in positivity has pushed average scores above the KPI baseline assigned to Choice, with scores towards the end of the year showing as much as an 18% improvement.


Facilities

Facilities maintained a consistent score across the year, with its highest point in May and lowest point in June 2017. The low point may partly reflect the numerous events held throughout the month, such as the pop-up bar, which drew in large crowds and put strain on the public facilities available. Anecdotal comments collected during interview surveys in June indicated that this was particularly true later at night, as event crowds left the facilities unclean.

The annual outcome average for Facilities was equal to the KPI baseline assigned to the dimension. The largest dip below the baseline was in June 2017, which, as noted above, was largely due to the condition of event facilities and possibly outside immediate control. This score is expected to remain steady and continue to perform at the baseline rate.


Mobility

Mobility scores were steady across the year, with their highest point in May 2016 and lowest point in January 2017. Mobility scores are expected to remain below the KPI baseline as further construction occurs and while the walkway is completed leading into June of next year.

Average weekly scores rarely dipped below the KPI baseline, indicating that people found the area an improvement on the infrastructure that existed prior to the redevelopment. On the whole, people remain strongly positive about the ease of getting around Precinct 1.



How does the area compare to other places in Australia?

A broad range of organisations across Australia use the Culture Counts platform to assess the performance of places they manage, and their scores can be used as a basis for comparison. Seven organisations from across Australia were used to calculate the Australian Place Benchmark for the eight dimensions assessed, including a university campus, a city foreshore precinct, and various city centre precincts.

The chart below shows the difference between the redevelopment area and the benchmark for other places in Australia.

Figure 12. Australian Place Benchmark
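The benchmark comparison reduces to a per-dimension difference, as sketched below. The scores are hypothetical; only the direction of the result (the area sitting below the benchmark on every dimension except Choice) follows the report.

```python
# Hypothetical scores; only the direction of the result (the area below the
# benchmark on every dimension except Choice) follows the report.
benchmark = {"Safety": 80, "Mobility": 78, "Choice": 62, "Vibrancy": 72}
area = {"Safety": 77, "Mobility": 75, "Choice": 65, "Vibrancy": 70}

# Positive values mean the area outperforms the Australian Place Benchmark.
difference = {dim: area[dim] - benchmark[dim] for dim in benchmark}
print(difference)
```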


Although the redevelopment area scored slightly below the benchmark for all dimensions except Choice, these scores are expected to improve as the area continues to grow and be refined. Monitoring visitor perceptions via survey over time will enable the organisation to identify which investments and interventions have the greatest impact on visitor experience in the area.


Insights and report prepared by: | (08) 9325 6551