8.8m Visits · 10k Respondents · 8 Quality Metrics

Evaluating the place

Culture Counts was used to collect feedback from people who visited the area. The organisation wanted to measure whether they were achieving the four key objectives they had defined for the place:

  1. Connectivity: Supporting walking, cycling, and public transit access
  2. Sociability: Bringing people together through places and programming
  3. Proximity: Promoting urban efficiency through infrastructure and buildings that facilitate a critical mass of population and employment
  4. Identity: Supporting high-quality urban design, protecting heritage, and encouraging public events and cultural activities

Culture Counts has worked with the sector to develop specific metrics to measure perception of place. The alignment between the organisation's strategic place objectives and the core Culture Counts dimensions can be seen in Figure 1.

Public Assessment

Dimensions are assessed on a Likert scale: respondents move a slider to a point that indicates how strongly they agree or disagree with the dimension statement. Figure 2 shows an example of a dimension question in the Culture Counts survey tool.
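For illustration only, the sketch below shows one way slider responses of this kind could be aggregated into the average scores and percentage-agreement figures reported later in this document. The 0-100 slider scale and the agreement threshold of 60 are assumptions made for the example, not a description of the Culture Counts scoring method.

```python
# Illustrative only: aggregating Likert-slider responses for one dimension.
# Assumptions (not from the report): sliders are recorded on a 0-100 scale,
# and a response of 60 or above is counted as "agree".

def summarise_dimension(responses, agree_threshold=60):
    """Return the average score and the percentage of respondents who agree."""
    if not responses:
        return {"average": None, "agree": None}
    average = sum(responses) / len(responses)
    agree = sum(1 for r in responses if r >= agree_threshold) / len(responses)
    return {"average": round(average, 1), "agree": round(agree * 100, 1)}

# Example: ten hypothetical slider positions for the "Safety" dimension.
safety_responses = [82, 75, 68, 90, 55, 61, 73, 88, 49, 70]
print(summarise_dimension(safety_responses))
# -> {'average': 71.1, 'agree': 80.0}
```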



Visitor Demographics

Survey respondents were asked to provide their age, gender and postcode at the end of the survey. This enables data to be matched to the wider population and responses to be filtered to understand differences in demographics. Figure 3 shows the breakdown of respondents by age and gender, and Figure 4 shows the proportions of respondents who live within the state, elsewhere in Australia, and overseas.
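As a minimal sketch of how a breakdown like Figure 3 could be produced from the raw survey export, the example below tabulates hypothetical responses by age group and gender; the column names and data are illustrative, not the report's actual export format.

```python
# Illustrative only: tabulating respondents by age group and gender,
# assuming a survey export with 'age_group' and 'gender' columns.
import pandas as pd

responses = pd.DataFrame({
    "age_group": ["20-29", "30-39", "30-39", "40-49", "20-29", "30-39"],
    "gender":    ["Female", "Male", "Male", "Female", "Male", "Female"],
})

# Counts per age group and gender, expressed as a share of all respondents.
breakdown = (
    responses.groupby(["age_group", "gender"]).size()
    .unstack(fill_value=0)
)
shares = breakdown / len(responses) * 100
print(shares.round(1))
```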

Note: While Culture Counts does provide respondents with the option to describe their gender in another way, only 26 respondents selected this option across all precincts. As such, it is unlikely to represent a statistically meaningful sample and has been excluded from Figure 3.

Insights

There was a broad distribution of respondents across age groups and genders. The lowest representation was among those aged 20-29, accounting for 13% of total responses, with males aged 20-29 accounting for just 6%. The largest portion of respondents were aged 30-39, accounting for one quarter of all responses (25%), and the largest individual segment was males aged 30-39. These results highlight the broad range and role of the areas that the organisation is involved with.

Insights

Each of the precincts experienced very high levels of international tourism, with approximately one in eight visitors living overseas. By contrast, very few visitors were from elsewhere in Australia, accounting for just 3% of responses.


Outcome Scores

Eight standard place metrics were used to help assess how successful each precinct was at contributing to the overall placemaking strategy. Respondents were also asked how satisfied they were with each precinct. Figure 5 shows average outcome scores across all redevelopment areas, while Figures 6 and 7 show outcome and satisfaction scores for each precinct.


Insights

All precincts scored well across the board, with no average scores below 60%, even though precincts are unlikely to excel across every dimension given their specific purposes (e.g. a residential area could be expected to score lower for Choice than a commercial area). Highlighted success areas are Safety (73% average; 77% agree), Comfort (71% average; 74% agree), and Mobility (71% average; 77% agree), with these high scores indicating that safe, welcoming, and comfortable precincts have been created that are easy to navigate. These high scores are likely to contribute to the similarly high satisfaction rating (73% average; 82% agree). The lowest average score was recorded for Vibrancy (61% average; 60% agree); however, this is a metric that not all precincts would be expected to excel in, particularly those still in development.


Outcome Comparison

Metrics are likely to contribute to the overall strategic objectives to varying degrees, depending on a range of factors including the context of the place (e.g. area, local demographics), its purpose (e.g. commercial, residential, industrial, mixed), and other external factors. Individual metric scores can be compared against satisfaction and against the average across all metrics to get a better idea of which elements are contributing most to the success of each project area. Figure 8 shows the percentage of the public who agree or strongly agree with each outcome area, the average proportion of respondents who agreed with that metric, and how these compare to satisfaction. Projects have been ranked highest to lowest based on the average proportion of respondents who agreed with each metric.
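For illustration, the sketch below shows one way a ranking like Figure 8 could be derived: compute the proportion of respondents who agree with each metric in each precinct, average across metrics, and sort precincts by that average alongside their Satisfaction score. The precinct figures used here are hypothetical, not values from the report.

```python
# Illustrative only: ranking precincts by average agreement across metrics.
# Percentages here are hypothetical, not figures from the report.
precinct_agreement = {
    "Precinct 1": {"Safety": 78, "Mobility": 76, "Comfort": 74, "Vibrancy": 62, "Satisfaction": 83},
    "Precinct 2": {"Safety": 70, "Mobility": 72, "Comfort": 69, "Vibrancy": 58, "Satisfaction": 78},
    "Precinct 3": {"Safety": 81, "Mobility": 80, "Comfort": 79, "Vibrancy": 63, "Satisfaction": 86},
}

def average_agreement(scores):
    """Mean agreement across outcome metrics, excluding Satisfaction."""
    metrics = {k: v for k, v in scores.items() if k != "Satisfaction"}
    return sum(metrics.values()) / len(metrics)

# Rank precincts from highest to lowest average agreement.
ranked = sorted(precinct_agreement.items(),
                key=lambda item: average_agreement(item[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: average agreement {average_agreement(scores):.1f}%, "
          f"satisfaction {scores['Satisfaction']}%")
```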

Insights

Precinct 3 received the highest levels of agreement on average, particularly for Safety, Mobility, and Comfort, while Precinct 2 received the lowest. The context of these scores is important: many of those interviewed in Precinct 3 were residents of the area and therefore more likely to rate the area positively (as most of the time they spend in the area is in their residence). Precinct 2's overall score, by contrast, covers residential, industrial, commercial, and recreational areas.

The average level of agreement for Satisfaction correlated highly with the overall average for each area, indicating the metrics were broadly appropriate to be considered "contributors" to the sense of place at each area. Perhaps most notable was the correlation between the three areas scored highly by respondents (Safety, Mobility, Comfort) and Satisfaction, which may imply respondents consider these elements to be more important. The exception to this pattern was Precinct 3, where Choice was instead rated higher while Safety was rated lower. Given the areas within Precinct 3 are primarily commercial, this suggests that Choice is weighted more heavily in commercial districts.


Visitation

A WiFi analytics platform was in place so that visitation statistics could be captured in the area throughout the year. The platform picks up WiFi impressions (signals) from mobile devices searching for WiFi networks. To calculate visitation, a scaling multiplier is applied to the signals received, accounting for people not carrying WiFi-enabled devices or who have WiFi turned off.
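As a rough sketch, the visitation estimate described above amounts to multiplying the count of detected WiFi signals by a scaling factor. The multiplier value below is purely illustrative; the report does not state the value actually used.

```python
# Illustrative only: estimating visitation from WiFi impressions.
# The scaling multiplier is a placeholder; the actual value used in the
# report is not stated.
def estimate_visits(wifi_signals: int, scaling_multiplier: float = 1.3) -> int:
    """Scale raw WiFi impressions to account for people without detectable devices."""
    return round(wifi_signals * scaling_multiplier)

daily_signals = 20_000                 # hypothetical raw impressions in one day
print(estimate_visits(daily_signals))  # -> 26000
```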

The graph below shows annual visitation to the area captured via the WiFi platform.

Visitation by day

Day         Average visitation
Monday      23,301
Tuesday     23,767
Wednesday   24,121
Thursday    24,969
Friday      28,727
Saturday    28,331
Sunday      27,144
Average     26,766

Insights

The data collected from the WiFi analytics platform shows that the area received approximately 8.8 million visits throughout the year, an average of approximately 26,700 visits per day. Significant visitation highlights the impact of the redevelopment, which has created substantial pedestrian footfall in an area that was previously underutilised or unavailable.

The most highly visited days were Friday, Saturday, and Sunday, likely because the majority of events occur on these days. This could indicate a potential opportunity to increase utilisation during the week by holding more regular events on weekdays.


How did visitor opinion change over time?

Periodic visitor surveys are carried out in the area to assess its changing nature and capture the impact of different stages of development. This enables the visitor experience in the area to be compared over time.

Delivering the survey via free public WiFi in the area helped to collect over 10,000 responses throughout the year (approximately 200 responses per week), allowing detailed trends over time to be examined. The charts below show the weekly average for each outcome area, the monthly trend, and how they compare to key performance indicators (KPIs) set by the organisation.
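The comparison described above can be sketched as follows: average the scores received each week and flag weeks that fall below the KPI baseline. The weekly values and the 70% baseline in this example are hypothetical, not the organisation's actual KPIs.

```python
# Illustrative only: comparing weekly average outcome scores to a KPI baseline.
# Scores and the 70% baseline are hypothetical.
from statistics import mean

weekly_scores = {
    "2016-W18": [74, 71, 77, 69],
    "2016-W19": [68, 72, 66, 70],
    "2016-W20": [73, 75, 71, 72],
}
KPI_BASELINE = 70

for week, scores in weekly_scores.items():
    avg = mean(scores)
    status = "above" if avg >= KPI_BASELINE else "below"
    print(f"{week}: average {avg:.1f}% ({status} KPI baseline)")
```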

Insights

Mobility scores were steady across the year, reaching their highest point in May 2016 and their lowest point in January 2017. Mobility scores are expected to remain below the KPI baseline while further construction occurs and until the walkway is completed, leading into June of next year.

Average weekly scores rarely dipped below the KPI baseline, indicating that people found the area to be an improvement on the infrastructure that existed prior to the redevelopment and opening. On the whole, people remain highly positive about the ease of getting around Precinct 1.


How does the area compare to other places in Australia?

A number of organisations across Australia are using the Culture Counts platform to assess the performance of places they manage, and their scores can be used as a basis for comparison. Scores from seven organisations across Australia, including a university campus, a city foreshore precinct, and various city centre precincts, have been used to calculate the Australian Place Benchmark for the eight dimensions assessed.

The chart below shows the difference between the redevelopment area and the benchmark for other places in Australia.
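A minimal sketch of this comparison follows, assuming the Australian Place Benchmark for a dimension is the average of the seven contributing organisations' scores; all numbers are hypothetical.

```python
# Illustrative only: comparing area scores to a benchmark built as the mean
# of other organisations' scores. All values are hypothetical.
from statistics import mean

# Scores for one dimension (e.g. "Comfort") from seven benchmark organisations.
benchmark_org_scores = [72, 70, 75, 68, 74, 71, 73]
area_score = 71

benchmark = mean(benchmark_org_scores)
difference = area_score - benchmark
print(f"Benchmark: {benchmark:.1f}%, area: {area_score}%, difference: {difference:+.1f}")
```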

Insights

Although the redevelopment area scored slightly below the benchmark for all dimensions except Choice, these scores are expected to improve as the area continues to grow and be refined. Monitoring visitor perceptions via survey over time will enable the organisation to identify which investments and interventions have the greatest impact on visitor experience in the area.

Respondents were also asked how likely they would be to recommend the area to others, which is used to calculate a Net Promoter Score (NPS). An NPS above zero is generally considered good, and an NPS of +50 is excellent. The score of +66 indicates that audiences have developed a strong level of loyalty toward the organisation.
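For reference, an NPS is conventionally calculated from 0-10 likelihood-to-recommend responses as the percentage of promoters (scores of 9-10) minus the percentage of detractors (scores of 0-6). The sketch below uses hypothetical responses, not the survey data behind the score reported above.

```python
# Net Promoter Score from 0-10 likelihood-to-recommend responses:
# promoters score 9-10, detractors score 0-6; NPS = %promoters - %detractors.
def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round((promoters - detractors) / len(ratings) * 100)

# Hypothetical responses, for illustration only.
ratings = [10, 9, 9, 10, 8, 9, 10, 7, 6, 9]
print(net_promoter_score(ratings))  # -> 60
```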


Data and Insights by Culture Counts


T: (08) 9325 7476