Case study: Embedding evaluation at Bradford Museums & Galleries


Background to the project

Bradford Museums and Galleries is a local authority museum service that runs four sites located across Bradford district’s diverse neighbourhoods. The service was developing a new strategic plan and wanted to use evaluation to help it understand and reach local audiences and build its profile.

How did Bradford Museums & Galleries achieve this?

The museum manager used external funding to secure the help of an evaluation consultant who brought specialist expertise and some much-needed extra capacity. Together they agreed the approach to the work based on the museum’s primary objectives:

  • To understand the profile of people who currently visit each site, compared with the demographics of the local neighbourhood, in order to inform audience development planning
  • To support continuous improvement by understanding what visitors value about the service and what needs work
  • To create a consistent, easy-to-use framework for evaluating future programmes, understanding what works and building an evidence base to help the service demonstrate its impact to stakeholders and funders.

The evaluation consultant worked alongside other consultants focusing on family learning, audience development, learning and events, which gave the museum service access to a wide range of expertise and a holistic approach to reaching new users.

Understanding audiences

The museum began by conducting a visitor survey to understand the views of users and non-users within the community. Due to the COVID-19 lockdown, this had to be done online. A cross-service team developed questionnaires using SurveyMonkey and Snap Surveys and promoted them via direct email to people on the museum's distribution list, through council networks, and through social media. The survey was designed to be quick and easy to complete and included a mixture of multiple-choice and open-ended questions to capture both quantitative and qualitative information. Responses were anonymous, but the survey asked for basic demographic data (age, ethnicity, gender and disability) to build a profile of respondents.

There was a good response to the survey, with a great deal of insight from museum visitors showing how much the service is valued. The data generated some powerful advocacy stories, as well as highlighting specific areas for improvement that the service can address. However, comparing the survey results with the museum's on-site visitor data indicated that respondents came from a relatively narrow demographic that was not fully representative of the museums' on-site audiences or the neighbourhoods in which the museums are located. Now that the museums have reopened, staff are conducting further consultation work on site and working with partners in their local neighbourhoods to understand the perceptions of local people and test new activities and approaches.

Planning for the long term

The evaluation consultant used the data from the visitor survey and other management information (for example, visitor figures; feedback from on-site events) to establish a baseline set of data against which the service will be able to measure change. This is a group of key performance indicators relevant to the museum’s core purpose and main priorities.

The consultant then developed a framework for ongoing evaluation designed to help the service understand the extent to which it is achieving its strategic aims. The framework takes the service's aims and intended outcomes, identifies a set of indicators that will be used to determine whether it has been successful, and suggests a mixture of qualitative and quantitative evaluation techniques for staff to use to collect evidence. After some research, managers decided to use Arts Council England's free Audience Finder data and development tool to conduct periodic visitor surveys across the service's sites, which will also enable the service to compare insight on its audiences with other venues locally and nationally.

The consultant identified a range of recommendations to help the service implement its evaluation programme successfully. A key recommendation was to train and support front-of-house and visitor-facing staff so that they understand why they are being asked to collect evaluation data from audiences and how it will be used. The service is planning to implement this and is working to establish a cross-site evaluation group, drawing staff from across the service, to maintain and develop its focus on audience research.
