Building Evaluation Capacity with Santa Cruz Public Libraries

How can libraries strengthen program evaluations and improve their impacts?

by Rebecca Joy Norlander, Joanna Laursen Brucker, Nicole LaMarca, and Elliott Bowen
Jul 11, 2023

Through interactive learning workshops, we equip organizations with the tools and techniques for evaluating the impact of their work. Our workshops are tailored to specific organizations' goals and needs, and consist of a series of capacity-building exercises in which we work with partners to co-create the evaluative strategies they need to assess and deepen their impacts.

Curious to know what these workshops consist of, and how they might benefit your organization? To show exactly what our evaluation capacity-building work looks like, we'd like to share an example from Spring 2023, when we helped staff at the Santa Cruz Public Libraries (SCPL) devise strategies and techniques for measuring the impact of their programs.

Building a foundation in evaluation basics (January 2023)

The first workshop, which was conducted virtually in January, was essentially a crash course in evaluation and evaluation methods. We began by addressing some key questions, including:

  • Why do we do evaluation?
  • What are some commonly used evaluation terms, and what do they mean?
  • What are some different evaluation types and approaches?
  • How do we determine which evaluation methods to use?

After addressing these questions, we introduced some key tools for measuring programming impacts – including gap analyses and logic models. SCPL staff were then asked to fill out a logic model worksheet for one of their programs or services, in preparation for a second workshop.

Mapping and modeling assets, gaps and goals (March 2023)

The goal of our second workshop was to help SCPL staff think critically about two questions. First, where does evidence of program impacts already exist? Second, in cases where these impacts were unknown, how could evidence of them be gathered?

Building on the skills and knowledge acquired in the first workshop, we reviewed specific data collection methods and best practices for evaluation. We also led a gap analysis exercise, helping staff differentiate between what they already knew about their audiences and programming and what else they wanted to know. We thematically organized this information to identify staff priorities, and then led a collective brainstorming session in which the entire group decided on initial strategies for applying what staff had learned about evaluative methods to existing programs and services. During this session, staff identified who they wanted to collect data from and determined how they wanted to collect it.

Creating a plan to measure program impact (April 2023)

In April, we facilitated a four-hour in-person workshop in Santa Cruz, California, to help staff apply what they had learned in the first two sessions to specific library programs. After dividing staff into smaller groups based on their job responsibilities, we conducted a series of breakout sessions, helping each group refine their logic models and align them with their programs, services, and the library's broader strategic plan.

At the conclusion of the workshop, each group had developed explicit next steps for evaluating their programs and services, as well as ideas for instruments and timelines to move forward with measuring their impact. Following the workshop, staff incorporated the framework into their regular and ongoing program proposals.

Interested in Setting Up a Workshop?

Please fill out this short form and we will connect with you.

Photo by Adam Winger on Unsplash
