Empowering art organizations with data & analytics – McKinsey

Art institutions improve people’s lives and livelihoods in diverse ways. They enrich individuals by fostering education, health, creativity, and empathy. They also act as community catalysts by encouraging inclusion, contributing to tourism, improving quality of life, and growing economies. But many art organizations struggle to measure this impact. That difficulty makes it harder for them both to articulate their performance to boards and other stakeholders and to evolve their operating models strategically so that they can navigate current and future challenges.
Art institutions were struggling with the resilience of their operating models heading into the COVID-19 pandemic because of increasing overheads, changing audience behaviors, and shifts in funders’ priorities. The disruptions triggered by the pandemic only compounded these trends. For example, in the United States, research estimates that pandemic-related losses for the performing-arts industry exceeded $3.2 billion, while changing consumer behaviors decreased audience ticket demand by up to 25 percent.1 An uncertain economic outlook and high inflation in the wake of the pandemic are exacerbating these pressures.
Building an effective data collection and analysis program is always challenging. It is even more challenging when you are working in a field that historically has had little experience with the power of data.
But technology can help art institutions navigate current disruptions and build resilience against future ones. We have seen data and analytics transform public- and social-sector organizations by enabling them to make data-driven decisions. While many art institutions recognize the potential benefits of these tools, they lack the capabilities, processes, and infrastructure to build them.
To help art institutions get started on the journey to make greater use of technology, McKinsey collaborated with seven leading US art institutions to gain insights into how to strengthen their data and analytics practices (see sidebar, “Partner organizations”). Our work included the creation of an easy-to-use, objective, and scalable dashboard designed to inform institutional strategy, improve business operations, and establish the proper use of data and analytics within each organization.
Historically, the intrinsic difficulty of measuring the impact of art institutions has limited the use of standardized performance measures among them. As a result, such organizations often have an imprecise understanding of their audiences, short-term perspectives on revenue and growth prospects, and planning methods set up in reaction to unreliable patterns of arts funding. These problems have aggravated the struggles that many institutions have faced following the pandemic-induced declines in revenues and audiences.
Our work sought to create a standardized methodology and dashboard that addressed these challenges and could scale across institutions, serving as a launch pad for further investments in improving the resilience of art institutions through data and analytics. Our shared-design methodology recognized that what constituted success would vary by institution. Thus, our focus was on helping each institution better assess its growth trajectory, match performance with aspirations, and, ideally, learn and exchange best practices with peer institutions.
We studied five measures that we considered essential to assess operational and financial performance: strategic priorities, long-term-performance measures, organizational direction of travel, standardized data sets and analyses, and level of top-management support. Achieving high results in these measures could accelerate the use of data and analytics by a broad range of art organizations, from large, public museums to community theaters. Although diversity, equity, and inclusion and environmental impact issues are strategic priorities for our partner institutions, we didn’t include these measures in the inaugural stage of our analysis, because of the need for greater data maturity on these topics.
Many art organizations have developed some internal practices for data management, thanks to their adoption of customer-relationship-management systems. But few standardized metrics exist for assessing overall performance. For example, such management systems should make it possible to determine the number of unique visitors to an institution, yet few organizations have prioritized this data point. Instead, they have focused on the more traditional measure of total attendance. Similarly, few institutions use the same metrics to understand their donor group. Measures of the number of donors, average donation amount, percentage of first-time donors, and percentage of overall donations by the top ten donors are used interchangeably by institutions for this purpose.
[There’s a] lack of bandwidth and capabilities to analyze and derive insights from visitor data that the institution is already capturing.
The lack of standardized measures can overwhelm and even derail teams when deciding where to focus their nascent data management practices. A quick way to begin leveraging data and analytics is to focus on high-level institutional priorities and lean into the areas where data are already available. We found that with just 15 data points, we could create insights related to six strategic priorities that our seven partner institutions shared (Exhibit 1).
Many art institutions determine their performance by assessing changes year over year. But this approach can complicate decision making, given that art organizations regularly experience significant year-over-year volatility in their operations because of events such as a large, one-off gift or a blockbuster exhibition. By inadvertently highlighting outlier events, such assessments undermine efforts to use data to inform strategy or benchmark performance among institutions.
Measuring longer-term trends, such as compound annual growth rate (CAGR), can help smooth out year-over-year distortions. For this reason, we examined longer-term trends, such as rolling changes over three-year periods, to isolate art organizations’ underlying performance and reduce volatility. This enabled us to highlight and contextualize multiyear trends related to the institutions’ strategic priorities, as well as to prioritize the use of existing and readily available data points to facilitate adoption and create familiarity with the process.
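As a minimal sketch of this smoothing approach (the metric and the attendance figures below are hypothetical, not drawn from the partner institutions), a rolling three-year CAGR can be computed as follows:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values, in percent."""
    return ((end_value / start_value) ** (1 / years) - 1) * 100

def rolling_three_year_cagr(values: list[float]) -> list[float]:
    """CAGR over each rolling three-year window (four consecutive annual values)."""
    return [cagr(values[i], values[i + 3], 3) for i in range(len(values) - 3)]

# Hypothetical annual attendance, including a one-off blockbuster year (160,000)
attendance = [100_000, 104_000, 160_000, 108_000, 112_000]

# The blockbuster year distorts year-over-year change but barely moves
# the rolling three-year CAGR, which stays near 2.5-2.6 percent per year.
print(rolling_three_year_cagr(attendance))
```

Note how the outlier year would show a 54 percent year-over-year jump followed by a 33 percent drop, while the rolling measure reflects the underlying trajectory.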
In our field, we sometimes give up too easily on new initiatives because they don’t show a quick ROI. It is critical when investing in and assessing big projects to let them fully develop.
Additionally, it takes art institutions longer than it might for other types of organizations to make sustained changes, including developing new talent, changing the course of programming, and influencing audience perception. A major exhibition can take up to three years to plan, including the negotiation of artwork loans, institutional partnerships, temperature-controlled installation, and the production of associated publications. Orchestra performances require arranging for and managing rehearsals, guest conductors, and players. Analyzing events over a three- to five-year time frame therefore enabled a more accurate and strategic measure of where an institution was headed and a reflection on how years of strategic actions had influenced outcomes.
There are few uniformly agreed-upon standards for assessing the performance of art institutions. For example, increased attendance may not necessarily be a good thing, as some institutions deliberately limit attendance to maintain the quality of the visitor experience. Similarly, a significant increase in profits could be viewed as a missed opportunity to have invested in the institution’s noncommercial activities.
We don’t execute educational programs with hopes that people will buy tickets in ten years. We hope it will immediately provide lasting value to the community.
In the absence of standardized performance assessments in nonprofit environments, we focused our analyses on capturing an art institution’s trajectory on topics of strategic importance. The analyses were designed to help generate insights on key questions, such as the following:
To help answer these questions, we designed our analyses to look at the difference between the CAGRs of two data points. This technique shed light on each institution’s underlying trajectory on these strategic topics while recognizing that the answer didn’t represent positive or negative performance.
For example, operating revenues growing faster than attendance could reflect efficient ticket sales for events that required payment. If they grew more slowly than attendance did, it could be a signal that the institution had prioritized free events for the public to drive community engagement. Similarly, if operating revenues grew more quickly than head count did, it could reflect gains in organizational efficiency. Slower operating-revenue growth relative to head count could indicate that new hires were prioritized to accommodate audiences after the peak of the COVID-19 pandemic.
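A minimal sketch of this comparison technique (the institution’s revenue and attendance figures below are invented for illustration):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values, in percent."""
    return ((end / start) ** (1 / years) - 1) * 100

def trajectory_gap(metric_a: dict, metric_b: dict, years: int) -> float:
    """Difference between the CAGRs of two data points, in percentage points.

    A positive result means metric_a is growing faster than metric_b;
    the sign describes trajectory, not good or bad performance.
    """
    growth_a = cagr(metric_a["start"], metric_a["end"], years)
    growth_b = cagr(metric_b["start"], metric_b["end"], years)
    return growth_a - growth_b

# Hypothetical five-year figures for one institution
revenue = {"start": 8_000_000, "end": 10_000_000}
attendance = {"start": 400_000, "end": 440_000}

gap = trajectory_gap(revenue, attendance, years=5)
if gap > 0:
    print(f"Revenue outpacing attendance by {gap:.1f} points per year")
else:
    print(f"Attendance outpacing revenue by {-gap:.1f} points per year")
```

In this invented example, revenue grows faster than attendance, which, per the discussion above, could reflect efficient ticket sales rather than superior performance per se.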
Benchmarking performance against peers can reveal valuable insights. This practice is more mature among commercial entities than among art organizations. While all our partner institutions supported performance benchmarking, agreeing on the necessary data sets to invest in was complex because of key differences in how each institution measured audience engagement and business operations.
Each institution has a unique mission and a distinctive manner of fulfilling that mission, depending on the needs of its audience. This makes it complicated for any group of institutions to align on standardized data points. Generous ad hoc data-sharing relationships do exist between specific institutions, but the lack of high-quality, up-to-date, and standardized data sets places limits on the perspectives that art-institution leaders can have of their organizations’ performance.
We need to understand common financial benchmarks to better assess financial performance in the face of crises and uncertainty.
A handful of art-focused, nongovernmental organizations and industry associations—such as the Association of Art Museum Directors, League of American Orchestras, and SMU DataArts—organize data-collecting and -sharing programs for the sector. Participation in these programs is voluntary, and the completeness and quality of the data are difficult for the industry groups to certify. As a result, art institutions are often guarded in how much they invest in or leverage such data sets.
To unlock the potential value of data and analytics for our partner institutions, we invested significant effort in the standardization and quality control of data points. To sustain that value over the longer term, art institutions may need to consider sharing data among themselves in a more systematic way.
In our project, we focused on developing a data dashboard for executives, trustees, and external stakeholders that would inform institutional strategy (Exhibit 2). But setting up a dashboard is only one aspect of the process. Equally important is an organizational mindset that’s both creative and analytical—and that mindset starts at the top.
Art institutions are more likely to be successful with data and analytics if they adopt the best practices of for-profit organizations when it comes to measuring performance. This includes securing executive-level alignment on investing in data and analytics, implementing these tools for the long term, and relying on the insights that these tools generate to set overall institutional direction. At a more granular level, a senior leader needs to be responsible for maintaining the dashboard and ensuring that the data are up to date and of high quality.
It’s also important to establish an annual strategic review meeting with executives and trustees in which they can examine institutional direction, based on insights from the dashboard, and set goals for future performance. Questions to address at this meeting might include the following:
Much more extensive use of data analytics tools and directional-trending analysis is allowing leadership to be proactive in creating revenue streams, serving audiences, understanding the impact of our programs, and making informed pricing decisions.
The process of setting up or enhancing data and analytics capabilities is about building a mindset of experimentation, learning, and iteration. Lack of perfection in the data doesn’t negate the value of data and analytics in building an institutional dashboard.
We suggest that boards and executives in art institutions take action over three horizons to derive greater value from data and analytics:
The authors wish to thank Christophe Abi-Nassif; Brooklyn Museum; Chicago Symphony Orchestra Association; Cleveland Orchestra and Musical Arts Association; Kaywin Feldman; Gabrielle Frasco; Gary Ginstling; André Gremillet; Dale Hedding; Kennedy Center; William McClure; Museum of Fine Arts, Boston; Museum of Fine Arts, Houston; National Gallery of Art; National Symphony Orchestra; Michele Nichols; Cierra Reimche; Christian Schörnich; Maggie Scott; Jeremiah Strickler; Gary Tinterow; K. P. Trueblood; Genevieve Twomey; and Eric Woods for their contributions to this article.
Art institutions enable and catalyze the creative sector. In the United States, the combined economic value of art and cultural institutions is estimated to be growing at twice the rate of US economic growth.2 Governments and foundations have numerous opportunities to demonstrate that value by driving the establishment of data standards and stronger data collaboration within the sector.
There are also opportunities for art institutions to become lighthouses for impact and business analyses grounded in rigorous data and analytics practices and peer collaboration. Building data and analytics capabilities to capture these opportunities will strengthen the resilience of art organizations and improve lives and livelihoods in the long term.
While an increase in membership rolls is always a good thing, a decrease is not necessarily a harbinger of difficult times. We are experiencing a strong increase in general admission concurrent with soft membership numbers. As our audience skews younger, we find that these new visitors have different patterns of engagement.
Zina Cole and Richard Steele are partners in McKinsey’s New York office, Ben Mathews is a senior partner in the Cleveland office, and Loïc Tallon is an associate partner in the Amsterdam office.
The authors wish to thank Ali Baqeri, Eduardo Doryan, Jesse Han, Ian Jefferson, Thomas Kilroy, McKinsey Center for Government, Scott Nyquist, Ma Cathyrine Ortiz, Jesse Salazar, Vivien Singer, Tanvi Sinha, Daniel Soto, Ramesh Srinivasan, Cassiope Sydoriak, Valeria Valverde, Ping Wen, and Matt Wilson for their contributions to this article.
This article was edited by Rama Ramaswami, a senior editor in the New York office.
This interactive experience was a collaborative effort led by McKinsey Global Publishing, with design and data visualization from Stephen Landau, Matt Perry, and Diane Rice.

