The Arts Don't Have a Data Problem. They Have a Translation Problem.

Arts organisations are not short of audience data. What they are short of is a fast, practical way to turn it into better decisions. AI has changed that.

Last weekend I took my children to a Hallé children's concert at Saffron Hall. The place was packed. Kids were laughing, stamping their feet, completely absorbed. The performance was world-class. And walking out afterwards, surrounded by families who'd clearly had one of those rare, effortless cultural experiences, I found myself thinking: how many of them will come back?

Not because the concert wasn't good enough – it was brilliant. But because I know, from years of working in arts marketing, that the answer depends almost entirely on what happens next. Whether anyone follows up. Whether the organisation knows who was in the room, what brought them there, whether they've been before, and what might bring them back. In most venues, that information exists somewhere – in the ticketing system, in the CRM, in a mailing list. But it rarely makes it into the conversations that shape what happens next.

That's the translation problem. Not a lack of data. A failure to turn it into decisions.

I've been thinking about this a lot recently, because after a decade building data infrastructure and decision systems in the commercial world, I'm returning to the arts – as an adviser, a trustee, and someone who cares deeply about the sector's future. And the gap between what's now possible with data and what most arts organisations are actually doing with it is, frankly, striking.

Why this isn't a new conversation – and why it's different now

The arts sector has been talking about data for years. Nesta's Digital Culture research tracked digital adoption across UK arts organisations from 2013 to 2019 and found that data-led activity had barely moved. McKinsey published a report on data and analytics for arts institutions recommending standardised dashboards. The Wallace Foundation funded a multi-year initiative in the US and concluded that data is not a magic bullet – organisations that used it well combined it with deliberation, not just dashboards. Arts Council England launched Illuminate in 2023 to help funded organisations gather and analyse audience data. Audience Finder did similar work before it.

So this is not a sector lacking tools, research or awareness. But much of what's been written about data in the arts is high-level strategy, careful academic research, or theory. Less of it starts from the practical question arts leaders are actually asking: what can we build, how quickly, and will it work?

The sharper question is: if the infrastructure and the reports and the platforms already exist, why does the translation problem persist?

Because the sector has been given systems of record when what it increasingly needs are systems of decision.

Other industries hit the same wall and broke through it. In 2002, the Oakland Athletics were spending a fraction of what the New York Yankees spent on players. Billy Beane's insight – the one that became Moneyball – was not that baseball needed more data. Scouts had been collecting data for a century. His insight was that the sport was measuring the wrong things. Batting average was a vanity metric. On-base percentage was the one that predicted wins. The data had always been there; nobody had asked the right question of it.

The arts are in a similar position. Organisations have years of ticketing data, membership records, visitor surveys, donor histories and campaign metrics. But they're still largely measuring the equivalent of batting average – total attendance, percentage capacity, income per performance, membership numbers at renewal – when the questions that matter most are about retention, depth of engagement, and long-term audience value.

The off-the-shelf trap

Most arts organisations already have systems in place – the problem is not a lack of technology. CMSA Consultancy's #WhoTicketsWho? survey (2025) of more than 600 UK performing arts venues identified 32 separate ticketing platforms in active use. Spektrix is now the dominant player, used by around 59% of venues surveyed, with Tessitura, TicketSolve, TicketSource and AudienceView each serving smaller shares. Museums and heritage organisations tend toward platforms like TOR Systems or Blackbaud Altru. Membership bodies and arts charities use a different set again – Beacon, Donorfy, CiviCRM, Access Charity CRM – often alongside Mailchimp, spreadsheets and whatever else has been bolted on over time.

The result is a familiar pattern: data gets exported into spreadsheets, someone builds a pivot table, a report appears weeks after the event, and by the time it reaches the people making programming or fundraising decisions, it's already historical. The insight arrives too late to influence anything.

For most organisations, the default response is not to reach for a solution at all. The day-to-day demands of running a venue, a membership body or a festival leave little headroom for rethinking how data is used, and if what you've always done still feels adequate, the case for change rarely gets made. Among those who do recognise the problem, the instinct is usually to buy more software – a dashboard tool, a reporting add-on, another SaaS subscription. But layering generic tools on top of systems that were never designed to answer your most urgent strategic questions doesn't solve the problem. It just makes it more expensive.

The same reflex shows up on the people side. Faced with rising demands for "insight", the default is to look for a data analyst, a consultant or an agency. But a growing share of what we used to ask those people to do – pulling lists, slicing segments, spotting basic patterns – can now be handled by the tools themselves, if we wire them in properly. If we're serious about this, the payoff shouldn't be fewer staff; it should be staff whose time is freed from drudgery to do the relational, creative work that no algorithm can replace. The scarce resource should be human judgement, not human spreadsheet time.

What has actually changed

For years, bespoke was code for expensive, fragile and over-engineered. If you wanted a custom tool, you were looking at consultants, procurement, long timelines and the sort of budget conversation that made most arts organisations stop before they started.

That is no longer true.

When Tesco launched its Clubcard in 1995, the data it generated was so revealing that the chairman reportedly said the people analysing it knew more about his customers after three months than he'd learned in thirty years. But scaling that analytical capability – Dunnhumby – took years and tens of millions of pounds of investment. Today, the same kind of analysis can be built on top of a ticketing system or membership database in weeks, for a fraction of that cost. AI has changed the economics of bespoke problem-solving. It's now possible to clean, structure, connect and interrogate fragmented data far faster than before, and to build narrow, useful internal tools in days rather than months.

By "bespoke", I don't mean commissioning a six-figure platform. I mean three things: a thin data layer that connects the systems you already use; a small set of models that answer your most important questions – who attends, who returns, who might give; and interfaces that slot into existing workflows for marketing, development and programming, rather than another dashboard nobody has time to open.

"Bespoke" doesn't have to mean fragile, either. A thin data layer can be built on well-understood, commodity components, documented properly and maintained by more than one person. The risk isn't the technology; it's treating this as a side-project rather than a core part of how the organisation works.

But I want to be honest about what this requires, because the technology is the easy part.

The harder work is what happens before you build anything. Most arts organisations have data that is incomplete, inconsistent, or siloed across systems that don't talk to each other. When I built a business intelligence platform for a commercial operation, we had multiple sales channels, three different ID formats for the same customers, and no consistent way of categorising products across systems. We didn't start with AI. We started by agreeing a single customer ID, cleaning the worst gaps, and standardising one quarter's data properly. Only then did the models start telling us anything we could trust, rather than just dressing up bad data as insight. Arts organisations face exactly the same challenge – years of manual entry, staff turnover, and platform migrations leave gaps that no tool can magically fix. The first step is always cleaning, structuring and connecting what already exists.
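To make the single-customer-ID step concrete, here is a minimal sketch of record unification across two exports. Every system name, field name and ID below is invented for illustration; a real implementation would have to handle far messier matching – typos, shared household addresses, customers with no email at all:

```python
# Hypothetical sketch: unify customer records from two exports
# (say, a ticketing system and a membership database) by matching
# on a normalised email address. All field names are illustrative.

def normalise_email(raw: str) -> str:
    """Lowercase and strip whitespace so 'Jo@Ex.com ' matches 'jo@ex.com'."""
    return raw.strip().lower()

def unify(records):
    """Merge records sharing a normalised email into one profile,
    keeping a trail of which source system each ID came from."""
    profiles = {}
    for rec in records:
        key = normalise_email(rec["email"])
        profile = profiles.setdefault(key, {"email": key, "source_ids": []})
        profile["source_ids"].append((rec["system"], rec["id"]))
    return profiles

ticketing = [
    {"system": "ticketing", "id": "T-1001", "email": "Jo@Example.com "},
    {"system": "ticketing", "id": "T-1002", "email": "sam@example.com"},
]
membership = [
    {"system": "membership", "id": "M-77", "email": "jo@example.com"},
]

merged = unify(ticketing + membership)
# Three source records collapse into two unified customer profiles.
```

The point is not this particular matching rule – email is a crude key – but that agreeing *some* canonical identifier, however imperfect, is the prerequisite for every question about retention and engagement that follows.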

The second challenge is people. A bespoke tool is only useful if the team using it understands what it's telling them and trusts it enough to act on it. That means building capability alongside infrastructure – not handing over a dashboard and walking away, but working with marketing teams, development directors and programme planners to embed data into how they already make decisions. Change management matters as much as the technology. If a tool isn't embedded into weekly routines – into marketing meetings, programming conversations, fundraising pipelines – it will go the way of every unloved dashboard: admired once in a board paper, then forgotten.

For a national company, this might mean a dedicated internal product team. For a small venue, it might mean a modest shared tool or partnership that makes better use of the data they already hold. The principle is the same; the scale is not.

What this looks like in practice

Once the foundations are right, things become possible that most arts organisations currently cannot do.

A lightweight internal app that flags which first-time bookers are most at risk of lapsing unless they're contacted in the next three weeks. A live view showing, mid-run, which productions are bringing in genuinely different audiences rather than recirculating the same attenders. A development tool combining attendance frequency, booking value and event mix to surface likely donor prospects. A season-level overlap map showing whether programming is deepening loyalty across art forms or fragmenting into disconnected pockets of attendance. A membership body spotting which members are drifting toward lapse based on declining event attendance, reduced engagement with CPD or a shift in communication behaviour – months before their renewal date. A museum identifying which exhibition visitors convert into members and which temporary exhibition campaigns bring footfall that never returns.
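To give a flavour of how small such a tool can be, here is a hedged sketch of the first kind of flag described above – customers with a single booking, made long enough ago that they are drifting toward lapse. The customer IDs, dates and 90-day threshold are all made up; a real version would read from the ticketing export and weigh more signals than one booking date:

```python
from datetime import date

def at_risk_first_timers(bookings, today, window_days=90):
    """Flag customers with exactly one booking, made more than
    `window_days` ago, as candidates for a follow-up contact.
    `bookings` is a list of (customer_id, booking_date) pairs."""
    by_customer = {}
    for customer_id, booked_on in bookings:
        by_customer.setdefault(customer_id, []).append(booked_on)
    flagged = []
    for customer_id, dates in by_customer.items():
        if len(dates) == 1 and (today - dates[0]).days > window_days:
            flagged.append(customer_id)
    return sorted(flagged)

bookings = [
    ("C1", date(2024, 1, 10)),   # one booking, months ago -> flag
    ("C2", date(2024, 5, 1)),    # recent first booking -> too early to flag
    ("C3", date(2024, 1, 5)),
    ("C3", date(2024, 4, 20)),   # came back -> not at risk
]
flagged = at_risk_first_timers(bookings, today=date(2024, 6, 1))  # ['C1']
```

Twenty lines of logic is not the hard part; getting clean booking data into `bookings`, and getting the list in front of the marketing team every week, is.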

None of this requires replacing the ticketing system or the membership database. None of it requires a six-figure software build. And none of it is remotely fanciful now. For the first time, arts organisations can afford software that fits their questions, rather than forcing their questions to fit the software.

The retention question alone is worth the effort. Spektrix and Indigo’s Tomorrow’s Audience report found that first-time bookers represented 54% of all bookers in 2023, while only 19.5% of people who first bought in 2022 returned in 2023. Those figures need reading carefully – the same research found that on average, 62% of first-time bookers had in fact attended the organisation before. "First-time booker" is not the same as "brand new audience member." But the underlying signal is clear: the sector is generating large cohorts of first-time bookers and remains much weaker at converting those moments into repeat relationships. Membership organisations face the same challenge in different clothing – renewal rates, lapsing members, the gap between signing up and genuinely engaging. The problem across the sector is not acquisition. It's retention.
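The cohort arithmetic behind figures like these is simple enough to sketch. The function below – with invented sample data – computes the share of a given year's first-time bookers who booked again the following year, which is the same kind of measure as the 19.5% return rate cited above:

```python
def return_rate(bookings, cohort_year):
    """Share of customers whose first booking was in `cohort_year`
    who booked again in the following year.
    `bookings` is a list of (customer_id, year) pairs."""
    first_year = {}
    years = {}
    for customer_id, year in bookings:
        first_year[customer_id] = min(first_year.get(customer_id, year), year)
        years.setdefault(customer_id, set()).add(year)
    cohort = [c for c, y in first_year.items() if y == cohort_year]
    if not cohort:
        return 0.0
    returned = sum(1 for c in cohort if cohort_year + 1 in years[c])
    return returned / len(cohort)

bookings = [
    ("A", 2022), ("A", 2023),   # 2022 first-timer who returned
    ("B", 2022),                # 2022 first-timer who lapsed
    ("C", 2023),                # not in the 2022 cohort
]
rate = return_rate(bookings, 2022)  # 0.5
```

Note that this only works once the identity problem is solved: if the same person appears under two customer IDs, they look like a lapsed booker and a new booker rather than one returning one – which is exactly the "first-time booker" caveat in the Spektrix and Indigo research.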

Programming and planning with evidence

One of the most underexplored applications is in artistic planning itself – programming seasons, curating exhibitions, scheduling learning programmes, shaping a membership offer. These decisions are currently made largely on instinct, peer networks and whatever historic data can be accessed – usually in a format that makes comparison across seasons, exhibitions or audience segments difficult at best.

In most artistic careers, the skill that gets rewarded is instinct: a feel for repertoire, for talent, for timing, for what a community needs. Data is often framed as the opposite of that – cold, managerial, reductive. In reality, the best use of data in the arts looks a lot like the best rehearsal process: you try something, you listen hard, you adjust. The difference is that we're finally in a position to listen to more than ticket sales, visitor numbers and gut feel.

When Netflix greenlit House of Cards in 2013, it wasn't a shot in the dark. The company knew from its viewing data that a large audience existed for political drama, for David Fincher's direction, and for Kevin Spacey's performances – and that the overlap between those three audiences was substantial. The decision to commission the show was a creative one, but it was informed by data that made a bold commission look less like a gamble and more like a calculated risk. That's the kind of insight arts organisations could be using – not to play it safe, but to back adventurous choices with evidence.

This is not about replacing artistic judgement. No data model should tell an artistic director what to programme or a curator what to exhibit. But data can inform that judgement in ways that simply weren't available before. Which contemporary works are building audiences over time, and which generate a spike of interest that doesn't convert into ongoing engagement? How does the balance of repertoire across a season affect overall subscription retention? Which exhibitions draw new visitors who then become members, and which attract crowds that never return? When a membership body launches a new CPD strand, does it deepen engagement among existing members or reach a genuinely different cohort?

Data doesn't have to push programming toward the safe middle ground. Used well, it can protect adventurous choices by making their impact visible.

Speed doesn't mean chasing whatever sells fastest or renews easiest. It means getting feedback on the audiences you care about in time to adjust how you support them, talk to them and plan for them.

The fundraising connection

This matters far beyond the box office. Public funding pressure has not disappeared. House of Lords Library analysis found that grant-in-aid for arts and cultural organisations fell by around 18% in real terms between 2010 and 2023. Artquest's “Restore the Arts” report found that more than half of Arts Council England's National Portfolio Organisations were in a precarious financial position, with cumulative deficits reaching £118 million in 2023–24, up from £29 million in 2015–16.

In that environment, organisations need a stronger grip on who they reach, who comes back, who deepens their engagement and where growth is genuinely coming from. They also need to explain that convincingly to boards, funders and partners.

The old question from funders was "How many people came?" The harder question now is "How did this investment change who engages, how often, and how deeply?" Without joined-up data, you can't answer that second question convincingly. With the right infrastructure, you can show genuine audience development – not just attendance figures, but who was reached, whether they came back, and what the data tells you about where to find more of them.

The funding landscape has shifted decisively from outputs to outcomes. Arts Council England now requires NPOs to conduct structured evaluations using its Impact and Insight Toolkit. Esmée Fairbairn asks applicants to define trackable outcomes with specific indicators. Corporate sponsors expect dashboards showing audience reach, demographics and engagement. Yet Nesta's own research found that the standard of evidence across the sector remains low – which means organisations that can demonstrate genuine movement, not just describe their mission, have a material advantage when competing for support.

The line that needs watching

There is a conversation here that goes beyond optimisation, and it would be dishonest to pretend otherwise.

Done well, using data in the arts is about understanding audiences better and serving them more thoughtfully. Done badly, it tips into clumsy profiling, over-targeting and behaviour that feels more like retail surveillance than cultural stewardship. The question isn't just "can we do this?" but "are we comfortable doing it, and on what terms?" That's an artistic and ethical conversation as much as a technical one.

It's also a legal one. The ICO is clear that individuals have an absolute right to object to the use of their personal data for direct marketing, and that this right extends to profiling for direct marketing purposes. Organisations must tell people about that right. Fundraising marketing has its own specific compliance requirements under PECR and data protection law. Any organisation that wants to become more sophisticated in audience modelling also needs to become more disciplined about lawful basis, transparency, restraint and governance.

There's also a question about who gets seen. Data-driven approaches inherently favour audiences who leave digital footprints – people who book online, open emails, engage on social media. Audiences who don't do those things risk becoming invisible to systems built around digital behaviour. Any organisation serious about using data to build audiences needs to think just as carefully about the people the data cannot see as about those it can.

Where to start

For most organisations, the path forward is simpler than it sounds.

First, audit your data. What do you have, where does it live, how clean and connected is it? Be honest about the gaps. Most organisations overestimate how usable their data is until someone actually looks.

Second, pick three questions that really matter – not thirty, not ten. "Who are we retaining?" "Who are we losing?" "Who might give?" If you can't answer those three reliably today, everything else is secondary. In practice, most institutions find that a small handful of well-chosen metrics, tracked consistently, does far more for decision-making than sprawling "data strategies" that try to measure everything.

Third, prototype one small tool that answers one of those questions, and train the relevant team to use it every week. Not every month. Every week. The value of data insight compounds with frequency – a quarterly report changes little; a weekly habit changes everything.

What needs to change

The arts don't need to become more corporate. They don't need to treat audiences as "customers" or reduce programming and curation to whatever the data says will sell. But they do need to stop accepting a world in which years of valuable audience and member behaviour sit trapped in systems that describe the past while offering almost no help with the next decision.

Adopting this approach requires more than a technology purchase. It requires honest assessment of data quality, investment in team capability, and a willingness to change how decisions get made. None of that is trivial, and anyone who tells you it is hasn't done it.

The organisations that thrive in the next decade won't be the ones with the biggest software contracts. They'll be the ones that recognise that AI has changed the speed and cost of bespoke problem-solving, and that build small, targeted tools on top of the systems they already have – tools that answer real questions, fit real workflows, and translate raw data into better decisions about what to programme, what to exhibit, how to grow audiences and memberships, and how to make the case for support.

The arts don't have a data problem. They have a translation problem.

And for the first time, they have very few excuses left for not solving it.


Joel Garthwaite is a marketing and growth adviser working at the intersection of culture, audience development and commercial strategy. He began his career as a professional contemporary classical musician before moving into senior marketing and leadership roles across the arts and commercial sectors. He writes and speaks on audience growth, arts leadership, data strategy and the practical use of AI in decision-making.

Interested in speaking, advisory work or commissioned writing? I work with arts and cultural organisations on audience growth, strategic marketing, data infrastructure and AI-enabled decision tools. For speaking opportunities, consultancy enquiries or editorial commissions, please get in touch.

