
Where is the cultural sector's AI ambition?

In a previous article I dwelt on the outcry surrounding the British Museum's use of artificial intelligence. What became apparent there was less a technological incident than a cultural tension. The discussion was not primarily about algorithms, but about trust, authorship and the question of who is in charge when machines interfere in the realm of meaning and interpretation. It was a debate in which technology acted as a catalyst for a deeper discomfort.

This discomfort is echoed, albeit more subtly, in recent policy and standards documents from the Dutch cultural sector. AI is now an established part of the discourse, but rarely appears in it as an explicit promise. It is mostly treated as something to be carefully framed. Or, less euphemistically: controlled.

Control

When the AI policy framework of DEN, the Museum standard 2025 and the introduction by Birgit Donker at De Balie on 5 February are read together, a remarkably consistent picture emerges. The papers differ in tone, scope and intention, but share an underlying logic: AI is approached from the perspective of control, regulation and risk mitigation. The language is that of control, procedures and responsibilities.

The AI policy framework of DEN is the most explicit in this. AI appears here as an administrative practice that calls for policy. Reliability, copyright and portrait rights, privacy, transparency, bias and impact on work are systematically run through. Without exception, the questions the paper asks are rational and necessary. How is AI output controlled? How do organisations deal with errors or hallucinations? How do AI applications relate to copyright? What measures are needed to protect personal data? When is labelling or explanation required?

The underlying attitude is managerial. AI introduces risks and uncertainties and therefore requires explicit agreements. Structure and accountability are at the core. AI appears less as a creative engine than as a managerial task.

Stability

The Museum standard 2025 chooses a different route, but arrives at a similar outcome. AI is hardly mentioned explicitly, but is implicitly everywhere. Risk management, digitalisation and security of digital information naturally include AI applications. The standard thus does not treat AI as an exceptional technology, but as part of existing professional standards. Everything that applies to IT, data and continuity also applies to AI.

Again, stability dominates. Procedures, assurance, PDCA cycles and continuity policies form the framework. AI is incorporated into an institutional logic focused on quality, safety and professionalism.

The lecture by Birgit Donker introduces a different register. Here, AI shifts from organisational level to a broader social and economic context. The focus is on copyright, value creation and the position of creators. AI is described as a system that disrupts existing distribution mechanisms, but remarkably, the text opens emphatically with an acknowledgement that often gets underplayed in the debate: AI makes many things possible and offers opportunities.

Instrument

That passage is not purely rhetorical. The document explicitly points to artists and creators who are already using AI as a tool and material, such as the performance collective URLAND with their live performance art in digital times. Donker argues that the creative sector should not be seen exclusively as a victim of technological change, but rather as a source of innovation. AI innovation programmes should be open to artists, she argues, and other parties should structurally involve creative creators. Here, AI appears not only as a risk or threat, but as an opportunity.

Donker's speech contains another element worthy of attention: her proposal for a 5 per cent AI levy on subscriptions, intended to fund a transition fund for creators. The principle behind this is not foreign to me. As chairman of the media committee of the Culture Council, I proposed a similar system for the streaming sector years ago: a limited levy on subscriptions, so that part of the revenue from distribution flows back to domestic production. It took time, but the principle was eventually implemented. That money stays within the sector and strengthens the ecosystem of creators and producers.

White collar

With AI, things are more fundamentally different. The providers operate globally, the value chains are diffuse and the effects extend far beyond arts and culture. The transition caused by generative AI potentially affects the broad field of white-collar professions. A transition fund may provide relief for creative creators, but the structural shift will become relevant for a large proportion of knowledge workers. This shifts the issue from cultural policy to labour market policy - and the AI debate inevitably becomes a debate about the design of work in the 21st century.

After reading these three documents, a curious emptiness remains palpable. What dominates almost everywhere is the language of control. Legal, ethical and organisational frameworks define the discourse. AI is regulated, standardised, analysed.

Bias

This reflex is understandable. Cultural institutions operate in an environment where public funds, social legitimacy and trust are central. Caution is not a weakness, but an institutional necessity. Yet this creates a subtle bias. AI appears primarily as a source of risk, not as a tool for new forms of cultural value. What is less visible is its translation into practice and ambition. This is remarkable, because AI obviously offers more than risk, as Donker rightly points out.

It opens up new spaces for interpretation and presentation. Large language models can make connections between objects, contexts and narratives on a scale that transcends traditional research methods. Not as a replacement for curators, but as an extension of their analytical tools. AI interfaces enable forms of audience interaction in which visitors are no longer merely passive recipients, but active interlocutors. Technology can deepen accessibility, support multilingualism and strengthen inclusivity.

Paradoxical risk

New possibilities are also emerging within artistic practice. Artists are already experimenting extensively with generative systems, synthetic images and algorithmic narratives. There, AI acts not as a threat, but as new material. The Dutch artist Jeroen van der Most is a leading AI artist internationally.

Approaching AI primarily defensively creates a paradoxical risk. Not just legal or reputational damage, but loss of relevance. Audience expectations are changing. Younger generations naturally move in interactive, responsive digital environments. Institutions that see AI solely as a compliance issue risk following technology rather than helping to shape it.

Acceptable risks

Innovation and mastery tolerate each other poorly when strictly separated. AI thrives on experimentation, iteration and unexpected applications. Policy frameworks assume predictability and control. Perhaps this is why AI requires an additional layer alongside norm and policy: an explicit innovation vision.

Not only which risks are to be avoided, but also which risks are acceptable in the pursuit of innovation. The cultural sector traditionally has a unique capacity to critically and creatively explore new media. Art and technology have always had a reciprocal relationship. Why should AI be an exception in this?

AI policy is necessary. But policy alone mostly prevents harm. Ambition creates direction. Perhaps therein lies the next step for the sector: no longer seeing AI exclusively as something to be controlled, but as a new space in which cultural institutions can redefine their own role.

Because technology is always changing. The question is who determines its meaning.

PS: Also worth following: a project working on the design of a global online archive that traces the impact of AI on art, culture and knowledge production: https://www.allianceimpact.org/
