The cultural and journalistic sectors view artificial intelligence with a mixture of fascination and suspicion. That is understandable. Anyone who compares recent reports on AI will see not a clear picture of the future, but a shift in power, risk and responsibility. It is not the technology itself that is central, but the way in which it is embedded in institutions, revenue models and governance structures.
This insight is most explicitly expressed in the Guidelines for (supervisory) boards by SparkOptimus. In it, AI is not presented as a tool, but as a structural intervention: “Productivity gains are real, but the real challenge lies in steering organisations through structural disruption that is only just beginning.” Governments that reduce AI to efficiency gains, the authors warn, are repeating the mistakes of previous waves of digitisation: fragmented pilots, outsourcing to IT, and a lack of strategic reinvestment.
Uncomfortable
This is an uncomfortable message for cultural institutions and media organisations. After all, they have been operating for years under pressure from shrinking budgets, temporary subsidies and project funding. It is precisely here that the temptation to view AI as a cost-saving measure is great. But those who choose this path are imperceptibly shifting the core of their organisation: from cultural mission to operational optimisation.
We see the same pattern in the journalism sector. The extensive IRIS report by the European Audiovisual Observatory describes how news media have become increasingly dependent on digital infrastructures that they do not control themselves. “As digital technologies become gatekeepers for content discovery and visibility, the parameters of trust, reliability, and accountability in the news sector are being redefined,” writes the editorial team. In this context, AI is not a neutral tool, but a new layer of power between journalists and the public.
Bottleneck
The parallels with the insurance sector are instructive. In The New Power Brokers it becomes clear how data and AI have shifted from being supportive to decisive. One director summarises it succinctly: “If your company doesn't understand the meaning of AI, data, technology... you won't get it right.” It is not the model, but the governance that is the bottleneck. The winners are not the parties with the best algorithms, but those who dare to redesign their organisation around data sovereignty and decision-making.
For cultural and creative sectors, this is a cautionary mirror image. While insurers invest in data capacity and governance, the cultural sector often lacks even a discussion about ownership. This shortcoming is painfully evident in the research conducted by the Boekman Foundation into the impact of generative AI on work and income. “Almost one in five self-employed professionals report fewer assignments and lower income due to GenAI; among translators, this figure is approximately one in three,” is the conclusion. At the same time, salaried employees mainly experience efficiency gains.
Rolled out
This asymmetry is not a side effect, but a structural consequence of how AI is used. Organisations internalise the productivity gains, while the risk is passed on to freelancers and creators. This also explains why 65 per cent of self-employed people fear reduced employment opportunities and 90 per cent are concerned about the use of copyrighted material for AI training.
What is lacking in many policy responses is coherence. The SparkOptimus guidelines emphasise that “successful AI application requires a people-focused foundation”, with a focus on data, security, ethics, and governance. Those words sound familiar, but rarely take consistent form in a cultural context. AI policy is often delegated to project leaders, without backing from management and supervisory boards.
Journalistic core
The IRIS report explicitly warns against this vacuum. In a media landscape where journalism competes with platforms and ‘newsfluencers’, an overly broad definition of media leads to an erosion of professional standards. “An overly broad definition inevitably creates significant challenges for policy design and implementation,” according to the authors. Without a clear delineation of responsibilities, the core of journalism disappears under a layer of algorithmic distribution.
The guiding question is therefore not whether cultural institutions and media should embrace AI, but under what conditions. The insurance sector demonstrates that data-driven working only delivers value when managers themselves take responsibility for design and use. The cultural sector demonstrates what happens when that responsibility is lacking: inequality increases, trust erodes, and creators pay the price.
Perhaps that is the central lesson that these four documents collectively reveal. AI is not an innovation that you simply “add on”. It is infrastructure. And infrastructure requires public choices, governance and countervailing power. Those who fail to make these choices will not achieve a future-proof cultural policy, but rather a silent redistribution of risks, to the detriment of those with the least institutional weight.
The question is not whether this is desirable. The question is who dares to address it administratively.