Guidelines for use of AI in journalism

AI in journalism
© AdobeStock

Artificial intelligence is changing the world – and journalism too. AI already supports media professionals throughout the entire content production process, from initial research to the distribution of news. Communication scientists from Magdeburg-Stendal University of Applied Sciences, the Catholic University of Eichstätt-Ingolstadt and Macromedia University of Applied Sciences Munich now show in a foundational paper for the Friedrich Ebert Foundation how AI can facilitate journalistic work in many areas and improve it – and where the dangers of the new technologies lie. The authors recommend that editorial offices formulate guidelines and orientation aids, and they call on politicians to monitor more closely the platforms on which news is disseminated.

A radio station whose presenters are given a voice by artificial intelligence. Videos on online platforms depicting scenes that never took place. News texts written not by a journalist but by ChatGPT. All of this is already a reality. Although AI applications are still taking their first steps, they are already capable of a great deal. The "Wolf Schneider AI", published at the end of last year by the online journalism school Reporterfabrik, can edit and rewrite texts according to the style rules developed by the legendary language critic. Do we still need humans in editorial offices?

The three journalism professors Jonas Schützeneder (Magdeburg-Stendal University of Applied Sciences), Klaus Meier (Catholic University of Eichstätt-Ingolstadt) and Michael Graßl (Macromedia University) have now conducted a study investigating the added value that AI already offers in news journalism, the consequences of the new technology, and the perspectives and recommendations that journalism and politics should consider. According to the authors, several conditions must be met for AI to provide added value in journalistic newsrooms. First of all, newsrooms need an "editorial culture that is optimistic towards technology". "Journalism and the media cannot ignore the increasing potential of AI tools. Simply waiting or blocking is not the solution", says Prof. Dr. Klaus Meier, who teaches journalism in Eichstätt. Rather, it is important to look at the rapid development together and "with open curiosity", to actively try it out under good moderation and, if necessary, with external support, and to keep discussing its potential and weaknesses.

Michael Graßl, Jonas Schützeneder and Klaus Meier (from left)

The experts are in favor of establishing editorial guidelines. "Ignorance and uncertainty among both the public and editorial teams regarding the potential risks of AI applications in journalism are still hampering development", says Prof. Dr. Jonas Schützeneder. Editorial teams should therefore provide information, explain which tools are used for which purpose, and also point out the limitations of those tools. "Responsibility and due diligence must always remain with the editorial team." Across the industry, an addition to the press code could be helpful, for example by developing standards for labeling media contributions created with the help of AI. The journalism professors, on the other hand, are skeptical about demands for certification of AI tools: a quality check would probably not keep pace with technical developments, and journalistic quality cannot be certified.

The experts see dangers in connection with AI applications above all on social media platforms that generate content – even more so than in editorial offices with their journalistic routines. So far, the platforms have done too little to prevent manipulative and democracy-threatening content such as disinformation and hate speech. German, European and international media policy is finding it difficult to regulate the platforms effectively, but further efforts are needed here. "At the same time, public education through journalism must be strengthened – as a counterweight based on research, fact-checking and clarification", says Klaus Meier. There are a number of proposals for how journalism could be strengthened – possibly using taxpayers' money. In particular, Meier and his colleagues propose increased support for innovation, for example by supporting public media labs or through competitively awarded public funds for innovations in journalism. "It has been empirically proven that innovations in journalism strengthen its contribution to democracy."

Politics, media, science and education are called upon to work together to strengthen media literacy. Although there are a large number of initiatives in this area, ignorance and uncertainty are still widespread. "We need targeted media literacy support that includes the new AI technologies", says Prof. Dr. Michael Graßl. This has to start in schools, but everyday engagement in companies, associations and families is just as important. "Everyone has a role to play here, and politicians can set a good example and actively promote formats and programs that provide education and knowledge in this field."

The three journalism professors, who train the next generation of journalists at their universities, are also aware of their own responsibility: "AI is increasingly becoming a core topic when we prepare our students for their professional future", says Klaus Meier. The opportunities and risks of AI must also be anchored more firmly in the curricula of journalism and media courses in order to prepare the media professionals of tomorrow as well as possible.

The article "Overcoming boundaries and shaping opportunities – AI in the journalistic newsroom" was published in German in "FES impuls" by the Friedrich-Ebert-Stiftung and is available at https://library.fes.de/pdf-files/a-p-b/20987.pdf