By: Kemiso Wessie

The increased use of AI in creative arts like painting, singing and writing, as shown on social media platforms, has sparked many conversations, even moral panic: suddenly science fiction films like Terminator, Minority Report, Blade Runner and I, Robot don’t seem so far from reality anymore.

Artificial Intelligence, defined by Encyclopedia Britannica as “the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings,” is also making its way into newsrooms – and journalism teachers should consider the implications.

Many newsrooms and media outlets have already adopted AI in their reporting practices, according to Allen Munoriyarwa, Sarah Chiumbu and Gilbert Motsaathebe. While this observation applies more to the West than to Africa and the Global South, it has sparked wide debate on what AI means for journalists and the relevance of their roles.

Charlie Beckett, founding director of Polis at the London School of Economics and Political Science, explains in the Journalism AI Report: New powers, new responsibilities: A global survey of journalism and artificial intelligence (2019) that AI can be a powerful tool for enhancing and amplifying good journalism. Its adoption should be “additional, supplementary and catalytic” to the work already being done in the newsroom.

Journalism practitioners, educators and academics need to inform themselves about AI and explore its uses and consequences for the newsroom as well as wider society, he adds. AI can help journalists create better work while “the news industry is fighting for economic sustainability and for public trust and relevance.” It can also help audiences wade through news overload and misinformation, directing them to credible content that is relevant, useful and stimulating. When fully and correctly integrated, AI could be of high value for tasks such as audience engagement, story discovery and labour efficiency.

News organisations can track website visits, unique actions per news item, time spent reading each story, video view rates, where readers connect to their website from, and other analytics that help them optimise and perform better. “For journalism, the possibilities for knowing about user preferences that have arisen from enhanced information processing tools and techniques are a great opportunity.”

At the 2022 African Investigative Journalism Conference (AIJC), Beckett said in his speech on artificial intelligence for newsrooms that although AI can perform “very repetitive, predictive data-driven tasks at great speed and at great scale,” it is what journalists can do with those tools and systems that matters.

Many newsrooms are already integrating forms of AI into story creation; Beckett identifies three areas of use:

    • Newsgathering: sourcing of information, story idea generation, identifying trends, investigations, event or issue monitoring, extracting information or content.
    • News production: content creation, editing, packaging for different formats and platforms, text, image and video creation, repurposing content for different audiences.
    • News distribution: personalisation, marketing, finding audiences, understanding user behaviour, monetisation/subscriptions.

Negatives and ethical concerns:

So if we already use AI so widely as media practitioners and in everyday life, why the concern? As explained, AI is already making news work easier, but what are the ethical concerns? Patrícia Ventura Pocino highlights that, from an ethical point of view, AI can actually help ensure stories are more in line with media industry values. For instance, software and applications can detect misinformation as well as gender, race and sexuality biases.

Ventura Pocino does explain that implementing this technology could negatively impact smaller newsrooms that don’t have the funds to pay for and sustain it. This could mean that newsrooms in the Global South take a while to catch up with the technological advancements of bigger and better-funded newsrooms in the West.

“The algorithms have become everyday instruments, at least in companies of a certain standing. And everything suggests that this process will be consolidated and will eventually have an impact on more modest newsrooms,” says Ventura Pocino.

She later adds that AI can, and in certain instances already does, take on a decision-making role at the core of journalism’s editorial function, asking the important questions about a story’s newsworthiness, the form it should take, possible headlines and which facts to include or omit. AI won’t eradicate bias; it could instead perpetuate the biases of the newsmakers and executives who, in the beginning stages, “train” the software to perform such functions.

At a panel discussion during Mobile Learning Week at the UNESCO headquarters in Paris in March 2019, Guy Berger, then Director for Freedom of Expression and Media Development at UNESCO, explained how AI can play a role in disinformation. He highlighted that social media can accelerate the dissemination of ‘deep fakes’ and AI-generated content such as automated trolling and images. “Artificial intelligence evidently plays a crucial role in the rapid spread of disinformation,” Berger said, but he added that AI can also be part of the solution.

How to move forward: 

Beckett summarises a seminar held at the International Journalism Festival in May 2022 about what newsroom leaders think about the future of journalism and AI, highlighting how useful AI is for journalists. However, it remains important for journalists, editors and news executives to inform themselves about these technologies.

Berger explains that neither journalism teachers nor students can accurately predict how AI will affect journalism and how it is taught. He does suggest creating a community of practice where AI can be tested and where issues with tools such as OpenAI’s ChatGPT can be identified. Berger also suggests that humans will need to create the content that AI cannot, and can add new contributions to the online stock of recorded data which feeds machine learning.

Beckett explains the need for AI coverage that is critical and informative, reflecting the news company’s AI and technology literacy. “We need to use better language that demystifies and explains AI to the general public,” he urges.

Journalists must adapt to new developments but, at the same time, preserve the values of the profession. The future impact of AI is uncertain, but it has the potential to profoundly influence how journalism is made and consumed.

The hard truth is that we can’t predict how technology will change our world until it has.