Misinformation and disinformation pose the biggest global short-term threat, with independent journalism the antidote, writes Claire Stuchbery, executive director of the Local and Independent News Association (LINA)
This January the World Economic Forum (WEF) named misinformation and disinformation as the most severe short-term risk the world faces.
Combating this issue has been high on the agenda for some time for newsrooms, politicians and, hopefully, anyone who consumes information online. So why has it now been bumped above all other high-impact risks, including extreme weather events, cyber insecurity and interstate armed conflict?
The WEF has pointed the finger straight at one of the newest players in the digital field: generative AI.
The WEF Global Risks Report 2024 explains how AI is “amplifying manipulated and distorted information that could destabilise society”. Put simply, AI absorbs information already online, repurposes it and spits it back out in response to a question or prompt from a user.
With no editor to verify and contextualise this information, the process has the potential to snowball, allowing mis- and disinformation to become so widespread that it is taken as fact. And that is before we even get started on the dangers of AI’s increasingly jaw-dropping ability to create hyper-realistic “synthetic content” such as deepfake videos.
As the WEF points out, the consequences of misinformation and disinformation will present “one of the biggest ever challenges to the democratic process”, deepening polarised views and potentially triggering civil unrest and confrontation. This would be a serious issue at the best of times, but with many major elections due to take place in the next two years, it is enough to place it as the biggest current worldwide threat.
Elections aside, the WEF highlights how “perceptions of reality are likely to also become more polarised, infiltrating the public discourse on issues ranging from public health to social justice”, shifting public opinion towards distrust in facts and authority, and exacerbating the risk of domestic propaganda and censorship.
So what can be done to counter this threat? Improving media literacy is one place to start, with education and resources to better equip people to identify misinformation or disinformation when it is presented to them.
But perhaps more important, and certainly nothing novel, is this: now, more than ever, we need to support trusted independent journalism.
Despite the many challenges of running a smaller masthead, local newsrooms are leading the charge in combating mis- and disinformation. While national or major metro outlets compete with one another to push out stories within minutes of an event occurring, local news publishers often run to deadlines which make sound fact-checking and thorough contextualising of information much more achievable.
Strengthened by local knowledge and connections to the communities they serve, local and independent news outlets are well-placed to identify misinformation circulating in their area, investigate and report the truth quickly and efficiently.
To give just one example: while heated discussion was underway last year about a proposal to build offshore wind farms off the coast of Wollongong, opponents in the community, the national media and parliament began circulating claims that wind turbines were dangerous to whales, supported by a supposed University of Tasmania study. The Illawarra Flame investigated and found the research to be entirely fabricated, with no scientific evidence anywhere in the world of wind farms harming whales, and was acknowledged for its commitment to fact-checking by Media Watch.
For their part, newsrooms now have a greater responsibility to readers to verify the information that they choose to share. Nine News recently got themselves in hot water for doctoring an image of Victorian MP Georgie Purcell, enlarging her breasts and digitally altering her clothing to appear more revealing. Nine apologised and blamed AI tools now embedded in Photoshop for the supposedly unprompted mistake.
Regardless of how the incident occurred, newsrooms choosing to engage AI in their work need to take responsibility for ensuring the content still meets the journalistic values of accuracy and fairness.
Whether we like it or not, generative AI is here to stay, and government regulation is unlikely to keep up with technology developing this quickly. So it’s time to embrace its capacity to assist with the sharing and consumption of important news and information, while also engaging in the “healthy scepticism” about what we see online, and where we choose to get it from, that academic T.J. Thomson advised in The Conversation.
It is easy to point the finger at newsrooms for letting false or distorted information be published, but beneath these mistakes is the very human problem of being under-resourced and under-funded.
With adequate resourcing, trusted newsrooms are our best shot at capitalising on their connections within their communities to build the media literacy needed to counter mis- and disinformation.
The Illawarra Flame is a founding member of LINA.