Credits: Tara Winstead/Pexels
AI is a controversial topic, offering opportunities to improve efficiency and explore new methods while also raising ethical and practical questions. Two sessions at UKCSJ24 explored the introduction of AI into science journalism and the thorny ethical and practical issues involved. One session, AI: Effective tools for science journalists or a source of fear for the future?, featured two experienced specialists – Dr Andy Ridgway, Senior Lecturer in Science Communication at the University of the West of England, and Subhra Priyadarshini, Chief Editor of Nature India & Global Supported Projects. The other – the AI Workshop: Using AI Effectively as a Journalist – was led by Jody Doherty Cove, Head of Editorial AI at Newsquest.
The ethical questions and opportunities of AI tools
AI's ability to generate content such as interview questions, or even entire articles, has sparked debate about its potential to erode the human touch that defines journalism. Sue Nelson, for instance, noted that while AI can help with tasks such as transcription, its uncontrolled use risks compromising the originality of the work. Similarly, Laura How from Chemical and Engineering News highlighted the ethical concerns of feeding sensitive data into AI systems without the informed consent of sources.
Journalists have always had a responsibility to protect the privacy of their sources and ensure the accuracy of their work. This responsibility is heightened in the context of AI, which can distort or misuse real data.
Another pressing issue is the origin of the data used to train AI. Many AI models are trained on copyrighted materials, such as books and articles, without the consent of the authors. This unauthorised use has sparked debate about economic harm to writers and other creators and the erosion of intellectual property rights. If journalism relies too heavily on AI without addressing these issues, it risks alienating content creators and undermining the industry's ethical foundations.
Creativity and the role of journalists
The limitations of AI go beyond ethical issues and lie primarily in its inability – at least for now – to replace human creativity and originality. Clara from Republic Magazine in Switzerland pointed out that relying on AI for interview questions could narrow the scope of research by perpetuating existing patterns in the data. This risks reinforcing established narratives and suppressing innovative perspectives. Similarly, Margaret Harris from Physics World warned that using AI to identify sources could amplify bias and miss different points of view.
The role of journalists as critical storytellers cannot be replaced by AI. While AI can speed up repetitive tasks, it cannot replicate the depth of human exploration or ask unexpected, thought-provoking questions. Maintaining this creative edge is critical to preserving the integrity and impact of journalism.
AI’s practical impacts and structural changes
AI is already changing modern newsrooms, sometimes with highly controversial results. For example, the Australian science magazine Cosmos has used AI to write content, a move associated with many job losses. This highlights AI's potential to directly affect jobs, especially when it is not used thoughtfully.
However, AI is not inherently harmful; it is merely a tool. In some cases, it can be used to increase efficiency rather than replace human roles.
Practical steps for navigating AI in journalism
To address these challenges and use AI effectively, the following steps are worth taking:
- Transparency and accountability: Organisations and publishers must disclose how and why AI is used. This includes, among other things, obtaining informed consent from sources.
- Education and training: Professional associations and media organisations need to invest in training so that AI tools are used responsibly and fairly. Initiatives such as the German Science Writers Association's educational course on the use of AI will help equip journalists with the skills to critically evaluate AI outputs.
- Ethical frameworks: Editorial staff should develop clear policies outlining acceptable uses of AI and addressing issues such as data privacy, bias and intellectual property rights.
- Public awareness: Educating writers and communicators about the value of human content creation can help counter the commercialisation of AI-generated content.
- Maintaining creativity: AI should complement, not replace, the creative aspects of journalism. Encouraging journalists to focus on storytelling, investigation and critical thinking ensures that the profession's core values are preserved.
A global study by the Reuters Institute at the University of Oxford found that a majority of respondents consider it important to disclose when AI has been used. Above all, the use of AI is a question of fairness.
AI presents both new opportunities and real challenges for journalism. While it can improve efficiency and reduce time spent on repetitive tasks, the ethical and practical issues raised by its use require careful consideration. By promoting transparency and education, journalism can benefit from AI while preserving the integrity and humanity of the profession.
Zoya Chernova is a freelance science writer and former researcher with a background in biochemistry. She writes about modern medicine, women scientists, biochemistry, and cats.
'The future of science TV, radio and podcasts', by Deborah Cohen
'How to best pitch for broadcast content in the UK and Germany', by Sharon Ann Holgate
'Nurturing the evolution of diverse science journalism', by Grey Enticknap
With thanks to our partners and sponsors that made the UKCSJ24 possible.
EurekAlert!, the ABSW's Lead Professional Development Partner
UK Research and Innovation Gold Partner, Excellence in Science & Technology Journalism
LifeArc Silver Partner, Excellence in Science & Technology Journalism
Yakult Bronze Partner, Excellence in Science & Technology Journalism