By Shruti Menon Seeboo
In a world saturated with information and misinformation, the role of a journalist has never been more challenging or more critical. Few understand this complex landscape as intimately as James R. LeMay. A media veteran with a career that has shaped news coverage on a global scale, LeMay brings a unique perspective forged by experience in both traditional newsrooms and the rapidly evolving world of digital media.
As a Senior Consultant for Frank N. Magid Associates, he advises broadcast executives and journalists worldwide. His clients are a who’s who of major media, including the NBC O&O group and Cox Media Group. This high-level consultancy is the natural next step in a journey that saw him serve for a decade as Deputy Managing Editor at CNN, where he was instrumental in building the network’s pioneering multiplatform approach to news. He has not only reported on history but has helped to create the very channels through which it is delivered.
Beyond his impressive titles and numerous awards—including Emmys, Peabodys, and Edward R. Murrow awards—LeMay is a dedicated advocate for global journalistic integrity. He frequently travels to countries like Uganda, Iraq, and Saudi Arabia at the invitation of the U.S. State Department to provide training and insights to professionals in emerging media markets. This work underscores his belief that the core principles of truth and accuracy are universal and essential for a functioning democracy.
With the proliferation of AI tools and the increasing threat of disinformation, the challenges LeMay has addressed throughout his career are more relevant than ever. In this interview, he offers his expert analysis on how journalism and AI are intersecting, the strategies newsrooms can use to fight back, and the vital role of media literacy in a world where “seeing is no longer believing.” Excerpts:
1. With your extensive experience at CNN, how do you see the role of traditional broadcast media evolving in covering elections across Africa, especially in an era dominated by social media?
Broadcast remains the backbone of election coverage in Africa. TV and radio still carry unmatched trust and reach, but they must adapt to a digital-first audience. The future lies in blending broadcast credibility with online speed—using live fact-checking, interactive dashboards, and social media engagement. When voters see familiar broadcasters actively countering online rumors, those broadcasters stay relevant. Broadcast can’t match the viral speed of Twitter or WhatsApp, but it can provide the verification and depth those platforms lack. That combination is what will keep it indispensable.
2. How do you view the relationship between international news organizations, like CNN, and local African media outlets during election coverage?
International and local outlets need each other. Global networks bring visibility, reach, and tech resources; local reporters bring context, languages, and cultural literacy. The best collaborations are genuine partnerships—fact-checking desks, co-produced explainers, and joint investigations. Done well, these projects elevate African voices rather than overshadowing them. The result is richer reporting for both local citizens and global audiences, with fewer blind spots.
3. What are the most effective strategies for local newsrooms to combat the spread of false information in real-time?
Preparedness is everything. Newsrooms should set up verification desks that monitor social media and push out corrections quickly—on air, online, and in local languages. Partnerships with groups like Africa Check strengthen trust. The rule is simple: corrections must travel as far and as fast as the misinformation. That means using radio, WhatsApp, and shareable graphics, not just long reports. Speed and clarity, without sacrificing accuracy, are the winning formula.
4. Are there success stories of innovative election coverage in Africa?
Ghana’s 2020 elections stand out—broadcasters paired live dashboards with fact-checking that undercut rumors in real time. In Kenya, newsrooms used WhatsApp groups to feed communities verified updates directly. South Africa has combined radio call-ins with online fact-checking to reach both urban and rural audiences. These aren’t high-budget experiments. They show that when technology is matched with trusted voices, innovative coverage is possible anywhere.
5. With the rise of generative AI, what tools have been used to mislead voters in African elections?
Deepfakes, voice cloning, and AI text are the new disinformation toolkit. We’ve seen fake videos of candidates making false statements, cloned voices in robocalls telling people not to vote, and AI-written posts flooding platforms with persuasive lies. In Nigeria’s 2023 elections, manipulated images and edited clips spread widely. The danger isn’t just the tech itself—it’s how cheaply and quickly these tools allow disinformation to be produced and scaled.
6. What tools can journalists use to detect manipulated content in real time?
Most newsrooms don’t need expensive forensic labs—they need practical tools. Reverse image searches, InVID for video, and metadata checkers can catch a lot. AI detectors like Reality Defender are emerging, though access is uneven. More important is workflow: cross-check with eyewitnesses, verify with local sources, and know the signs of fakes—lip-sync issues, odd phrasing, unnatural shadows. It’s about resilience, not perfection. Reporters who can quickly spot red flags will blunt the impact of manipulated content.
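For readers curious what a basic metadata check involves, the sketch below is a minimal, illustrative example rather than any specific tool LeMay names. It assumes Python with the Pillow imaging library and a hypothetical local file, suspect.jpg, and simply prints whatever EXIF fields the image carries; missing or inconsistent fields are often the first clue that a photo has been edited or re-encoded.

```python
# Minimal sketch of a metadata check, assuming the Pillow library
# (pip install Pillow) and a hypothetical file "suspect.jpg".
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("suspect.jpg")
exif = img.getexif()

if not exif:
    # AI-generated or re-encoded images frequently carry no EXIF data at all.
    print("No EXIF metadata found.")
else:
    for tag_id, value in exif.items():
        # Translate numeric tag IDs into readable names (Software, DateTime, Model, ...)
        name = TAGS.get(tag_id, tag_id)
        print(f"{name}: {value}")
```

Fields such as Software or DateTime that contradict an image’s claimed origin are exactly the kind of red flag LeMay describes.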
7. How can journalists build public trust when “seeing is no longer believing”?
Trust comes from transparency. Journalists should explain how they verify stories, show their sources, and admit what they don’t know. Fact-checking must be visible, not hidden. Audience engagement—through town halls, WhatsApp groups, or call-in shows—also reinforces accountability. In an AI-driven world, audiences don’t just want information; they want proof. Newsrooms that open their processes will stand out as credible.
8. How is AI driving astroturfing and microtargeting in elections, and what role should media play?
AI is powering fake grassroots campaigns—endless bot comments, hashtags, and posts that simulate real voter sentiment. At the same time, microtargeting uses data to push tailored messages to narrow groups, often exploiting ethnic or religious divides. The media’s job is to shine light on these tactics. By exposing patterns of coordinated manipulation, journalists can help voters see what’s real and what’s manufactured. Transparency is the best antidote to hidden persuasion.
9. How can social media be used for positive voter engagement?
Social platforms can be civic tools if used well. WhatsApp, Facebook, and Telegram are ideal for distributing fact-checked guides, voting instructions, and real-time updates in multiple languages. Partnerships with influencers and community leaders add credibility. But moderation matters—unchecked, the same channels can fuel rumors. Done responsibly, social media can educate voters faster and more cheaply than almost any other medium.
10. What advice do you have for frontline journalists covering high-stakes elections?
Safety first. Always know your exits, your contacts, and your backup plan. Avoid volatile crowds. Stick to verified facts, diversify your sources, and resist pressure from political actors. The psychological toll is real, so practice self-care and lean on colleagues for support. Objectivity and safety aren’t in conflict—they’re both essential if journalists are to do their job under intense scrutiny.
11. How should newsrooms prepare journalists for the risks of covering AI-targeted elections?
Newsrooms need to take both physical and digital threats seriously. Provide protective gear, secure communication tools, and trauma support. Reporters face online harassment, surveillance, and attacks built on deepfakes. Training in cybersecurity and mental health resilience is just as critical as political analysis. Protecting journalists isn’t a luxury—it’s the foundation of credible coverage.
12. What role do media training and digital literacy play in this new era?
They’re the frontline defense. Journalists need training in verification, cybersecurity, and ethical AI use. Citizens need literacy programs to recognize falsehoods and seek credible sources. NGOs, universities, and media associations are making progress, but programs must be multilingual and reach beyond cities to rural voters. When both journalists and the public are equipped, democracies are far more resilient against manipulation.



