Artificial Intelligence, particularly tools like ChatGPT, can be a powerful companion for planning, brainstorming, and even sharpening a rough draft.
When used responsibly, it can elevate journalism by helping reporters think more deeply, write more clearly, and save time on technical tasks. But what it cannot, and must not, replace is the creativity, poetic expression and core ethics that have long defined journalism: curiosity, fact-checking, human storytelling and writing with inspiration.
Yet, over the past few months, I have watched with shock, disgust, and deep disappointment how AI writing tools, particularly ChatGPT, are being misused in Lesotho’s media space. Stories with the unmistakable “DNA” of machine-generated text are creeping onto our pages. What began as a quiet concern within my own newsroom became glaring when I looked outward and realised that many of our news outlets are falling into the same trap.
Instead of sharpening their craft with AI tools, too many journalists are surrendering their pens to them. The result? Bland, repetitive, soulless articles that sound polished but lack depth, originality or truth.
The worst example I encountered recently was a front-page story that carried a damning headline, but the content beneath it screamed “AI template”: no fact-checking, no editing, no copy-tasting, just filler text printed as though it were newsworthy truth.
This is not journalism.
As an editor, media manager, trainer and journalist, I feel compelled to issue a gentle but urgent warning: the careless use of AI tools is exposing the laziness of Lesotho’s journalism. And if left unchecked, it risks destroying the credibility of our profession altogether.
I speak not from theory but from practice. I have used AI tools, including ChatGPT, a few times to reshape raw copy submitted by reporters, and I regularly use Grammarly to ensure the grammar is polished and ready for consumption. At their best, these tools can help untangle messy drafts and give structure where it was lacking. But what worries me is how many reporters now abdicate their responsibility to think, to research and to write with authority. Instead, they lazily type a half-baked prompt into ChatGPT, wait for the machine to spit out paragraphs, slap their byline on top and call it a day.
Too often, these same bylines belong to people who do not even understand the core issue of the articles they are signing. How can one claim authorship over a story they have not lived with, interrogated, or contextualised? How can journalism fulfil its duty to truth when the very people tasked with it show no intention of fact-checking, cross-verifying, or adapting machine output back into human storytelling?
These incredible tools have not failed us; we have failed ourselves for refusing to grow in the digital age. The technology is not the enemy. The real problem is the mediocrity it is exposing.
For years, some of our reporters have relied too heavily on press releases, avoiding the hard work of digging deeper. Now, with AI, that laziness has a faster, shinier tool. But instead of elevating the quality of our work, it is making the shortcuts painfully obvious.
Journalism has always been more than words on a page. It is about context, accountability and voice. It is about standing in the rain to get the story, sitting with communities to hear untold truths and asking the uncomfortable questions that others avoid. AI can never replicate that; it cannot feel the weight of silence in a grieving family’s home. It cannot smell the dust of a protest march. It cannot probe evasive politicians until cracks appear in their answers.
Only journalists can do that.
So, where do we go from here? First, we must remind ourselves and our newsrooms that AI is a tool, not a crutch. Reporters must own their stories, research them, understand them, and then, if need be, use AI tools to polish them.
Editors must sharpen their oversight, spotting generic writing, checking facts rigorously and rejecting laziness masked as productivity. Media trainers must prepare the next generation not just to use new technologies, but to uphold the ethics that define us as truth-tellers.
The credibility of journalism in Lesotho hangs on this balance. If we continue to misuse AI, we will erode public trust, dilute the power of storytelling and reduce our newsrooms to hollow echo chambers of machine text. But if we embrace it wisely, pairing its efficiency with human curiosity and creativity, then we have an opportunity not just to survive but to thrive.
The choice is ours. And it begins with honesty: AI is not exposing the strength of Lesotho’s journalism. It is exposing its laziness.

Co-Owner and Managing Editor of Newsday Media Lesotho. PEPFAR Media Champion, award-winning features journalist, investigative journalist and REPSSI CAB member. Maseru, Lesotho.