The Fake News Pandemic
Covid-19 lessons may help

Introduction
We are nearing a tipping point in how we consume information. Truth is steadily losing ground to falsehood. Misinformation, disinformation, and malinformation are spreading at pandemic scale—rapid, contagious, and increasingly difficult to contain. Generative AI has supercharged the problem, making falsehoods cheaper to produce and more convincing than ever. Social media platforms amplify them at lightning speed. As the saying goes, a lie travels halfway around the world while truth is still putting on its shoes.
A clear distinction matters. Misinformation is false but not intended to harm. Disinformation is false and deliberately crafted to deceive. Malinformation is true information weaponized to harm. For example:
“Drinking hot lemon water kills Covid-19 instantly” is misinformation—false, but not malicious.
Fabricated Chinese-language videos portraying PM Lawrence Wong negatively would be disinformation—falsehoods designed to deceive.
Saying “Political candidate X failed O-level mathematics years ago” is malinformation—true, but disclosed to spoil election prospects.
Understanding these differences is the first step. The second is to treat fake news as we treated Covid-19: as a public health crisis requiring layered defences. Six parallels stand out.
Quarantine
When someone was infected, we isolated them to slow transmission. Similarly, if a video or message looks suspicious, isolate it. Delete it. Or at least delay forwarding it.
Better still, cultivate healthy scepticism. Treat unverified news as contagious. Tarrying a little often allows the truth to catch up.
Contact Tracing
During the early days of Covid-19, contact tracing was swift and systematic. People were alerted when exposed so they could monitor symptoms.
In the information ecosystem, platforms such as WhatsApp and Instagram should be required to label suspicious content and track its propagation. Users should be alerted when they encounter potentially false material.
Platforms should also maintain digital logs to identify “superspreaders” and coordinated disinformation campaigns. When virality crosses a predefined threshold, a mandatory cooling-off period—say, a 12-hour pause before forwarding—could curb runaway spread.
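The cooling-off rule described above can be sketched in a few lines. This is purely illustrative: the threshold values, field names, and pause duration are assumptions for the sake of the example, not any platform's actual policy or API.

```python
from dataclasses import dataclass

# Illustrative assumptions, not real platform parameters.
VIRALITY_THRESHOLD = 10_000      # forwards per hour that triggers the pause
COOLING_OFF_SECONDS = 12 * 3600  # the proposed 12-hour pause

@dataclass
class MessageStats:
    forwards_last_hour: int = 0
    cooling_off_until: float = 0.0   # epoch seconds; 0 means no pause active

def may_forward(stats: MessageStats, now: float) -> bool:
    """Return True if forwarding is allowed at time `now` (epoch seconds)."""
    if now < stats.cooling_off_until:
        return False                 # still inside the mandatory pause
    if stats.forwards_last_hour >= VIRALITY_THRESHOLD:
        stats.cooling_off_until = now + COOLING_OFF_SECONDS
        return False                 # threshold crossed: start the pause
    return True
```

The point of the sketch is that the rule is cheap to enforce at forwarding time: no content analysis is needed, only a propagation counter of the kind platforms already keep.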
Rapid Test Kits
Rapid test kits allowed Singapore to move beyond DORSCON Orange by empowering individuals to test and act responsibly.
We need the equivalent for information. Singaporeans should be able to quickly check whether a video is deepfaked or a message is a scam. The ScamShield helpline is a start, but its scope should be expanded to cover all forms of digital media.
Ideally, a free and accessible verification app would sit on every device. Privacy safeguards must be carefully designed: Should the app scan all messages automatically, or only those selected by the user? Should flagged content trigger automatic reporting? These options should remain configurable.
Empower the individual. Scale the defence.
Masking Up
Masks and social distancing reduced exposure in physical spaces. In the digital realm, the equivalent is limiting exposure to algorithmic noise.
Spend less time doom-scrolling on TikTok. Follow credible sources on Instagram. Curate your feed deliberately, and mute content pushed from accounts you do not follow.
Today’s social media algorithms are opaque and optimized for engagement and profit—not for user well-being. Platforms should be mandated to offer granular controls: filter by topic, by source, by country of origin, by virality. Critics may warn of echo chambers. But a consciously chosen information diet is better than one dictated by opaque algorithms.
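The granular controls proposed above amount to a user-configurable filter over the feed. A minimal sketch, in which the post fields and filter criteria are hypothetical stand-ins rather than any platform's real data model:

```python
from dataclasses import dataclass

# Hypothetical post model; fields are assumptions for illustration.
@dataclass
class Post:
    topic: str
    source: str
    country: str
    shares: int          # crude proxy for virality

def filter_feed(posts, *, topics=None, blocked_sources=(), max_shares=None):
    """Keep only posts matching the user's chosen topic, source, and
    virality filters; None means the filter is not applied."""
    result = []
    for p in posts:
        if topics is not None and p.topic not in topics:
            continue
        if p.source in blocked_sources:
            continue
        if max_shares is not None and p.shares > max_shares:
            continue
        result.append(p)
    return result
```

The design choice worth noting is that the user, not the ranking algorithm, supplies the criteria: the platform's job shrinks to honest filtering rather than engagement-optimized selection.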
Vaccination
Vaccines train the immune system to recognize threats. We can train our minds the same way.
Learn to spot tell-tale signs: mismatched image resolution, unnatural blurring, or flickering artifacts. Ask critical questions: Is the message provoking fear or greed? Is there an urgent call to action? Does the content make sense contextually? A video of Steve Jobs promoting a Samsung phone should raise alarm bells even without visible glitches.
Bad actors often modify existing content rather than create it from scratch, so a reverse-image search can often uncover the original source material and reveal exactly what was changed.
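The idea underlying reverse-image search is that similar images produce similar fingerprints, so a lightly modified copy can be matched back to its source. Real systems use perceptual-hashing libraries on actual image data; the toy version below, operating on a flat list of grayscale values, is only meant to show the principle.

```python
def average_hash(pixels):
    """Toy perceptual hash: 1 for each value above the mean, else 0.
    Small edits to an image barely change the hash; a different image
    produces a very different one."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Number of differing bits; a small distance suggests the same
    underlying image, so the original can be located and compared."""
    return sum(a != b for a, b in zip(h1, h2))
```

A doctored copy of a photo lands a bit or two away from the original's hash, while an unrelated image lands far away, which is what lets a search index surface the source for side-by-side comparison.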
However, mental inoculation is not foolproof. Vaccines do not eliminate infection; they reduce severity and spread. Likewise, critical thinking will not stop every falsehood from taking hold—but if widely adopted, it can create a form of herd immunity against infodemics.
Lockdown
At the height of Covid-19, governments imposed lockdowns. In information space, a lockdown could mean temporarily suspending platforms or restricting certain digital services. Circumvention could carry legal consequences. Such measures, however, are blunt instruments—disruptive in the short term and devastating in the long term.
More nuanced options may prove effective: restricting video while allowing text, separating essential government services from recreational platforms, and throttling non-essential traffic. Unfortunately, today’s internet infrastructure is not designed for such precision. All bits are treated equally. Without new internet protocols, imposing an information lockdown is like cracking a nut with a sledgehammer.
Conclusion
Combating falsehoods is both a personal duty and a collective responsibility. The Covid-19 pandemic showed us that layered defences—individual discipline, technological tools, institutional safeguards, and calibrated government intervention—can contain even a fast-spreading threat.
The fake news pandemic demands no less.
If we act now, we may yet give truth enough time to put on its shoes and get out the front door.