TechTalk
AI, risk – Snowden cautions Asia’s digital economies
Developers, policymakers should avoid designing systems that enshrine conformity as virtue
Tom King   23 Jun 2025
Edward Snowden

In a tightly packed 15-minute keynote delivered remotely, whistleblower Edward Snowden jolted the SuperAI 2025 conference audience in Singapore last week with a provocative forecast of our potential algorithmic future.

Speaking not as a vendor but as a global privacy advocate, Snowden laid out his view of how artificial intelligence (AI) is accelerating mass surveillance, framing it not as a possibility but as an inevitability.

Snowden began by charting the evolution of surveillance since his own 2013 revelations, noting that where once it took an hour to process a minute of video, today’s AI systems, such as OpenAI’s Whisper speech-recognition model, can process 30 hours of video in a single hour.

This kind of processing, he warns, will soon become cheap enough to run “on chip”, embedded in cameras in malls, hospitals and public transport hubs, a prospect particularly relevant to high-surveillance, tech-forward Asian cities such as Singapore, Seoul and Shanghai.

Footage to fingerprints

Snowden’s central argument was that the definition of “public utterance” is expanding. Voice, face and behaviour, he says, are all becoming trackable across devices and physical space.

Chinese surveillance firms Hikvision and Dahua, he points out, are already selling off-the-shelf systems with cross-camera tracking and outfit recognition. When paired with voice biometrics, this forms a seamless, ambient surveillance web, one that could soon understand not just what you say, but where, when and to whom.

For Asia’s retail and finance sectors, this has direct implications. As consumer analytics move from online to physical environments, financial institutions may be tempted to use AI-powered behavioural modelling for creditworthiness, risk assessment and fraud detection.

However, Snowden cautions, systems trained to normalize behaviour may also penalize as “abnormal” those who deviate from the algorithmic mean.
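The mechanism Snowden describes can be made concrete with a toy sketch. The snippet below (purely hypothetical; the function, data and scoring rule are illustrative assumptions, not any real institution’s model) scores customers by how far their behaviour sits from the population mean, showing how an atypical but perfectly legitimate customer ends up with the highest “risk” score simply for being different:

```python
# Hypothetical sketch: a naive "risk" score defined purely as distance
# from the population mean, to illustrate how such models penalize deviation.
from statistics import mean, stdev

def zscore_risk(spend_history: list[float], population: list[list[float]]) -> float:
    """Score a user's average spend by its absolute z-score against the
    population of per-user averages. Higher score = flagged as 'riskier',
    regardless of whether the deviation is actually harmful."""
    averages = [mean(user) for user in population]
    mu, sigma = mean(averages), stdev(averages)
    return abs(mean(spend_history) - mu) / sigma

# Three 'typical' spenders and one atypical (but legitimate) frugal saver.
population = [[100, 110, 95], [105, 98, 102], [99, 101, 100], [10, 12, 9]]
typical_score = zscore_risk(population[0], population)
atypical_score = zscore_risk(population[3], population)
# The outlier scores far higher than the typical user, purely for deviating.
assert atypical_score > typical_score
```

Nothing in the score distinguishes fraud from frugality: the model encodes “different from average” as “risky”, which is exactly the conflation Snowden warns about.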

Financial risk of being different

This normalization, Snowden argues, carries enormous consequences. In a world increasingly driven by algorithmic decision-making, he asks: “Can you get a mortgage?” And, sarcastically: “Will the dating app match you with anyone but serial killers?”

When AI systems reduce people to data points, he adds, human individuality, the source of innovation and progress, risks being treated as noise or, worse, as risk.

This is especially pressing, Snowden says, in Asia’s fast-growing fintech and HR tech sectors. The drive to automate lending, hiring and screening via AI can easily become a mechanism for social control, suppressing dissent or even creative deviation under the guise of efficiency.

Snowden’s closing remarks were both philosophical and urgent. “The average is not the ideal,” he says. “It is the bad habit, the skipped workout, the worthless filler episode.” AI developers and policymakers, particularly in innovation-heavy Asia, he states, should avoid designing systems that enshrine conformity as virtue.

Instead, he advocates for freedom to be embedded in AI governance. “If we do not have freedom from the system,” he warns, “we will be reduced inevitably to simply pieces of it.”

Ultimately, Snowden calls on Asia’s leaders and technologists to ensure that freedom of choice isn’t engineered out of society in the name of optimization. As he puts it, “the average is not the ideal,” and a future built on such logic could be efficient but profoundly dehumanizing.