As iris scanners gain traction in the crypto world, explore the clash between innovation and surveillance in the race to verify humanity.
As biometric IDs enter the blockchain era, the very notion of digital selfhood is being rewritten. But what kind of future is being engineered: one of liberation, or one of quiet capture?
In an era defined by deepfakes, AI-generated influencers, and bots indistinguishable from humans, the question of who, or what, is behind a screen has never been more urgent. It’s no longer a philosophical exercise. It’s a design choice. And increasingly, that choice has to do with your body.
Scan your iris. Get crypto. That’s the premise of World (previously Worldcoin), the high-profile identity project co-founded by OpenAI’s Sam Altman. Using silver Orbs stationed in cities across the globe, the system maps a person’s iris to assign a “World ID”, a kind of universal proof of personhood. It promises a future of fair airdrops, Sybil-resistant voting, and even global Universal Basic Income (UBI). Just don’t blink.
For many, this isn’t innovation. It’s something else: a biometric Rubicon. Blockchain, long a haven for anonymity and cryptographic freedom, now finds itself entangled in a far more intimate ledger, the body.

Did you know? In 2020, Monero faced a large-scale Sybil attack where an adversary tried to deanonymize users by linking IPs to transactions. Thanks to Monero’s privacy tech, like Dandelion++ and ring signatures, the impact was limited. It was a sharp reminder of how vital node diversity is for privacy.
World is not alone. The rise of proof-of-personhood systems has sparked a biometric arms race. Proof of Humanity, Humanode, Idena, Anima, Passport XYZ (aka Gitcoin Passport) and others are all betting that crypto’s next frontier isn’t financial but biometric. In a world where bots swarm social platforms, AI impersonates celebrities, and Sybil attacks plague DAO governance, verifying that you are a real, unique, and living human has become a billion-dollar problem.
The logic is compelling: blockchain systems need a way to ensure each participant is a single person, not a farm of pseudonymous wallets. Traditional methods rely on Know Your Customer (KYC) checks: passport scans, utility bills, centralized databases. But these come with regulatory hurdles, geographic exclusions, and profound privacy trade-offs. Biometrics offer a sleek alternative: frictionless, universal, and hard to fake. Your body becomes your key.
Already, more than 12 million people across 160+ countries have enrolled in World’s system. In exchange for their iris data, users receive a digital ID and a handful of tokens. Lines have formed from Buenos Aires to Nairobi, Manila to Bangalore, sometimes hundreds deep, for the chance to join what its creators promise is the future of economic inclusion. The premise is techno-utopian: a global identity layer untethered from borders, banks, or bureaucracies. A kind of “citizenship of the internet,” anchored not in documentation, but in biology.

But the questions it raises are deeply grounded. Who governs this infrastructure? Who manufactures the Orbs? Who audits the code? Who stores the data, and under what jurisdiction? Can biometric systems, by nature tied to physical traits and hardware, ever truly be decentralized?
Biometrics are seductive. They’re fast, seamless, and unique. They reduce fraud and make user onboarding elegant. But they are not revocable. A stolen password is an inconvenience. A stolen iris is a lifetime vulnerability.
World claims it deletes raw biometric data after converting it into an irreversible “iris hash.” The hash, in theory, cannot be reverse-engineered. But critics argue the underlying infrastructure, especially the proprietary Orbs used for scanning, remains dangerously opaque.
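To make the idea of an irreversible hash concrete, here is a toy Python sketch. Every name and parameter below is invented for illustration; World’s actual iris-code pipeline is proprietary and unpublished.

```python
import hashlib
import secrets

def derive_iris_id(iris_template: bytes, salt: bytes) -> str:
    """Toy one-way derivation: the digest reveals nothing about the
    template, but the same template + salt always yields the same ID,
    which is what lets a system detect duplicate enrollments."""
    return hashlib.sha256(salt + iris_template).hexdigest()

template = secrets.token_bytes(256)   # stand-in for a captured iris code
salt = b"network-wide-public-salt"    # invented parameter

iris_id = derive_iris_id(template, salt)
assert derive_iris_id(template, salt) == iris_id  # deterministic
assert len(iris_id) == 64                         # 256-bit digest in hex

# Caveat: real biometric captures are noisy, so two scans of the same eye
# never match bit-for-bit. Production systems need fuzzy extractors or
# learned embeddings before any hashing step, which is exactly where the
# opaque, hard-to-audit machinery lives.
```

The caveat in the last comment is the crux of the critics’ argument: the hash itself may be irreversible, but the preprocessing that turns a noisy scan into a stable code happens inside hardware and models that outsiders cannot inspect.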
While the crypto industry is built on trustless design, biometric systems often require trust in opaque vendors, hardware manufacturers, and machine learning pipelines. And in lower-income communities, where most World enrollments are happening, the power dynamics of “consent” blur quickly.
In Kenya, where thousands lined up for scans, the government intervened. Authorities suspended World in 2023, citing concerns about data protection, coercion, and the legitimacy of mass biometric harvesting. A parliamentary report later deemed the project illegal. In May 2025, the Indonesian authorities said they had also suspended World following public complaints and concerns about data privacy and regulatory violations.
The deeper unease isn’t just privacy; it’s power. Who gets to write the rules of being “real” online? And who benefits when others must prove their reality?
Did you know? In 2021, Edward Snowden sharply criticized the project, then known as Worldcoin, warning: “Don’t catalogue eyeballs.” The privacy advocate slammed the idea of scanning users’ irises to distribute crypto, arguing that even hashed biometric data could be misused in the future. “The human body is not a ticket-punch,” he tweeted, voicing broader concerns about surveillance and consent in biometric crypto schemes.
There are two visions of identity emerging in the crypto space. One is anchored in cryptography: decentralized identifiers (DIDs), zero-knowledge proofs (ZKPs), and selective disclosure. The other is anchored in biometrics: iris scans, facial maps, and body-based verification.
The cryptographic camp, led by protocols like ION (Identity Overlay Network), Veramo, and Ceramic, advocates for user-controlled credentials that can be selectively shared without revealing underlying personal data. These systems emphasize consent, minimal disclosure, and reversibility. Emerging ZKP systems, such as Sismo and Zupass, offer an alternative: the ability to verify credentials or uniqueness without ever revealing the underlying data, proving you’re real without exposing your body.
Biometric systems, by contrast, rely on permanence. You don’t forget your iris like you might lose a seed phrase. But therein lies the risk: what is permanent cannot be undone. While most biometric projects centralize hardware or data custody, some, like Humanode, offer a rare exception, using encrypted local biometric checks to validate uniqueness without storing or transmitting personal data, and enabling one-person-one-node consensus through fully decentralized governance.
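A local biometric check can be sketched as a fuzzy 1:1 match that never leaves the device. The encoding and threshold below are invented for illustration (the ~0.32 Hamming-distance threshold comes from the iris-recognition literature); Humanode’s actual cryptobiometric scheme is different and far more involved.

```python
def hamming_fraction(a: bytes, b: bytes) -> float:
    """Fraction of differing bits between two equal-length bit strings."""
    assert len(a) == len(b)
    diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
    return diff / (len(a) * 8)

MATCH_THRESHOLD = 0.32  # illustrative; iris-code literature uses ~0.32

def is_same_person(stored_code: bytes, fresh_scan: bytes) -> bool:
    # Runs entirely on-device: only this boolean (or a cryptographic
    # proof of it) ever leaves the machine, never the codes themselves.
    return hamming_fraction(stored_code, fresh_scan) < MATCH_THRESHOLD

enrolled = bytes([0b10110010] * 32)  # hypothetical enrolled iris code
noisy    = bytes([0b10110011] * 32)  # same eye, 1 bit per byte flipped
stranger = bytes([0b01001101] * 32)  # unrelated code (every bit differs)

print(is_same_person(enrolled, noisy))     # True  (12.5% of bits differ)
print(is_same_person(enrolled, stranger))  # False (100% of bits differ)
```

The design point is where the comparison happens: matching locally and exporting only a verdict is what separates a privacy-preserving architecture from one that accumulates a central biometric database.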
We are building two futures: one where identity is abstracted into cryptographic math, and another where it is anchored in the unchangeable contours of your body. Which will win? More importantly, who decides?
Biometric identity systems are already moving into concrete use cases. Humanity Protocol, backed by Animoca Brands, is developing ID systems that link biometric proof to gaming and social experiences, and decentralized finance (DeFi) projects are piloting biometric-backed lending to reduce default risk.
It’s not that these tools don’t offer utility. It’s that they offer it on terms that often elude public scrutiny. Infrastructure is destiny. And identity, more than any other infrastructure, shapes the rules of belonging.
Let’s assume, for a moment, that biometric systems are designed ethically. That privacy claims are verified. That Orbs never malfunction. Even then, risks persist.
Did you know? The European Data Protection Board (EDPB) emphasizes that biometric systems must comply with GDPR and protect fundamental rights, warning against unchecked deployments that could infringe on privacy, especially when operated outside clear legal frameworks.
In the United States, biometric regulation is a legal patchwork. Illinois leads the way with its Biometric Information Privacy Act (BIPA), one of the toughest laws of its kind, which has fueled a wave of high-profile lawsuits against companies like Meta and Google. With no federal privacy law in place, legal protections vary widely by state, leaving both users and tech firms navigating an uncertain and inconsistent regulatory landscape. Notably, even hashed biometric data has been at the center of liability claims, raising the stakes for emerging identity protocols.
The very features that make biometrics useful (persistence, immutability, uniqueness) also make them dangerous. Crypto wanted to disintermediate banks. It may now be building a new class of intermediaries: biometric gatekeepers.
The question isn’t just technical; it’s ontological. What does it mean to be a person in a networked system? What counts as “real”? And who has the authority to grant that reality?
Blockchain once promised liberation from identity. In the early days, you were your key, your wallet, your pseudonym. Today, to participate fully in Web3, you may soon be asked to scan your eye, prove your humanity, or submit to behavioral surveillance.
This isn’t just a shift in authentication. It’s a shift in worldview.
In other words, digital identity is not neutral. It encodes values. And as crypto becomes infrastructure—governing not only money but voting, reputation, and access—the identity systems it builds will shape the very contours of citizenship in the digital age.
Did you know? Legal scholar and blockchain researcher Primavera De Filippi has observed that digital identity systems are not merely technical tools but also shape how individuals perceive themselves and are perceived by others. In her work, she discusses how these systems can influence personal autonomy and societal structures.
Crypto’s identity crisis is not merely about security or user experience (UX). It’s about power, permanence, and what kind of society you are willing to encode into protocol.
Biometrics may very well solve some of the thorniest problems in decentralized systems. They might enable UBI, ensure voting integrity, and onboard billions. But they also carry within them the seeds of coercion, overreach, and the quiet erosion of what it means to be free.
You must ask not only what works, but what endures. And what you are willing to lose for the sake of convenience, certainty, or speed. Crypto once promised us freedom through code. You must now decide whether that freedom includes the right not to scan.
