The "1984" of the crypto world: How Worldcoin reconstructs Big Brother's gaze with the Orb


Original title: Is World’s biometric ID model a threat to self-sovereignty?

Original author: Amin Haqshanas

Source:

Compiled by: Daisy, Mars Finance

Worldcoin claims to achieve financial inclusion through identity verification, but critics warn that it may sacrifice decentralization, privacy, and autonomy.

The cryptocurrency industry is never short of controversy, but few projects have drawn scrutiny as intense as Sam Altman's World project (formerly Worldcoin). The project promises to verify human uniqueness through iris scans and to distribute WLD tokens globally, touting itself as a tool for financial inclusion. Critics counter that its biometric methods are invasive and overly centralized, contradicting the ideals of decentralization and the spirit of digital privacy.

The core of the controversy is that biometric systems relying on proprietary hardware, closed verification methods, and centralized data pipelines cannot, by their nature, achieve true decentralization. "Decentralization is not just a technical architecture," Shady El Damaty, co-founder of the Holonym Foundation, told Cointelegraph, "but also a philosophy that advocates for user control, privacy, and autonomy. World's biometric model fundamentally contradicts this philosophy."

El Damaty pointed out that despite the use of tools such as multi-party computation (MPC) and zero-knowledge proofs (ZK), World’s reliance on custom hardware Orb and centralized code deployment undermines its claimed commitment to decentralization. "This design is essentially aimed at achieving its 'unique human recognition' goal, but the concentration of power will create single points of failure and control risks, ultimately undermining the core promise of decentralization."

A World spokesperson countered: "We do not use centralized biometric infrastructure," emphasizing that the World App operates non-custodially, so users always retain control over their digital assets and World ID. The project team stated that after the Orb generates the iris code, "the iris photo is sent to the user's phone in an end-to-end encrypted data packet and is immediately deleted from the Orb," and that the iris code is processed through anonymized multi-party computation, with "absolutely no personal data stored."
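To make the multi-party computation claim concrete: the general idea is that a secret (here, an iris code) is split into shares distributed across independent parties, so that no single party ever holds the biometric itself. The sketch below is not World's actual protocol; it is a minimal illustration of additive secret sharing over a finite field, with an arbitrary illustrative modulus and a stand-in integer for the iris template.

```python
import secrets

PRIME = 2**127 - 1  # field modulus; an illustrative choice, not World's parameter

def share(value: int, n_parties: int) -> list[int]:
    """Split `value` into n additive shares: any n-1 shares reveal nothing,
    because each looks uniformly random on its own."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Only the sum of ALL shares recovers the original value."""
    return sum(shares) % PRIME

iris_code = 0b1011001110101  # stand-in for a real iris template
parts = share(iris_code, 3)   # distribute across three independent parties
assert reconstruct(parts) == iris_code
```

Critics' point survives this design, however: whoever manufactures the hardware and operates the share-holding parties still sits in a position of trust.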

Privado ID and Billions.Network co-founder Evin McMullen stated that World’s biometric model is not "inherently contradictory to" the principles of decentralization, but still faces challenges in specific implementation aspects such as data centralization, trust assumptions, and governance mechanisms.

A familiar pattern of tech overreach?

El Damaty draws a parallel between OpenAI's large-scale collection of unauthorized user data and World's collection of biometric data. He sees both as instances of a common pattern among tech companies: data plundering in the name of innovation. He warns that such practices erode privacy rights under the guise of progress and normalize surveillance.

"The irony is evident," El Damaty pointed out, "OpenAI started by training its model using massive amounts of unauthorized user data, and now Worldcoin is extending this radical data collection approach into the realm of biometrics." In a 2023 California class action lawsuit, OpenAI and Microsoft were accused of scraping 300 billion words from the internet without consent, including personal data of millions of users (including children); in 2024, the Canadian Media Alliance also sued OpenAI for using its content to train ChatGPT without authorization.

World firmly rejects this analogy, emphasizing its independence from OpenAI and stating that it neither sells nor stores personal data, relying instead on privacy-preserving technologies such as multi-party computation and zero-knowledge proofs. Questions nonetheless surround its user registration process. Although the project says it ensures informed consent through multilingual guides, in-app learning modules, brochures, and a help center, critics remain skeptical. "World currently primarily targets people in developing countries, who are more easily enticed by incentives and often unaware of the risks of 'selling' such personal data," El Damaty warned.

Since its launch in July 2023, World has faced regulatory resistance in multiple countries. Governments in Germany, Kenya, Brazil, and others have expressed concerns about the security of user biometric data. The latest setback occurred in Indonesia, where local regulators temporarily suspended the company's registration certificate on May 5.

The risk of digital exclusion

As biometric systems like World's gain adoption, concern is growing over their long-term impact. Although the company touts its model as inclusive, critics argue that gating services behind iris scans may deepen global inequality.

"When biometric data becomes a prerequisite for accessing basic services, it effectively creates social stratification," El Damaty stated, "Those who are willing (or forced) to surrender the most sensitive information gain access, while those who refuse are excluded."

World insists that its protocol does not mandate biometric verification to access basic services: "Even without a verified World ID, certain functionalities can still be used." It further clarifies that the system employs zero-knowledge proofs (ZKPs) to ensure that actions cannot be traced back to a specific ID or to biometric data.
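The unlinkability claim rests on a standard ZKP building block called a nullifier: from a device-held secret, the user derives a distinct identifier per application context, so repeat actions within one context can be detected without different contexts being linkable to each other or to the biometric. The sketch below is a hash-based toy illustration of that idea only, not World's implementation; the secret value and scope names are invented for the example.

```python
import hashlib

def nullifier(secret: bytes, action_scope: str) -> str:
    """Derive a per-scope identifier: same secret + same scope gives the same
    value (so a second signal in that scope is detectable), while values from
    different scopes cannot be correlated without the secret."""
    return hashlib.sha256(secret + b"|" + action_scope.encode()).hexdigest()

user_secret = b"held-only-on-the-user-device"  # never leaves the phone
poll_id = nullifier(user_secret, "poll-2025")
claim_id = nullifier(user_secret, "airdrop-claim")

assert poll_id != claim_id                               # scopes are unlinkable
assert poll_id == nullifier(user_secret, "poll-2025")    # double-signal detected
```

In a real ZKP system the verifier additionally receives a proof that the nullifier was derived from a validly enrolled secret, without learning which one; a bare hash, as here, cannot provide that.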

There are also concerns that World could evolve into a surveillance tool, especially in authoritarian countries, if its stored biometric data were abused by authorities. World rejected this claim, emphasizing that its ID protocol is "open-source and permissionless," meaning that even government-run applications cannot associate user behavior with biometric data.

The controversy has also spread to the governance level. While World claims that its protocol is moving towards decentralization (such as open source contributions and the governance section in the white paper), critics argue that substantive user ownership is still lacking. "We need to establish systems that can verify humanity without the need for centralized storage of biometric or personal data," El Damaty pointed out, "this means adopting zero-knowledge proofs, decentralized governance, and open standards, empowering individuals rather than corporations."

The urgency of developing a secure identity authentication system is not without reason. As artificial intelligence technology continues to advance, the boundaries between human and non-human actors in cyberspace are becoming increasingly blurred.

"The risks at the intersection of artificial intelligence and identity verification are not limited to any particular type of government system or region," said McMullen from Privado ID. She pointed out that without reliable verification mechanisms for humans and AI agents, the digital ecosystem will face increasingly severe threats—ranging from misinformation and fraud to national security vulnerabilities.

McMullen added: "This is a national security nightmare—unaccountable, unverifiable non-human actors could now infiltrate global systems and networks, while traditional systems were never designed to address such verification and contextual logic."
