Don't look into the Orb

Worldcoin is an exploitative crypto project with a new coat of AI paint

Smiling people trying to entice you to peer into the Orb. Source: Worldcoin

Have you peered into the metallic Orb to trade your biometric data for $50 worth of crypto tokens? If not, you’re not alone. Worldcoin claims about 2 million people have had their irises scanned so far, but back in 2021 co-founder Sam Altman — yes, the same one from OpenAI — said they’d have reached a billion by now. Talk about missing a target.

But that doesn’t mean the company doesn’t have grand ambitions. Altman hopes you’ll eventually be among the 8 billion people in the company’s database. That target is a long way away, and there are plenty of reasons to be skeptical that it will ever be reached — or that it would be a positive development if it is.

What is Worldcoin?

Worldcoin was founded in 2019 and rode the wave of cryptocurrency hype in the years that followed. The idea is that silver orbs positioned around the world will scan people’s irises, then connect those scans to unique World IDs, creating a system of authentication — or “proof of personhood” — beyond the government-administered methods we use today. That will supposedly enable a new financial system, a new means of service delivery, and possibly the infrastructure for the payment of a crypto-based universal basic income (UBI).

Remember, the crypto ideology is all about enabling a libertarian world where the core functions of society are transferred to a “decentralized” computer network that’s actually under the control of a small number of powerful entities who hope to use tech libertarian delusions to their financial advantage. Worldcoin, for example, claims to be decentralized, but is actually centralizing all of the biometric data it collects in a shady firm controlled by powerful Silicon Valley figures with the funding of Andreessen Horowitz, not to mention early support from the collapsed crypto hedge fund Three Arrows Capital and disgraced FTX founder Sam Bankman-Fried.

The final design for the Orb. Source: Worldcoin

However, the air went out of the crypto bubble a while ago and we’re now in a moment of AI hype that began when OpenAI unleashed ChatGPT onto the world last November, and Altman paired it with a compelling narrative of AI transformation. Now when he’s asked about Worldcoin, he doesn’t spend as much time on the crypto angle — though Worldcoin crypto tokens are still a key part of the pitch. Instead, he talks about the need to authenticate real humans in a world where AI tools can be used to create fake identities, computers with human-level intelligence could be just around the corner, and rapidly improving AI will be able to fund a UBI.

To be clear, these are things Altman and the company are saying; that doesn’t mean they’re true. And despite all those grand claims, good luck finding any details on how they’re all supposed to work out in practice. Ultimately, Altman’s promises are just another way to justify a tech product that shouldn’t exist.

Orbs powered by exploitation

Worldcoin has been in beta for a while, trialing its Orbs in the Global South like a true tech colonizer. At the end of July, the project officially launched with plans to roll out up to 1,500 Orbs to 35 cities spanning 20 countries, where people could have their eyes scanned to create a World ID in exchange for 25 Worldcoin crypto tokens — unless there’s regulatory uncertainty, as in the United States. We’ll see how that works out though, since as few as 150 Orbs may actually be in service right now.

The experience of the past few years provides many reasons to be skeptical of what Worldcoin is actually doing with its Orbs, iris scans, and crypto tokens. Reporting from MIT Technology Review and Buzzfeed News last year examined the company’s operations in countries throughout Africa, Asia, and South America where it recruited locals to be Orb operators and set up a system where they’d convince people to have their eyes scanned in exchange for Worldcoin tokens, free t-shirts, local currency, and even the chance to win a pair of Apple AirPods — whatever would get users to part with the biometric data.

Operators and users told the journalists that Worldcoin hadn’t held up its end of the bargain. In some cases, operators were paid pennies for new signups, meaning they had no incentive to spend time explaining what people were agreeing to by having their irises scanned. MIT Technology Review spoke to an operator in Kenya who estimated about 40% of users had privacy concerns that Worldcoin didn’t have answers for, and it found many users suspected the project was a scam but signed up anyway because they needed the money.

Worldcoin also appears to use the operators for plausible deniability and to take the heat when its promises aren’t fulfilled. Users complained the company didn’t deliver on the financial benefits it offered, and some never received their money at all. According to Buzzfeed News, their ire was directed at operators, not the company, since those were the people who signed them up. “This was all a lie this worldcoin is the same as other scams. Prove me wrong if l am talking lies,” wrote one user who was fed up with the delays in accessing the promised money.

The company preyed on people it knew would part with their biometric data for a small payment out of desperation. In some cases, users didn’t even understand what digital currencies were, as an operator in Sudan told MIT Technology Review. The journalists at that publication put it very plainly after asking Worldcoin CEO Alex Blania about what they’d found. “For Worldcoin, these legions of test users were not, for the most part, its intended end users,” they wrote. “Rather, their eyes, bodies, and very patterns of life were simply grist for Worldcoin’s neural networks.”

In short, the company didn’t care about the impacts on the people testing and operating its products — just like the Kenyan workers paid $2 an hour to train ChatGPT, who now report mental health impacts from the graphic content they were made to read. They were only necessary to train systems that the founders hope will really be used by more affluent tech libertarians and people in the Global North. That’s exactly why the company hasn’t addressed the serious privacy concerns with its product, which has already fueled the emergence of a black market in biometric data and Worldcoin accounts.

Worldcoin’s false promises

Despite promising empowerment and decentralization, Worldcoin is no different from any other tech company whose ultimate goal is profit and control. If Worldcoin’s ambitions are accurate — and that’s a big if — what it’s really trying to do is supplant the state with a global identification system that it says will be decentralized, but is ultimately controlled by the people who run the company.

But even if it doesn’t form the basis for a new global means of authentication, finance, and the distribution of a UBI, the business model is pretty easy to understand. Right now, it says the focus is making money on the Worldcoin token when it appreciates in value, and at least 10% of coins were already allocated to Worldcoin employees, with another 10% to Andreessen Horowitz even before it began trading. In that sense, it’s like any crypto project where investors hope to cash out when the token peaks — which is exactly what venture capitalists did throughout the crypto boom.

The company also controls a vast trove of biometric data that it says it has no plans to sell, but I wouldn’t be surprised if the pressure to maximize profits causes that promise to be watered down. Worldcoin also said it would register as a non-profit and become decentralized, but has provided little detail on how that will happen. And let’s not forget what happened with Altman’s other company, OpenAI: it was a non-profit before forming a for-profit arm, hiding its data, and signing a $10 billion deal with Microsoft.

One thing that stands out to me is how we’re repeating the hype cycle of ten years ago, but with an even greater degree of corporate control this time. As I discussed in a recent piece about ChatGPT, robots and AI were supposed to take most people’s jobs in the mid-2010s, and that caused a lot of people to start advocating for a UBI to ensure people could still survive in the jobless future. But those predictions never came to pass, and most of the UBI support dwindled along with them.

Now, ChatGPT has brought back the narratives about AI taking our jobs, and the need for a UBI along with them, and Altman has positioned himself to reap the benefits. His AI company gets more attention than any of the others, which has helped fuel its value and ensure it can sign deals with major companies, even as users report its quality is declining. With Worldcoin, he’s set up a complementary company that he can present as part of the solution to some of the problems OpenAI is creating — and that could distribute that UBI, if only enough people get their eyes scanned.

Shut it down

As usual, Worldcoin is a power play — and one that needs to be challenged. It’s far less likely the company will gain the prominence of OpenAI, but in the meantime it’s collecting very sensitive information from people based on false promises. If the company is just a means to pump a token and cash out, that’s bad enough; but if it really does strive to achieve its grander ambitions, that would have serious consequences for us all.

Dan McQuillan, author of Resisting AI, has warned that expanding the scope of AI systems and implementing them into more essential services will ultimately shape the lives of people all around the world, and curtail the choices available to them. We’ll all become the Uber drivers kicked off the app by an algorithm, unable to even speak to a human to get the problem resolved. These tools take away our agency and shift power to those who control them. As McQuillan put it, “AI can decide which humans are disposable.”

There’s a similar risk in a project like Worldcoin: if it does become a global identification system with other services built on top of it, there will be major equity problems regardless of whether Altman assures us otherwise. Operators have already told journalists the Orbs frequently break, don’t detect people properly, or even allow people to be scanned multiple times, while a user in Chile told MIT Technology Review that once he lost access to his account, the company couldn’t help him retrieve it.

Worldcoin isn’t our path to a better world, even if it did work how Altman promised. It fuses the worst parts of the crypto industry’s libertarian ambitions with the empty justifications that undergird an AI boom that already seems to be waning. Regulators in Kenya and Europe are paying attention. They should shut it down before more people are convinced to peer into the Orb, and set a precedent for their counterparts elsewhere to follow.