Deception, exploited workers, and free cash: How Worldcoin recruited its first half a million test users
In the end, it was something that Blania said, in passing, during our interview in early March that helped us finally begin to understand Worldcoin.
“We will let privacy experts take our systems apart, over and over, before we actually deploy them on a large scale,” he said, responding to a question about the privacy-related backlash last fall.
Blania had just shared how his company had onboarded 450,000 individuals to Worldcoin—meaning that its orbs had scanned 450,000 sets of eyes, faces, and bodies, and stored all that data to train its neural network. The company recognized this data collection as problematic and said it aimed to stop, yet it had not extended the same privacy protections to these early users. We were perplexed by this seeming contradiction: were we the ones lacking the vision to see the bigger picture? After all, compared with the company’s stated goal of signing up one billion users, perhaps 450,000 is small.
But each one of those 450,000 is a person, with hopes, a life, and rights that have nothing to do with the ambitions of a Silicon Valley startup.
Speaking to Blania clarified something we had struggled to make sense of: how a company could speak so passionately about its privacy-protecting protocols while clearly violating the privacy of so many. Our interview helped us see that, for Worldcoin, these legions of test users were not, for the most part, its intended end users. Rather, their eyes, bodies, and very patterns of life were simply grist for Worldcoin’s neural networks. The lower-level orb operators, meanwhile, were paid pennies to feed the algorithm, often grappling privately with their own moral qualms. The massive effort to teach Worldcoin’s AI to recognize who or what was human was, ironically, dehumanizing to those involved.
When we put seven pages of reporting findings and questions to Worldcoin, the company’s response was that nearly everything negative we uncovered was simply an “isolated incident[]” that ultimately wouldn’t matter anyway, because the next (public) iteration would be better. “We believe that rights to privacy and anonymity are fundamental, which is why, within the next few weeks, everyone signing up for Worldcoin will be able to do so without sharing any of their biometric data with us,” the company wrote. That nearly half a million people had already been subjected to its testing seemed of little import.
Rather, what really matters are the results: that Worldcoin will have an attractive user number to bolster its sales pitch as Web3’s preferred identity solution. And whenever the real, monetizable products—whether the orbs, the Web3 passport, the currency itself, or all of the above—launch for their intended users, everything will be ready, with no messy signs of the labor or the human body parts behind it.