
Iris scanner in a World ID shop in São Paulo. Photo: Leu Britto/Agência Mural, used with permission
This article by Artur Ferreira and Isabela do Carmo first appeared on Agência Mural's website on February 25, 2025. An edited version is being republished on Global Voices under a content partnership agreement. The names of people who participated in the project were changed to protect their identities.
Selling iris data? Ocular biometrics? These terms, not commonly heard until recently, have become a regular topic of discussion among friends, families, and communities in São Paulo.
This follows recent online publications inviting people to scan their irises in exchange for about 600 Brazilian reais (about 105 USD), paid in the cryptocurrency World. According to the company, from November 2024 to February 2025, at least 400,000 São Paulo residents submitted this data, which is considered among the most personal and unique forms of identifying information.
“I saw it as an opportunity to start investing in cryptocurrencies,” said Débora, 51, a resident of Vila Alpina in eastern São Paulo. Her son, Bruno, 26, also sold his data. The submission process took a few minutes.
The buyer of their iris scans was Tools for Humanity, a tech company which manages the World ID project in Brazil.
“I don’t think it’s ‘taboo’ to scan irises. Banks and authentication systems on mobile phones use face and iris recognition programs. When they were assisting me, they said they used the iris because fingerprints can be faked,” Bruno recalled.
His real interest was cryptocurrency. “I already had other digital wallets. You can leave the money there earning interest, without withdrawing,” he said.
For those who want to withdraw this money, however, it is not that simple: withdrawals in the app run by World are in installments, and can only be done once a month.
“We have an increasingly aging population, a country with socio-economic problems. [This project] is a risk for people with lower incomes, who can turn into data points for a company that [now] has [their] unique data, which you will need in the future,” said Mário Gazziro, professor at the Department of Information Engineering at UFABC (Federal University of ABC).
Data protection
The World ID store in a mall in Santo Amaro, southern São Paulo, is discreet. There is no logo, nor clear identifying signs. With the windows open, you can see the scanners, which resemble crystal balls with a touch of science fiction.
At the counter, employees were waiting for the first customers, who were invited to watch a video and download the company’s app, then told to position themselves in front of the metal globe. Seconds later, it was done: the iris was scanned and the cryptocurrency credit sent.
According to what the company told Agência Mural, the iris scans are used to train identity-verification systems for websites, including shopping platforms, banks, and financial investment services. The company said the idea is to develop the technology and sell it to third parties: the verification software trains digital systems to distinguish human irises and thereby prevent fraud.
Just four days after our reporter’s visit to one of the stores, on February 11, 2025, the ANPD (National Data Protection Authority), an agency linked to the Ministry of Justice and Public Security, issued a preventive measure against Tools for Humanity, prohibiting any financial compensation for the collection of iris data. The practice appears to be at odds with Brazil’s LGPD (General Data Protection Law).
The company tried to appeal the decision, without success. After the currency transfer, users of World’s application, which gives access to the cryptocurrency, reported difficulties in using the services, making withdrawals, or checking their balances, according to news outlet G1.
‘For the money’
The LGPD considers irises extremely sensitive personal data, as experts regard the iris as a more accurate and secure way to identify a person than fingerprints. As such, the law requires informed consent to be given before this data can be collected.
When asked, World ID said the collection of iris data aims to help differentiate real human interactions from those of artificial intelligence, “as well as increase access to the global digital economy.”
“It’s like that famous saying: when the offerings are too much, the saint gets suspicious,” observed Professor Gazziro. Why would anybody pay for data if it wasn't important?
Gisele had thought about this, but decided to go through with the procedure. She went there with her two daughters. “The truth is, most people do it for the money. I already knew about the project; I had researched it, and they made me feel comfortable while I was being assisted,” she said.
Some days after these interviews, she and other interviewees had difficulties in using the World ID app, after it was blocked by the Data Protection Authority.
Understanding risks
A journalist from Mural, Isabela do Carmo, visited one of the company’s shops in Heliópolis, one of São Paulo’s largest favelas.
“Before I could even express my concerns, I was led, without much explanation, into a small room. There, one of the employees was preparing to bring me to the machine that would take the photo of my iris, and I immediately refused,” she said. “He explained that the idea was to collect data to ‘improve the technology in the future.’”
A Heliópolis resident, approximately 60 years old, who agreed to undergo the procedure, told the journalist that she took part in the project for the payment, but had no idea what the company really did.
The use of the iris to identify system users is likely to become increasingly common in the near future, particularly as a way to make these processes more secure.
“The iris can be used, for example, to improve authentication techniques for bank passwords. This kind of increasingly specific and unique data can also be used to hone personalized advertising techniques,” said Gazziro.
For him, the most ethical thing for World ID to do would be to use the iris data only for the training of AI (artificial intelligence), and ensure that no information about the eyes of any participant is stored.
The company said they “do not store any personal data, including iris data, and the selected World ID holders are anonymous.” The technology, they say, allows verifying “humanity, not identity,” while “preserving the privacy and security” of participants.
In other countries that have legislation similar to the Brazilian LGPD, such as Spain, Portugal, South Korea, and Argentina, Tools for Humanity and World ID faced similar difficulties in their operations.
The company’s response
After the report’s publication, World ID’s media team contacted Agência Mural to reiterate that the project neither sells nor stores personal data.
They said the scanner (called an “Orb”) generates a binary code for each recorded iris and creates a verified profile. The images are encrypted and sent to the app on the users’ cell phones, then erased from the Orb.
The company also guarantees that “biometric images never leave the Orb; that is, they are not sent to any cloud or database,” and it uses advanced privacy technology, AMPC (Anonymized Multi-party Computation).
The company also stated that the “anonymized data is stored in databases operated by trusted third parties, including universities in the US, Zurich, and Germany.”
Regarding the payments, the company explained that part of the payment is made just after the scan, while another part is paid monthly over the course of a year.
“World ID has never equated the value of the cryptocurrency tokens with the Brazilian real or any other currency,” the company said. “Values vary according to market conditions, never by internal company criteria.”
The company also claimed that those who could not access data and credits had not used the app correctly. It reiterated that it was in compliance with Brazilian legislation.