World, the digital-identity project led by Sam Altman, CEO of OpenAI, is at the center of a global debate after Germany ordered the deletion of iris data collected in the European Union. The decision by BayLDA, Bavaria’s data protection authority, is an example of how biometric data collection is moving from the realm of science fiction to directly impacting people’s lives.
According to the European Data Protection Board (EDPB), 70% of Europeans consider biometric data collection invasive, highlighting concerns about how this information is used and stored.
Alan Nicolas, an artificial intelligence expert for businesses and founder of Academia Lendár[IA], warns that the impact of this decision goes far beyond European borders. ‘The use of biometric data is no longer a question of the future. People need to understand that by providing this data, they are putting their privacy and personal security at stake. Regulation must be clearer to protect individuals,’ he says.
What’s at stake with iris scanning
The BayLDA decision forced World to delete previously collected iris data, finding that there was insufficient legal basis for the collection. Although the company states that the iris codes had already been voluntarily deleted, the order requires new procedures to ensure compliance with European rules.
Damien Kieran, Chief Privacy Officer at Tools for Humanity, emphasized the need for a more precise definition of anonymization in the European Union. He asserts that iris images are not stored, but critics of the practice raise questions about how the resulting codes can be tracked and used.
Why this matters to everyone
In Brazil, World has activated 20 collection points in São Paulo, where it has already scanned the irises of over 189,000 people. Although the company promises anonymity, experts point out that biometric data is highly sensitive and could be exploited for unauthorized purposes. ‘The debate is essential because we are dealing with information that could be used for control or surveillance, something that affects everyone, whether in Europe or Brazil,’ comments Nicolas.
In other countries, such as Spain and Kenya, the project has also faced legal barriers. In the Spanish case, collection was halted after the country’s Data Protection Agency found that the practices violated privacy standards.
From fiction to reality
Alan Nicolas explains that just a few years ago, the use of biometric data to create digital identities was a theme in science fiction movies. Today, it is a reality influencing everything from website authentication to combating fake profiles and deepfakes. ‘This is no longer fiction. The question now is how to ensure these technologies benefit people without compromising their privacy. As always, technology is not the villain. What needs care is how people use it,’ he emphasizes.
The German decision shows that regulation needs to keep pace with advances in artificial intelligence and biometric technologies. ‘The biggest challenge is educating people about the risks and ensuring that governments and companies work together to create clear rules. Unfortunately, no legislation anywhere in the world can keep up with the advances and ethical questions raised by these new possibilities. We are left to rely on broad technological education, so that everyone is aware of the potential and the dangers of each tool,’ concludes Nicolas.