World, the iris-scanning project led by OpenAI CEO Sam Altman, is at the center of a global debate after Germany ordered the deletion of iris data collected in the European Union. The decision by BayLDA, Bavaria's data protection authority, shows how the collection of biometric data is moving from the realm of science fiction to directly affecting people's lives.
According to the European Data Protection Board (EDPB), 70% of Europeans consider the collection of biometric data invasive, highlighting concerns about how this information is used and stored.
Alan Nicolas, an artificial intelligence specialist for businesses and founder of the Lendár[AI] Academy, warns that the impact of this decision goes far beyond European borders. "The use of biometric data is no longer a future issue. People need to understand that by providing this data, they are putting their privacy and personal security at stake. Regulation must be clearer to protect individuals," he states.
What’s at stake with iris scanning
BayLDA's decision forced World to delete previously collected iris data, finding that there was no sufficient legal basis for the collection. Although the company says the iris codes had already been voluntarily deleted, the order requires new procedures to ensure compliance with European rules.
Damien Kieran, Chief Privacy Officer of Tools for Humanity, emphasized the need for a more precise definition of anonymization in the European Union. He maintains that iris images are not stored, but critics of the practice raise doubts about how the resulting iris codes are tracked and used.
Why this matters to everyone
In Brazil, World activated 20 collection points in São Paulo, where it has already scanned the irises of more than 189,000 people. Although the company promises anonymity, experts point out that biometric data is highly sensitive and can be exploited for unauthorized purposes. "The debate is essential, because we are dealing with information that can be used for control or surveillance, something that affects everyone, whether in Europe or Brazil," says Nicolas.
The project has also faced legal barriers in other countries, such as Spain and Kenya. In Spain, collection was halted after the country's Data Protection Agency found the practices violated privacy regulations.
From fiction to reality
Alan Nicolas explains that only a few years ago, using biometric data to create digital identities was the stuff of science fiction films. Today it is a reality that influences everything from website authentication to combating fake profiles and deepfakes. "It's no longer fiction. The question now is how to ensure that these technologies benefit people without compromising their privacy. As always, technology is not the villain; what needs care is how people use it," he emphasizes.
The German decision shows that regulation needs to keep pace with advances in artificial intelligence and biometric technologies. "The biggest challenge is to educate people about the risks and ensure that governments and companies work together to establish clear rules. Unfortunately, no legislation in the world is able to keep up with the advancements and ethical issues raised by these new possibilities. We must rely on everyone's technological education so that people are aware of the potential and the dangers of each tool," Nicolas concluded.