Idec (Brazilian Institute for Consumer Defense) and InternetLab on Tuesday (25) released a best-practice guide for companies using facial recognition, a technology increasingly deployed in both the public and private sectors.
The recommendations include the need for people to consent to having their faces captured by cameras, greater transparency in how the data is processed, additional protections for children and adolescents, and guidance on suitable locations for installing cameras.
Facial recognition has no specific regulation in Brazil, but the General Data Protection Law (LGPD), which came into force in September, also covers the issue and requires companies to be more transparent with citizens about their practices.
Facial data is personal and biometric, and is therefore classified as sensitive under the law. Under the principles of the LGPD, the purpose of capturing a facial image must be clear, and the process must be transparent, secure, and non-discriminatory.
“Last year we sent several letters to companies with questions about security measures. Using the technology is not legally impossible, but safeguards are necessary to ensure that rights are preserved,” says Bárbara Simão, researcher at Idec and one of the authors of the study.
Brazil has recently seen cases in which the use of this technology was considered inappropriate or was contested. One of the most emblematic legal episodes involves ViaQuatro, the concessionaire of Line 4-Yellow of the São Paulo metro, which installed cameras to detect emotions in the capital’s subway.
The devices were placed next to billboards. The aim was to register how people reacted when viewing a particular advertisement.
At the time, Idec filed a public civil action alleging that the practice was not transparent and operated without any possibility of citizen consent. An injunction ordered the cameras to be switched off.
In a note, ViaQuatro says that the installed system “has no function or device that enables facial recognition and no technology that enables the identification or storage of personal data or images”.
The company has maintained that there is a difference between its detection system and a facial recognition system. The case is still pending.
Since then, consumer protection bodies have issued a number of notifications to companies. Hering was fined R$ 60,000 this year by Senacon (National Consumer Secretariat), linked to the Ministry of Justice, for “abusive practice” in its use of the technology in a store at Shopping Morumbi, São Paulo.
Without clearly informing customers, the store adopted a camera system that collected data on the gender, age group, and mood of those who visited the space.
Quod, a credit bureau created by Brazil’s main banks; Itaú; and 99, a ride-hailing app, were also questioned. Idec feared that images obtained through facial recognition could be used in credit risk assessments.
All responded to the organization that use of the technology was not mandatory and that the system was intended only to prevent fraud.
In a note, 99 stressed that the technology is only used to promote the security of the platform in order to “prevent different people from posing as drivers registered in the application”.
Zaitt, an “autonomous market” operating in São Paulo without attendants, also received a letter from the organization. It responded that the technology was used only to verify people’s entry, that it was not mandatory, and that entry could also be authorized via QR code.
However, adoption of the technology is more widespread, and more sensitive, in the public sector, especially among security forces. A recent survey by the Igarapé Institute identified the use of facial recognition by authorities and private partners in at least 48 cases since 2011, across 37 cities.
The issue gained prominence in the public debate from 2018, when hearings were held in Congress and the Public Prosecutor’s Office to discuss possible regulation.
“The need for transparency already existed, but now [with the data law] there is a greater obligation. When data is collected and processed, the process must be communicated to the data subjects. This is the first guideline we are proposing,” says Nathalie Fragoso, coordinator at InternetLab and one of the authors of the study.
Consent can be given through an employee who approaches the company’s customers, or via a QR code system through which the consumer authorizes the capture of their image on a mobile phone.
The main risks in using the technology are abuse of rights and surveillance (when the data is eventually shared with police and government agencies, creating a surveillance system), discrimination and bias, invasion of privacy, faulty emotion recognition, and security incidents.
On discrimination, for example, recent studies show that these tools have a higher error rate for black women than for other groups. This is partly due to the underrepresentation of black faces in the databases.
According to the document, which draws on IBGE data, 38.5% of white Brazilians do not use the internet, compared with 60.5% of the black population.
“Among other things, this leads to less data on this population (for example, a lower number of photos in social networks that can be used to train facial recognition algorithms),” the study says.
The organizations strongly advise the private sector not to use the technology, directly or indirectly, to deny goods or services, vary prices, or impose adverse conditions, in order to avoid racial, gender, and ethnic discrimination.
At least three proposals to regulate the issue are being debated in Congress today, in addition to a number of government proposals in legislative assemblies.
Good practices for the private sector
List prepared by Idec and InternetLab
Proportionality analysis and compliance with principles
Before using a facial recognition system, the company should assess whether it is the only way to achieve its goal. It must also be analyzed whether the purposes of collection comply with the legislation.

Transparency for data subjects
Complete information should be provided on: the use of recording equipment, what data is collected, how it is processed, and for what purposes. It is also important to inform people about the retention period, storage conditions, security measures, and any possible sharing with third parties.

Public transparency
The practices adopted in implementing and operating the technology must be documented in reports on the impact on personal data protection.

Consent
It must be given by the data subject before facial data collection begins.

Camera locations
Cameras should be installed in places where prior consent can be obtained.

Anti-discrimination measures
The technology should not be used to deny goods and services, vary prices, or impose adverse conditions, since facial recognition has shortcomings in recognizing different profiles of people.

Deletion, anonymization, and protection of biometric data
Images must be deleted after a set period, and the data must be anonymized so that individuals cannot be easily identified.

Children and adolescents
Facial recognition may only take place with the consent of those responsible for them.

Security incidents
All incidents must be reported to the affected individuals and to the authorities.