The Federal Trade Commission today issued a warning that the increasing use of consumers’ biometric information and related technologies, including those powered by machine learning, raises significant consumer privacy and data security concerns and the potential for bias and discrimination.
Biometric information refers to data that depict or describe physical, biological, or behavioral traits, characteristics, or measurements of or relating to an identified or identifiable person’s body.
“In recent years, biometric surveillance has grown more sophisticated and pervasive, posing new threats to privacy and civil rights,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection. “Today’s policy statement makes clear that companies must comply with the law regardless of the technology they are using.”
In a policy statement, the Commission said the agency is committed to combatting unfair or deceptive acts and practices related to the collection and use of consumers’ biometric information and the marketing and use of biometric information technologies.
Recent years have seen a proliferation of biometric information technologies. For instance, facial, iris, or fingerprint recognition technologies collect and process biometric information to identify individuals. Other biometric information technologies use or claim to use biometric information in order to determine characteristics of individuals, ranging from the individuals’ age, gender, or race to the individuals’ personality traits, aptitudes, or demeanor.
Consumers face new and increasing risks associated with the collection and use of biometric information. For example, using biometric information technologies to identify consumers in certain locations could reveal sensitive personal information about them, such as whether they accessed particular types of healthcare, attended religious services, or attended political or union meetings. Large databases of biometric information could also be attractive targets for malicious actors who could misuse such information. Additionally, some technologies using biometric information, such as facial recognition technology, may have higher rates of error for certain populations than for others.
In recent years, the FTC has brought enforcement actions against photo app maker Everalbum and Facebook, charging they misrepresented their uses of facial recognition technology. The FTC also issued a report about facial recognition in 2012 that recommended best practices to protect consumers’ privacy.
Today’s policy statement warns that false or unsubstantiated claims about the accuracy or efficacy of biometric information technologies, or about the collection and use of biometric information, may violate the FTC Act. The policy statement also lists several factors the Commission will consider in determining whether a business’s use of biometric information or biometric information technology could be unfair in violation of the FTC Act, including:
- Failing to assess foreseeable harms to consumers before collecting biometric information;
- Failing to promptly address known or foreseeable risks and identify and implement tools for reducing or eliminating those risks;
- Engaging in surreptitious and unexpected collection or use of biometric information;
- Failing to evaluate the practices and capabilities of third parties, including affiliates, vendors, and end users, who will be given access to consumers’ biometric information or will be charged with operating biometric information technologies;
- Failing to provide appropriate training for employees and contractors whose job duties involve interacting with biometric information or technologies that use such information; and
- Failing to conduct ongoing monitoring of technologies that the business develops, offers for sale, or uses in connection with biometric information to ensure that the technologies are functioning as anticipated and are not likely to harm consumers.
The Commission voted 3-0 during an open Commission meeting to adopt the policy statement.
FTC staff who worked on this matter include Robin Wetherill and Amanda Koulousias.