
Not home alone: FTC says Ring's lax practices led to disturbing … – Federal Trade Commission News


Many consumers who use video doorbells and security cameras want to detect intruders invading the privacy of their homes. Consumers who installed Ring may be surprised to learn that according to a proposed FTC settlement, one “intruder” that was invading their privacy was Ring itself. The FTC says Ring gave its employees and hundreds of Ukraine-based third-party contractors up-close-and-personal video access into customers’ bedrooms, their kids’ bedrooms, and other highly personal spaces – including the ability to download, view, and share those videos at will. And that’s not all Ring was up to. In addition to a $5.8 million financial settlement, the proposed order in the case contains provisions at the intersection of artificial intelligence, biometric data, and personal privacy. It’s an instructive bookend to another major biometric privacy case the FTC announced today, Amazon Alexa.

Ring sells internet-connected, video-enabled cameras and doorbells used by millions of Americans to secure their homes. Ring rang that security bell throughout an extensive marketing campaign, pitching its products as “Small in size. Big on peace of mind.” But the FTC says that despite the company’s claims that it took reasonable steps to ensure that Ring cameras were a secure means for consumers to monitor private areas of their homes, the company exhibited a fast-and-loose approach to customers’ highly sensitive information.

Although Ring’s employee handbook prohibited the misuse of data, the disturbing conduct of some staffers suggests that the admonition wasn’t worth the paper it was printed on. Rather than limit access to customer videos to those who needed it to perform essential job functions – for example, to help a consumer troubleshoot a problem with their system – Ring gave “free range” access to employees and contractors. Is there any doubt where that slipshod policy and lax controls would inevitably lead? During a three-month period in 2017, a Ring employee viewed thousands of videos of female users in their bedrooms and bathrooms, including videos of Ring’s own employees. Rather than detecting what the employee was up to through its own technical controls, Ring only learned about the episode after a female employee reported it. That’s just one example of what the FTC calls Ring’s “dangerously overbroad access and lax attitude toward privacy and security.”


Although Ring changed some of its practices in 2018, the FTC says those measures didn’t solve the problem. You’ll want to read the complaint for details, but even after those modifications, the FTC cites examples of an unauthorized “tunnel” that allowed a Ukraine-based contractor to access customer videos, an incident where a Ring employee gave information about a customer’s videos to the customer’s ex-husband, and a report from a whistleblower that a former employee had provided Ring devices to numerous individuals, accessed their videos without their knowledge or consent, and then took the videos with him when he left the company.

But the threats to consumers’ personal privacy didn’t just come from inside Ring. The complaint charges that until January 2020, Ring failed to address the risks posed by two well-known forms of online attack: “brute force” (an automated process of password guessing) and “credential stuffing” (taking usernames and passwords stolen during other breaches and using them to access Ring). The FTC says Ring’s security failures ultimately resulted in more than 55,000 U.S. customers experiencing serious account compromises.
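To make those two attack classes concrete for readers who build login flows: brute forcing is blunted by throttling repeated failures on an account, and credential stuffing is blunted by rejecting passwords already exposed in other breaches. The Python sketch below is a minimal, hypothetical illustration of both controls; the thresholds, the in-memory counters, and the tiny breached-password set are assumptions made for clarity, not anything drawn from the complaint or from Ring’s systems.

```python
import hashlib
import time
from collections import defaultdict

# Hypothetical illustration only: a real service would keep this state in a
# shared datastore, hash stored passwords properly, and query a maintained
# breach corpus rather than a hard-coded set.

MAX_FAILURES = 5            # assumed lockout threshold
LOCKOUT_WINDOW = 15 * 60    # seconds

_recent_failures = defaultdict(list)   # account -> timestamps of failed logins

# SHA-1 digests of passwords known from prior breaches (tiny stand-in set)
_BREACHED_DIGESTS = {
    hashlib.sha1(b"password123").hexdigest(),
    hashlib.sha1(b"letmein").hexdigest(),
}

def is_locked_out(account: str) -> bool:
    """Brute-force control: refuse further attempts after repeated recent failures."""
    now = time.time()
    _recent_failures[account] = [t for t in _recent_failures[account]
                                 if now - t < LOCKOUT_WINDOW]
    return len(_recent_failures[account]) >= MAX_FAILURES

def record_failed_login(account: str) -> None:
    """Call this whenever a password check fails."""
    _recent_failures[account].append(time.time())

def is_breached_password(candidate: str) -> bool:
    """Credential-stuffing control: reject credentials already exposed elsewhere."""
    return hashlib.sha1(candidate.encode("utf-8")).hexdigest() in _BREACHED_DIGESTS
```

A production deployment would layer on per-IP rate limits and step-up authentication once an account shows signs of automated guessing, but even these basic checks target the two attack classes named in the complaint.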

How serious was the invasion of consumers’ privacy? In many cases, bad actors took advantage of the camera’s two-way communication functionality to harass and threaten people whose rooms were monitored by Ring cameras, including kids and older adults. Describing their experiences as terrifying and traumatizing, customers reported numerous instances of menacing behavior emanating from voices invading the sanctity of their homes via Ring:

  • An 87-year-old woman in an assisted living facility was sexually propositioned and physically threatened;
  • Several kids were the object of hackers’ racist slurs;
  • A teenager was sexually propositioned;
  • Hackers cursed at women in the privacy of their bedrooms;
  • A hacker threatened a family with physical harm if they didn’t pay a ransom in Bitcoin; and
  • A hacker told a customer they had killed the person’s mother and issued the bone-chilling warning “Tonight you die.”

One Ring employee put it this way: “Unwittingly, we aid and abet those [hackers] who breached the data by not having any mitigations in place.”

Creepy employees and sinister hackers weren’t the only ones who wrested control of personal data from consumers. According to the complaint, without getting consumers’ affirmative express consent, Ring exploited their videos to develop image recognition algorithms – putting potential profit over privacy. Hiding its conduct in a dense block of legalese, Ring simply told people it might use their content for product improvement and development and then extrapolated purported “consent” from a check mark where consumers acknowledged they agreed to Ring’s Terms of Service and Privacy Policy.


The complaint alleges Ring violated the FTC Act by misrepresenting that the company took reasonable steps to secure home security cameras from unauthorized access, unfairly allowed employees and contractors to access video recordings of intimate spaces within customers’ homes without customers’ knowledge or consent, and unfairly failed to use reasonable security measures to protect customers’ sensitive video data from unauthorized access.

In addition to the $5.8 million required payment for consumers, the proposed order includes some provisions common in FTC settlements and important new provisions that merit careful attention not only from the tech sector but from any company using consumer data (and, let’s face it, that’s just about everyone). You’ll want to read the order carefully, but here are some highlights. The order prohibits Ring from making misrepresentations about the extent to which the company or its contractors can access customers’ videos, payment information, and authentication credentials. In addition, for the period when Ring had inadequate procedures for getting consumers’ consent, the company must delete all videos used for research and development and all data – including models and algorithms – derived from those videos. 

Ring also must implement a comprehensive privacy and security program that strictly limits “human review” of customers’ videos to certain narrow circumstances – for example, to comply with the law or to investigate illegal activity – or if the company has consumers’ express informed consent. The company also must up its security game with specific requirements of multi-factor authentication, encryption, vulnerability testing, and employee training.
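The order does not dictate a particular multi-factor scheme, but time-based one-time passwords (TOTP, RFC 6238) are a common way to satisfy that kind of requirement, and the verification step fits in a few lines of standard-library Python. The snippet below is a generic sketch of TOTP verification under those assumptions, not a description of how Ring implements the order.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password from a base32 secret."""
    cleaned = secret_b32.strip().upper()
    key = base64.b32decode(cleaned + "=" * (-len(cleaned) % 8))
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify_second_factor(secret_b32: str, submitted_code: str) -> bool:
    """Constant-time comparison of the user-supplied code against the expected one."""
    return hmac.compare_digest(totp(secret_b32), submitted_code)
```

Real deployments typically also accept the adjacent time step to tolerate clock drift and rate-limit verification attempts.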

In addition to notifying customers about the lax video access practices cited in the complaint, in the future, Ring must notify the FTC of any security incident that triggers notification under other laws and any privacy incident involving unauthorized access to videos of ten or more Ring accounts.


What can other companies take from the proposed settlement in this case?

Who’s in charge of consumers’ data? Consumers.  Consumers – not companies – should be in control of who accesses their sensitive data. Furthermore, decades of FTC precedent establish that businesses can’t use hard-to-find and harder-to-understand “disclosures” and pro forma check boxes to fabricate “consent.”

Develop algorithms responsibly.  According to the FTC, Ring accessed customers’ videos to develop algorithms without their consent. If you’ve entered the AI arena, the FTC’s message is clear: Consumers’ personal information isn’t grist that companies may use at will. The FTC will hold businesses accountable for how they obtain and use the consumer data that powers their algorithms. In addition, companies that rely on human review to tag the data used to train machine learning algorithms must get consumers’ affirmative express consent and put safeguards in place to protect the privacy and security of that information.
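The operational version of that message is easy to state in code: records that lack an affirmative, purpose-specific opt-in never reach the labeling or training pipeline, and a buried Terms-of-Service checkbox does not count. The sketch below is purely illustrative; the `VideoRecord` fields and the consent flag are hypothetical names invented for the example, not a real schema from Ring or the FTC.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class VideoRecord:
    video_id: str
    owner_id: str
    # Captured through a separate, explicit opt-in for research and development,
    # not inferred from agreeing to a Terms of Service / Privacy Policy checkbox.
    opted_in_to_rnd: bool

def training_eligible(records: Iterable[VideoRecord]) -> Iterator[VideoRecord]:
    """Yield only videos whose owners expressly consented to R&D use."""
    for record in records:
        if record.opted_in_to_rnd:
            yield record
```

Under the proposed order, the obligation runs the other way too: videos collected without that consent, and any models or algorithms derived from them, have to be deleted.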

The FTC is “biometrically” opposed to the misuse of highly sensitive categories of personal information.  Biometric data – whether in the form of fingerprints, iris scans, videos, or something else – warrants the highest level of protection. If you haven’t read the FTC’s May 2023 Policy Statement on Biometric Information, make it next on your reading list.

The FTC works to keep consumers safe at home.  If there is one place people should be free from prying eyes, it’s at home. And if there’s one group that merits particular protection at home, it’s children. Now imagine the terror experienced by people – including youngsters – who were threatened in their beds by someone who gained access through a device bought for protection. The FTC’s action against Ring demonstrates the risks to privacy and security that a company’s cavalier data practices can inflict on consumers and kids.
 


