
Finally Some Clear Thinking on Child Sexual Abuse and … – Lawfare


President Biden recently told Big Tech he’s coming after them. Whether the administration will succeed is questionable—the Frightful Five have a strong lobbying presence in Washington—but one thing is clear: The climate in D.C. around the tech sector has changed. Politicians are a lot less enamored of Silicon Valley than during the lovefest of a decade ago.

The president outlined three reasons for his concerns. The first is massive privacy infringements by the tech sector. That’s been a concern since the arrival of cookies in web browsers in 1994. But with mobile devices and their ability to connect one’s physical actions with online ones, tech’s privacy invasion is now a frequent headliner. Stay tuned for whether there’s action in this congressional session on the American Data Privacy and Protection Act.

The second is the free rein provided by Section 230 of the Communications Decency Act of 1996. The law states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” and protects the Facebooks, Twitters, and other sites with user-generated content from libel suits. Congress passed that section nearly three decades ago, when the internet was in a nascent state. Big Tech has grown up since then, and the president—and many others—have grown concerned about the lack of transparency around the algorithms that choose what content and ads get displayed to us. President Biden also raised concerns about cyberstalking and child sexual exploitation, an issue I will return to shortly.

The third issue the president introduced was competition. Indeed, Silicon Valley is now replicating the monopolistic patterns of other communication technologies. This is detrimental to democracy. People are not well served when a single company such as Meta curates the news in order to keep them engaged on its platform. And monopolies are detrimental to the economy when they effectively prevent startups from getting anywhere. Federal Trade Commission Chair Lina Khan is on this issue.

Biden is right to raise the issue of online child sexual abuse and exploitation (CSAE), and he should look to a recent report by Laura Draper to address it. 

Digitization and the internet have simplified the ability to communicate and store data and, in the process, made CSAE easier to conduct and share. Such abuse, which includes the production and sharing of child sexual abuse materials (CSAM), online trafficking, and real-time videos of CSAE, is rapidly on the rise. Such investigations are difficult to conduct, as I have written about in Lawfare. Part of the problem lies in the fact that the victims are children: Sometimes those who should be protecting them are the actual exploiters.

Encryption, especially end-to-end encryption (E2EE), which allows only the sender and recipient to view the unencrypted message, exacerbates aspects of the situation. E2EE provides robust communications security and is increasingly found in widely used consumer products, including WhatsApp, Signal, and Proton Mail. Unfortunately, this robust security interferes with the prevention and investigation of CSAE cases because encryption blocks an investigator from seeing the communications—and whether a suspect is sending, receiving, or storing CSAM.
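To make concrete the property at the center of this debate (only the two endpoints hold keys capable of reading a message), here is a minimal sketch of public-key E2EE using the PyNaCl library. It is illustrative only: real messengers such as WhatsApp and Signal rely on far more elaborate protocols, including key ratcheting and forward secrecy, not this code.

```python
# Minimal sketch of end-to-end encryption using PyNaCl's public-key "Box".
# Illustrative only; this is NOT how any particular messenger is implemented.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with the recipient's public key and its own private key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"a private message")

# Anyone in the middle (the platform, or an investigator with a wiretap) sees only ciphertext.
# Only the recipient, holding the matching private key, can decrypt.
receiving_box = Box(recipient_key, sender_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"a private message"
```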

Since the late 2010s, a fight has raged over the impact of the increased availability of E2EE on CSAE investigations. The entrenched positions—law enforcement arguing against E2EE deployment, while computer security experts and civil libertarians assert that widespread availability of E2EE is critical for securing communications—have produced few solutions. Last fall, Laura Draper stepped into this fight. Instead of rehashing old arguments, Draper carefully unpacked online CSAE crime in a well-documented report. Observing that end-to-end encryption is becoming the default for communications, she asks how society should proceed with that as a given. By treating E2EE as settled, Draper moves the discussion forward instead of falling victim to old, immovable debates.

Draper’s report does two important things. First, she observes that online CSAE is not one problem, but an amalgam of four problems, which she delineates. Second, she notes that, although the internet facilitates CSAE, solutions—both investigative and preventative—need not be limited to technical ones. Because interventions and investigations are most effective when tailored to the type of CSAE activity, I will cover each separately. Before I do so, let’s start with some numbers.

In the United States, the National Center for Missing and Exploited Children (NCMEC), a Department of Justice-funded clearinghouse for information on missing and sexually abused children, collects evidence of CSAE and shares it with authorized parties. In 2021, NCMEC received 29 million reports of online sexual exploitation—10 times more than a decade earlier. NCMEC reported that video files, which include real-time sexual abuse of children, increased over 40 percent between 2020 and 2021. Almost all reports, 94 percent, involved locations outside the United States.


Sharing of CSAM predates the internet, but the ease of distribution the internet provides has vastly increased the spread of such content. How quickly CSAE is growing is nonetheless not entirely clear (some of the increase in reported numbers appears to be due to better detection capabilities; those tools do not work in E2EE environments). Some content is shared over and over. As Draper notes, Facebook, for example, found that 90 percent of the content it provided in October and November 2021 to NCMEC was “the same as or visually similar to previously reported content,” with just six videos constituting half of Facebook’s reports.
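The detection capabilities mentioned above generally work by comparing uploaded images against fingerprints of previously reported content. The platforms’ actual systems (such as Microsoft’s PhotoDNA) are proprietary, but the idea can be sketched with the open-source imagehash library. The code below is a hypothetical illustration, not any platform’s implementation, and it also shows why such scanning cannot run on content traveling inside an end-to-end encrypted channel: it needs the plaintext image.

```python
# Illustrative sketch of "known content" matching via perceptual hashing.
# Not PhotoDNA or any platform's actual system; it only shows the idea that
# near-duplicate images can be flagged against a database of known hashes.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of previously reported images.
known_hashes = {imagehash.phash(Image.open("reported_image.png"))}

def is_known_content(path: str, max_distance: int = 5) -> bool:
    """Return True if the image is visually similar to previously reported content."""
    candidate = imagehash.phash(Image.open(path))
    # Hamming distance between perceptual hashes approximates visual similarity.
    return any(candidate - known <= max_distance for known in known_hashes)

# Note: this kind of scanning requires access to the plaintext image, which is
# exactly why it cannot be applied to content inside an E2EE channel in transit.
```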

The first type of content Draper addresses is the production and distribution of CSAM, which consists of photographs or videos of the sexual abuse of a child. Such material is often redistributed (as Facebook’s statistics above illustrate). This content existed long before digitization and the internet made copying and distributing it simple. Draper points out that, surprisingly, some of that redistribution occurs not out of interest in CSAM but out of anger that such images exist. Images are also shared by people who find them humorous (yes, this is hard to believe). In 2020-2021, Facebook conducted a study of 150 users who were sharing CSAM; the company discovered that 75 percent of them fell into those two categories. Yet sharing causes harm to a child regardless of the reason behind it. A study by the Canadian Centre for Child Protection noted that 70 percent of victims “worry constantly” that they may be recognized from CSAM photos circulating online. This fear continues years after the original abuse.

E2EE severely complicates investigating CSAM. Draper observes that some methods for limiting the sharing of CSAM are implemented by the platforms themselves. WhatsApp, for example, deletes 300,000 accounts a month on suspicion of sharing CSAM. The details of how Meta makes this determination are not public.

Investigations are important, but interventions that prevent the problem from occurring are even better. Draper points to several possible directions. One is efforts to prevent digital cameras from capturing CSAM in the first place. Another is to stop the sharing of CSAM. Recall that a large percentage of CSAM sharing comes from people who, rather than being explicitly interested in CSAM, either are angered by the images or find them humorous. Draper recommends simplifying the process of reporting CSAM and warning users, with the intent of deterring them from sharing material that is illegal to possess. While warning users does not always have positive effects, here the consequences of possession are costly. Given the high percentage of people who share CSAM without having an explicit interest in the abusive material, decreasing that percentage could have a large impact, not only by reducing the spread of CSAM but also by enabling law enforcement to focus on the perpetrators of the criminal activity.

The second type of CSAE that Draper discusses is perceived first person (PFP) material. This relatively new phenomenon consists of nude imagery taken voluntarily by the children themselves (99 percent of the self-generated photos depicted girls). Draper explains that such sharing is partially the result of changing norms on the sharing of nude photos. In 2021, 34 percent of U.S. teens between ages 13 and 17 agreed that sharing nude photos of themselves was normal. Perhaps more surprisingly, Thorn, an international organization focused on preventing the sexual trafficking of children, reported that 14 percent of children aged 9–12 thought it was normal for people their age to share nude photos of each other.

Draper points out that PFP is growing rapidly. The British Internet Watch Foundation reported that of the webpages with CSAM it removed in 2020, 68,000 contained PFP images; in 2021, 182,000 contained such self-generated imagery.

Draper notes that images may be shared in various ways. The child may provide an image to a trusted person whom the child does not expect to share it further (though some do). Or the child produces an image that is then accessed by a third party. Children may produce the images voluntarily, or they may be coerced into providing them, possibly after originally providing such an image voluntarily.


At first, it looks like there is nothing to be done in PFP cases. The child creates a photo, then shares it, often over a channel protected by E2EE. Much like with CSAM, the ubiquity of E2EE communications in all steps after the photo is taken greatly complicates investigations. But there are some differences between the PFP and CSAM cases.

The child is the one who takes the photo. Often the initial connection between child and offender—or between photo and offender, if the photo is posted on a public website—is not over an E2EE connection. That gives investigators an opening.

But there is an even better way to attack the PFP problem. A key to PFP investigations is that the children themselves are not part of the criminal activity. That means the children can usefully act.

The issue is how to give children the agency to do so. Draper points out several possible tools that may encourage such actions. Sex education is one way to prevent sexual assault (online sharing of someone else’s nude photos is sexual abuse). Similar tools—including providing information about online safety, setting limits, and reporting abuse—are another. A different kind of protection is company prompts. Facebook, for example, reminds minors not to accept friend requests from people they don’t know. This is “safety by design,” an industry approach Draper discusses that builds user safety into product design (an example is WhatsApp requiring users to have a contact’s phone number in order to message them).

The third type of CSAE is internet-enabled child sex trafficking. Such activity begins with either an in-person or online connection between a trafficker and a child victim. Once the trafficker has established control over the child, the trafficker connects with offenders. The connection quickly moves to a secure communication channel, vastly complicating investigation. After the trafficker receives payment, he connects the offender with the child, who is then sexually abused.

This one is particularly hard to crack. Draper reports that the median age of victims is 14. In many cases, the abuser is a family member (almost exclusively so for victims under 10) or someone in the child’s social network. Prosecution is hard; 88 percent of victims did not want their traffickers prosecuted. So investigations are not necessarily fruitful.

Draper points to several interventions to address internet-enabled child sex trafficking, including safe community spaces for youth at risk, online safety education, safety by design, and education about online security. The interventions that Draper proposes are not high tech, but the point is that neither are the problems. Tech is simply exacerbating a deep societal problem—neglect and abuse of at-risk children—and thus Draper’s proposals squarely hit where the issues lie.

The fourth and final case of CSAE is real-time videos of child sexual abuse and exploitation. As with the previous case, the first step is for a trafficker to develop control over a child. Then the trafficker must find potential offenders. Draper notes that since traffickers don’t know who might be interested in such online abuse, they advertise themselves by “posting suggestive photos of children and using suggestive language” (another way traffickers are distinguished is that they are often from certain geographic regions).

Once a potential offender connects with a trafficker, communications between trafficker and offender typically move to encrypted channels. Draper observes that’s an important aspect for investigating this type of case: Somehow the potential offender—the person who pays someone else to sexually abuse a child and film that abuse—has to initiate contact with a trafficker who has access to an unprotected child and performs the abuse. Offenders find traffickers by the type of language and images they post; thus, so can—and do—investigators. The fact that law enforcement can do so does not mitigate the expense of such investigations; it simply shows where an investigation may have an inroad.

In discussing this fourth type of CSAE, Draper notes that this particular crime can occur, in part, because of a lack of protective family members surrounding the child. That absence limits who might report the criminal activity, limiting opportunities not only for prevention but also for investigation. The result, Draper points out, is that “[i]n end-to-end encrypted environments, undercover operations are the only detection method that can identify live online child sexual abuse conducted by traffickers … in a format that can lead to criminal investigation and prosecution.” In light of the lack of witnesses who might report the crime, it is not surprising that undercover operations are an expected route for this type of online CSAE investigation.


No one but the child’s family is aware that criminal activity is occurring. That, combined with end-to-end encryption, makes clear that only noncontent signals or other patterns indicative of this crime are likely to be useful for uncovering the activity, and it suggests where else to look for indicia of that activity. In this case, it’s “follow the money.” Draper observes that “live online child sexual abuse and the sexual exploitation of children in the context of travel and tourism … yield peculiar and specific money-transfer activities, which could be helpful evidence to identify these offenders.” Patterns of communications in online CSAE are also often unusual; they don’t resemble normal types of connections (for example, a normal video call shows data flowing both ways at roughly proportional levels, whereas online real-time CSAE demonstrates a different pattern that is also distinguishable from streaming a film). And even when a communication is encrypted, the communications metadata typically is not. Thus communications metadata may also prove useful for identifying offenders.
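As a toy illustration of how noncontent metadata can distinguish traffic patterns, the hypothetical sketch below classifies a connection by the symmetry of its byte flows. The thresholds and labels are invented for illustration and bear no relation to real investigative tooling.

```python
# Toy illustration of metadata-based traffic analysis: classify a connection by
# the symmetry of its byte flows. Thresholds and labels are hypothetical; real
# investigative tooling would be far more sophisticated and carefully validated.
def flow_profile(bytes_sent: int, bytes_received: int) -> str:
    total = bytes_sent + bytes_received
    if total == 0:
        return "no traffic"
    ratio = bytes_sent / total
    if 0.35 <= ratio <= 0.65:
        return "roughly symmetric (consistent with an ordinary two-way video call)"
    return "highly asymmetric (one-way streaming pattern)"

# Example: a two-way call sends and receives comparable volumes...
print(flow_profile(bytes_sent=480_000_000, bytes_received=510_000_000))
# ...while a viewer of a one-way live stream mostly receives.
print(flow_profile(bytes_sent=6_000_000, bytes_received=900_000_000))
```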

Draper’s proposed solutions are not high tech, but neither is the CSAE problem. CSAE is at root a societal problem of neglected and abused children, one heightened by new technological tools. So Draper takes a balanced, whole-of-society approach. She proposes that policymakers:

  • Improve the report-to-prosecution pipeline by setting industry standards for data collection and retention; creating uniform criteria for lawful data requests; streamlining law enforcement’s reporting; establishing guidelines for triaging reports; and addressing victims’ needs throughout the process ….
  • Engage relevant stakeholders by identifying who can intervene; investing in partnerships; leveraging incentive structures; and exploring alternative sources of signals suggesting malicious conduct. Addressing this problem requires a whole-of-community approach; policymakers must think broadly to assemble the right team.
  • Prioritize upstream efforts by investing in community-based opportunities and researching tech-based interventions. End-to-end encryption creates a black box around communications in which all parties want to keep private; effective responses therefore must occur earlier in the process.

This report is quite timely. Last summer, the United Kingdom proposed amendments to the Online Safety Bill. These changes would require companies either to use known technology or to develop their own to detect child sexual abuse and exploitation material. On the surface, this sounds reasonable, but the fact is that if end-to-end encryption is involved, there are no known solutions. Since failure to detect such material could result in significant fines (“£18 million or 10% of the company’s global annual turnover”), the amendments are a Trojan horse for eliminating the use of end-to-end encryption. Not to be outdone, the European Union is considering a Combating Child Sexual Abuse Online bill that would similarly require providers of hosting or communications services to “assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming.” The providers would also have to propose measures to decrease those risks.

Draper’s approach, which avoids grandstanding, is sensible and smart. Legislators should pay attention. Laws require balancing competing priorities, but the interventions that Draper presents have little downside except for cost. A rush to legislative mandates on end-to-end encryption that would have a deleterious effect on personal and national security should not occur when there are other, societally beneficial solutions to the underlying problems. Draper’s report should be read carefully by anyone considering legislation in this area—and most especially before they enact damaging legislation.





