
Surgeon General's Warning on Social Media and Kids Bolsters … – U.S. News & World Report


The country’s top public health official issued a rare warning to Americans this week, calling the use of social media by children and adolescents an “urgent public health issue” and urging policymakers, parents, technology companies and schools to do something about it.

The warning – issued only during extraordinary circumstances – is outlined by Surgeon General Vivek Murthy in a 19-page report that documents how “extreme, inappropriate, and harmful content” is easily accessible to children and adolescents, how social media perpetuates body dissatisfaction, disordered eating behaviors, social comparison and low self-esteem, and how roughly two-thirds of adolescents are regularly exposed to hate-based content.

Murthy’s advisory is quick to note that the current body of evidence about the impact of social media indicates that it may have benefits for some children and adolescents – building community and safe spaces, for example – but that those pale in comparison to the “ample indicators that social media can also have a profound risk of harm” to their mental health and well-being.

“At this time, we do not yet have enough evidence to determine if social media is sufficiently safe for children and adolescents,” he wrote.

Such advisories are reserved for “significant public health challenges that require the nation’s immediate awareness and action,” according to an explanation of how the public should receive the information. For an increasing number of public schools across the country whose leaders have been sounding the alarm for years, the warning bolsters a wave of new litigation seeking to hold those social media companies accountable.


Seattle Public Schools leaders filed a complaint in January against the companies operating TikTok, Instagram, Facebook, Snapchat and YouTube, alleging that school districts like theirs are “at the front lines of the youth mental health crisis” and that social media sites are putting children under incredible strain as a result of increased screen time, unfiltered content and the potentially addictive properties of social media.

Seattle was the first large urban school district to take on the social media giants, but it was far from the last. Over the last four months, school leaders filed similar lawsuits in Pittsburgh and Bucks County, Pennsylvania; in San Mateo, California; in Chatham, New Jersey; in Bay County, Florida, which includes Panama City; and in Mesa and Scottsdale, Arizona.

At least 11 school districts in Kentucky, including Jefferson County, the state’s largest school system, and Fayette County, approved a resolution to file a lawsuit against the companies, alleging that they “designed their platforms to maximize the time youth spend using them and addict youth to their platforms” – a strategy that they say has been “harmful to the mental, behavioral, and emotional health of youth and is associated with increased rates of depression, anxiety, low self-esteem, eating disorders, and suicide.”

Three of Alabama’s largest districts did the same, alleging “an egregious breach of the public trust.”

It’s unclear exactly how many school districts have sued the social media giants, but the number counted by U.S. News is north of two dozen.

The mounting lawsuits come at a precarious moment for K-12 students and the schools that serve them – in the wake of an isolating pandemic that shuttered schools in some parts of the country for the better part of two years and decimated their support staff. An analysis by Chalkbeat shows that among 18 of the country’s largest school districts, 12 started this school year with fewer counselors or psychologists than they had in fall 2019 – nearly 1,000 unfilled mental health positions.


The lack of resources has been particularly distressing for teen girls. New data from the CDC shows that nearly 3 in 5 U.S. teen girls (57%) felt persistently sad or hopeless in 2021 – double the rate among boys, a nearly 60% increase and the highest level reported over the past decade.

The situation is equally problematic for LGBTQ+ children – and especially transgender children – as Republican-controlled states pass laws limiting their access to books that center LGBTQ+ issues, strip their access to gender-affirming care and their ability to play on sports teams, bar education about sex and gender and block educators from being sources of support.

Roughly 45% of LGBTQ+ youth seriously considered attempting suicide in the past year, and 1 in 5 transgender and nonbinary youth attempted suicide, according to the Trevor Project.

Murthy’s warning this week is hardly his first on the subject. In fact, he’s been sounding the alarm on the teen mental health crisis for more than a year now, urging the public not to become “numb to these numbers.”

“These are not normal numbers, this should not be happening in our society,” he said alongside Education Secretary Miguel Cardona in February while visiting a school in Virginia’s Fairfax County.

Murthy, Cardona and Health and Human Services Secretary Xavier Becerra have implored states and school districts to use their coronavirus relief aid to help bolster mental health support in K-12 schools.

Some states are heeding those calls: In January, New Jersey Gov. Phil Murphy, a Democrat, announced a $14 million mental health grant program to support the highest-need K-12 schools. A month later, North Carolina Gov. Roy Cooper, a Democrat, announced that the state would pour nearly $8 million into suicide prevention training for universities and community colleges and create a mental health hotline for students. And in Rhode Island, Democratic Gov. Daniel McKee unveiled a $7 million program to train school employees to detect mental illness and suicide risk.

And states like Arizona, California and South Carolina are among a growing number raising Medicaid reimbursement rates to push behavioral health providers to offer services in schools.

President Joe Biden, for his part, blasted social media companies for contributing to the teen mental health crisis in his most recent State of the Union address and called on Congress to pass legislation that limits how tech companies collect data from kids and bars them from advertising to minors.

“When millions of young people are struggling with bullying, violence, trauma, we owe them greater access to mental health care at their schools,” he said. “We must finally hold social media companies accountable for the experimenting they’re running on children for profit.”

But Congress has yet to act in any meaningful way to regulate the social media industry’s impact on the mental health of adolescents.

Sen. Richard Blumenthal, a Connecticut Democrat, and Sen. Marsha Blackburn, a Tennessee Republican, introduced the bipartisan Kids Online Safety Act last year and made a last-ditch effort at the end of the year to include it in the omnibus spending package – to no avail. They’re making another push now, having held a hearing on the legislation in March before the Senate Judiciary Committee, where the duo made a rare bipartisan pledge to “act swiftly” on the matter.

“Our kids are literally dying from things they access online, from fentanyl to sex trafficking to suicide kits,” Blackburn said during the hearing. “It’s not too late to save the children and teens who are suffering right now because Big Tech refuses to protect them.”

The measure aims to shield children from harmful content and would require social media companies to establish parental controls for anyone under the age of 16. It would also require social media companies to create a way to protect children from addiction, stalking, exploitation and other “dangerous material.”

Meanwhile, in the House, TikTok CEO Shou Zi Chew took a bipartisan bruising last month in Washington when he appeared before the Energy and Commerce Committee, whose members blasted the app’s parent company, ByteDance, as a national security threat, charging that the Chinese government can use the app to gather sensitive data and personal information.

But many in the hearing room were more concerned with the havoc the app is already wreaking domestically, including the parents of Chase Nasca, who was 16 when he jumped in front of a Long Island Rail Road train last year.

Dean and Michelle Nasca sued TikTok’s parent company, ByteDance, last month, alleging that the app directed more than 1,000 videos promoting suicide, hopelessness and self-harm to their son, even though he never searched for those terms. Like the school districts, the Nascas are part of a growing group of parents suing social media giants over the deaths of their children.

“While the United States government has primarily been focused on protecting our national security, they need to focus more on protecting our nation’s children,” says Matthew Bergman, attorney for the Nascas and founding director of the Social Media Victims Law Center. “We are seeking to hold TikTok accountable for engaging in dangerous and harmful practices that put our children at risk of self-harm all in the name of engagement to increase their ad revenues.”

As it stands, dozens of states have already banned or restricted the use of TikTok on government devices. Montana became the first to ban TikTok to the public earlier this month in an effort to prevent the Chinese government from gaining access to personal information – though days later, the company sued the state in response.

Utah stands out as the only state that has successfully legislated teen social media use, passing a law earlier this year that requires social media firms to get parents’ consent before children can use their apps. Among other things, the law gives parents full access to their children’s online accounts, including posts and private messages, imposes a curfew that blocks children’s access between 10:30 p.m. and 6:30 a.m. and bars social media companies from collecting data on children or targeting them for advertising.

A handful of other states – including Arkansas, California, Louisiana, New Jersey, Ohio and Texas – are considering similar regulations. In the Golden State, home to many of the social media giants and other big tech firms, a bill introduced in March would make companies liable for using designs, algorithms or features that they know could lead minors to purchase fentanyl, become addicted to their platforms or develop eating disorders, suicidal behavior and other forms of self-harm.

“To maximize user engagement and increase profits, TikTok creates and co-creates harmful content and deliberately targets children in the United States with violent, dangerous, extreme and psychologically disturbing content from which they cannot look away,” Bergman says. “In China’s version of TikTok, minors 14 and under are limited to 45 minutes per day online and are directed to science experiments, museum exhibits, patriotic and educational videos.”

Murthy’s advisory tells a very different story of social media use in the U.S., where up to 95% of adolescents aged 13 to 17 report using a social media platform, with more than a third saying they use social media “almost constantly.” And while 13 is the commonly required minimum age to log into social media platforms in the U.S., nearly 40% of children ages 8 to 12 use social media.

Research cited in the advisory shows that adolescents aged 12 to 15 who spent more than three hours per day on social media faced double the risk of experiencing poor mental health outcomes, including symptoms of depression and anxiety. As of 2021, eighth and 10th graders spent an average of 3 1/2 hours per day on social media.

Among other things, Murthy urged policymakers to strengthen protections for children interacting with social media platforms, develop age-appropriate health and safety standards for technology platforms, find ways to protect children and adolescents from accessing harmful content, limit the use of features that attempt to maximize engagement, regularly assess risks to children and adolescents and require a higher standard of data privacy to protect them from exploitation and abuse.

“Our children have become unknowing participants in a decades-long experiment,” Murthy said in a statement. “And while there is more we have to learn about the full impact, we know enough now to take action and protect our kids.”
