Coming regulations mean game developers must be proactive on safety and trust


Presented by Modulate


New safety and privacy regulations are set to impact game developers across the globe. As part of VentureBeat’s most recent Spotlight webinar on fighting community toxicity, we brought together industry experts from Xbox, Modulate and law firm Osborne Clarke to discuss how to stay ahead of the new regulatory landscape while keeping player safety at the center.

Watch free on demand now.


Over the last year or two, regulators have been paying far more attention to trust and safety, not only in terms of child safety and violent radicalization, but also the toxicity that can flourish on insufficiently monitored platforms and its impact on mental health.

More than a dozen new and pending internet trust and safety regulations, from the United States and the EU to Australia, Ireland, the U.K. and Singapore, are set to impact the gaming industry, targeting hate speech, harassment, misinformation and dangers to children. Australia’s eSafety Commissioner has been working to produce guidance for a variety of industries on engaging more effectively with trust and safety, with a particular emphasis on gaming and social platforms, said Mike Pappas, CEO and co-founder of Modulate.ai.

Gaming was also one of the few spaces where the industry’s own draft code, produced for the eSafety Commissioner’s review, was rejected, because the commissioner felt it was insufficiently protective in a variety of ways, “especially around looking for platforms to be more proactive in anticipating the kinds of harms that can happen online, and doing something more aggressive to mitigate those harms,” he added. “That general theme of being proactive and thoughtful up front, and safety by design, is the biggest theme of the regulation today.”


In the EU, the recently enacted Digital Services Act (DSA) is now in force, and its impact is wider-reaching than most developers expect.

“It introduces completely new rules on liability, dealing with disinformation, illegal goods and content, cyber violence, dark patterns and targeted advertisements, and at the same time it introduces potential fines of up to 6% of a company’s global turnover,” said Konni Ewald, partner and head of tech, media and comms at law firm Osborne Clarke Germany. “And very importantly, it’s relevant for all companies doing business within the EU, even if they do not have a physical presence there.”

Putting players at the center

All of these new laws must be the impetus for studios to be far more proactive in the ways they think about risk preparedness and more aggressive in the way they mitigate risk on their platforms, Pappas added. Safety by design, in other words.

“It’s not just about checking that regulatory box,” added Kim Kunes, general manager, Xbox trust and safety at Microsoft. “It’s about how we do this in a way that’s good for our players. How do we do it in a way that helps to drive consistency across platforms and experiences for players?”

Developers need to ensure users know what the platform is doing with their data and how content moderation functions, and to clearly communicate new policies, whether they’re regulatory requirements or elevated community standards and expectations. It also means taking in feedback from players as the landscape continues to evolve.


Preparing for the upcoming changes

As regulatory change becomes more ubiquitous across markets, many companies, especially smaller ones, are finding it a challenge to fully understand the landscape and to meet the requirements in a way that stays true to company goals while still providing the best consumer experience on their platform or service. Many of these new rules and regulations are incredibly complex, making it hard to nail down what they mean and what needs to be done.

And once obligations are clearly outlined, new policies and reporting processes must be put in place alongside terms and conditions, with privacy and safety made a top priority. But it’s not only that, Ewald added. For instance, the DSA targets illegal content, which often arises when users are allowed to generate their own content or are given spaces to communicate. A company then faces the challenge of identifying what illegal content actually is, because the DSA doesn’t provide a definition.

“If we’re honest, game companies in the past have very much relied on reports by other users,” Ewald said. “But under the new rules, this may not be enough anymore. I need to think about new ways of identifying illegal content.”
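To make that shift concrete, here is a minimal, hypothetical sketch of what a proactive, pre-report scan could look like. It is not from the webinar or any speaker: the pattern list and names are invented for illustration, and a production system would pair trained classifiers and legal review with human moderators rather than rely on a static blocklist.

    # Hypothetical sketch: scan chat messages proactively, before (or
    # without) a user report, and surface matches for human review.
    # All identifiers and patterns are invented for illustration.
    import re
    from dataclasses import dataclass, field

    # A static pattern list stands in for what would, in practice, be a
    # trained toxicity/illegal-content classifier plus per-jurisdiction rules.
    BLOCKLIST = [re.compile(p, re.IGNORECASE)
                 for p in (r"\bdox\b", r"\bscam link\b")]

    @dataclass
    class ScanResult:
        message: str
        flagged: bool
        reasons: list = field(default_factory=list)

    def proactive_scan(message: str) -> ScanResult:
        """Return a structured result so a human moderator makes the final call."""
        reasons = [p.pattern for p in BLOCKLIST if p.search(message)]
        return ScanResult(message, flagged=bool(reasons), reasons=reasons)

    if __name__ == "__main__":
        for msg in ("gg, well played", "click this scam link for free skins"):
            print(proactive_scan(msg))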

Looking toward the future

In conversations with regulators, the thing Pappas is most consistently hearing is that companies shouldn’t overthink this.

“The thing we’re asking for is for the platforms to show that they themselves are taking some responsibility for the safety of their users and trying,” he said. “The more the platform can show that they’re genuinely, authentically trying, the more lenience I think there will be from the regulatory side.” He can’t promise that, he added, but the trust and safety space is not yet mature enough for regulators to operate at the level of nitpicking, fine-grained detail.


“For now, the mission is to try our best and start more proactively and more responsibly doing what we can on the platform side to protect the users that come into these spaces we build,” he said. “My guidance for anyone, large or small, is to find partners that you can trust, that you can engage with in a deeper way to talk through what those strategies should look like. The more that we as an ecosystem are working together with each other through those kinds of partnerships and relationships, that naturally brings us all into alignment as a full industry.”

To learn more about the ways developers can protect their game communities and meet regulatory standards while continuing to center their players, don’t miss this VB Spotlight.

Watch free on demand now!

Agenda

  • Staying on top of the evolving regulatory landscape
  • Developing company policies and culture around trust and safety
  • Why AI isn’t enough to keep players safe
  • Beyond user reports: implementing proactive monitoring
  • And more

Speakers

  • Mike Pappas, CEO and co-founder, Modulate.ai
  • Kim Kunes, general manager, Xbox trust and safety, Microsoft
  • Konni Ewald, partner and head of tech, media and comms, Osborne Clarke Germany
  • Rachel Kaser, Gaming & Technology Writer, GamesBeat (moderator)



