Facebook-parent Meta breaks up its Responsible AI team


Mark Zuckerberg, CEO of Meta, attends a U.S. Senate bipartisan Artificial Intelligence Insight Forum at the U.S. Capitol in Washington, D.C., Sept. 13, 2023.

Stefani Reynolds | AFP | Getty Images

Meta has disbanded its Responsible AI division, the team dedicated to ensuring the safety of its artificial intelligence ventures as they are developed and deployed, according to a Meta spokesperson.

Most members of the RAI team have been reassigned to the company’s Generative AI product division, while others will now work on the AI Infrastructure team, the spokesperson said. The news was first reported by The Information.

The Generative AI team, formed in February, focuses on developing products that generate language and images designed to mimic their human-made equivalents. Its creation came as companies across the tech industry poured money into machine learning development so as not to be left behind in the AI race. Meta is among the Big Tech companies that have been playing catch-up since the AI boom took hold.

The RAI restructuring comes as the Facebook parent nears the end of its “year of efficiency,” as CEO Mark Zuckerberg called it during a February earnings call. So far, that has played out as a flurry of layoffs, team mergers and redistributions across the company.

Ensuring the safety of AI has become a stated priority of top players in the space, especially as regulators and other officials pay closer attention to the nascent technology’s potential harms. In July, Anthropic, Google, Microsoft and OpenAI formed an industry group focused specifically on setting safety standards as AI advances.

Though RAI employees have now been dispersed throughout the organization, the spokesperson noted that they will continue to support “responsible AI development and use.”

“We continue to prioritize and invest in safe and responsible AI development,” the spokesperson said.
