UK should play leading role on global AI guidelines, Sunak to tell Biden
Rishi Sunak will tell Joe Biden next week the UK should become a global hub for developing international regulation of artificial intelligence, as the prime minister rapidly shifts his position on the emerging technology.

Sunak will travel to Washington DC on 7 and 8 June for meetings with the US president, as well as members of Congress and business leaders. Officials have told the Guardian that while there, Sunak intends to raise the issue of AI regulation, and specifically call for Britain to play a leading role in coordinating the formulation of global guidelines for its use.

The British government issued a white paper on AI this year, which spoke mainly of the benefits of AI rather than the risks it poses. But ministers are changing that position quickly, as experts warn the technology could present an existential threat to humankind.

Last week, Sunak met four top technology executives to discuss how to regulate the industry. This week he indicated he was paying attention to the recent warning by 350 global AI experts that it should be taken as seriously as the threat posed by pandemics or nuclear war.

“AI clearly can bring massive benefits to the economy and society,” he said. “But we need to make sure this is done in a way that is safe and secure. That’s why I met last week with the CEOs of major AI companies to discuss what are the guardrails that we need to put in place, what’s the type of regulation that should be put in place to keep us safe.”

Referring to this week’s expert warning, he added: “People will be concerned by the reports that AI poses an existential risk like pandemics or nuclear wars – I want them to be reassured that the government is looking very carefully at this.”

But he also signalled he wanted the UK to play a significant role in creating a set of global guardrails that would govern how countries around the world develop the technology.

Sunak said: “I think the UK can play a leadership role, because ultimately, we’re only going to grapple with this problem and solve it if we work together not just with the companies, but with countries around the world. It’s something that I’ve already been discussing with other leaders at the G7 summit the other week, [and] I’ll be doing that again when I visit the US very soon.”

Sam Altman, the chief executive of OpenAI, which created ChatGPT, has called for world leaders to establish an equivalent to the International Atomic Energy Agency. Darren Jones, the Labour MP who chairs the business select committee, has urged Sunak to promote the UK as a potential host for such an organisation.

British government sources told the Guardian that creating a new international organisation was not a realistic option, but they did want to play a role in helping coordinate the disparate regulatory efforts by European, Asian and American countries.

UK officials believe their principles-based approach is more likely to find international favour than the EU stance of choosing to ban certain individual AI products, such as facial recognition software.

Experts say AI creates two broad categories of risk. The first is the short- to medium-term danger that the technology could be misused, whether to create disinformation that is indistinguishable from reality, or to make hiring and firing decisions that end up being discriminatory.

The second is the much longer-term prospect that AI could become sentient and start pursuing goals for which it has not been programmed.

Some in the industry are arguing for guardrails such as forcing developers to share information about the datasets used to train their AI programmes, or banning them from selling their products to certain customers.
