Joe Biden says risks posed by AI to security, economy need addressing


The risks of artificial intelligence to national security and the economy need to be addressed, U.S. President Joe Biden said on Tuesday, adding he would seek expert advice. “My administration is committed to safeguarding Americans’ rights and safety while protecting privacy, to addressing bias and misinformation, to making sure AI systems are safe before they are released,” Biden said at an event in San Francisco.

Biden met with a group of civil society leaders and advocates, who have previously criticized the influence of major tech companies, to discuss artificial intelligence.

“I wanted to hear directly from the experts,” he said.

Several governments are considering how to mitigate the dangers of the emerging technology, which has experienced a boom in investment and consumer popularity in recent months after the release of OpenAI’s ChatGPT.

Biden’s meeting on Tuesday included Tristan Harris, executive director of the Center for Humane Technology, Algorithmic Justice League founder Joy Buolamwini and Stanford University Professor Rob Reich.

Regulators globally have been scrambling to draw up rules governing the use of generative AI, which can create text and images, and whose impact has been compared to that of the internet.

Biden has also discussed AI with other world leaders recently, including British Prime Minister Rishi Sunak, whose government will hold the first global summit on artificial intelligence safety later this year. Biden is expected to raise the topic with Indian Prime Minister Narendra Modi during Modi's ongoing visit to the United States.

European Union lawmakers agreed last week to changes in draft rules on artificial intelligence proposed by the European Commission, in a bid to set a global standard for a technology used in everything from automated factories to self-driving cars to chatbots.
