
Warner Calls on AI Companies to Ramp Up Safety & Security … – Senator Mark Warner


WASHINGTON – U.S. Sen. Mark R. Warner (D-VA), Chairman of the Senate Select Committee on Intelligence, today urged several artificial intelligence (AI) companies to take additional action to promote safety and prevent malicious misuse of their products. In a series of letters, Sen. Warner applauded certain companies for publicly joining voluntary commitments proposed by the Biden administration, but encouraged them to broaden their efforts, and called on companies that have not taken this public step to commit to making their products more secure.

As AI is rolled out more broadly, researchers have repeatedly demonstrated a number of concerning, exploitable weaknesses in prominent products, including abilities to generate credible-seeming misinformation, develop malware, and craft sophisticated phishing techniques. In July, the Biden administration announced that several AI companies had agreed to a series of voluntary commitments that would promote greater security and transparency. However, the commitments were not fully comprehensive in scope or in participation, with many companies not publicly participating and several exploitable aspects of the technology left untouched by the commitments.

In a series of letters sent today, Sen. Warner pressed companies that did not participate, including Apple, Midjourney, Mistral AI, Databricks, Scale AI, and Stability AI, requesting a response detailing the steps they plan to take to increase the security of their products and prioritize transparency. Sen. Warner additionally sent letters to companies that were involved in the Biden administration’s commitments, including Amazon, Anthropic, Google, Inflection AI, Meta, Microsoft, and OpenAI, asking that they extend those commitments to less capable models and also develop consumer-facing commitments – such as development and monitoring practices – to prevent the most serious forms of misuse.

“While representing an important improvement upon the status quo, the voluntary commitments announced in July can be bolstered in key ways through additional commitments,” Sen. Warner wrote.


Sen. Warner also called specific attention to the urgent need for all AI companies to make additional commitments to safeguard against a few highly sensitive potential misuses, including non-consensual intimate image generation (including child sexual abuse material), social-scoring, real-time facial recognition, and proliferation activity in the context of malicious cyber activity or the production of biological or chemical agents.

The letters follow up on Sen. Warner’s previous efforts to engage directly with AI companies to push for responsible development and deployment. In April, Sen. Warner directly called on AI CEOs to develop practices that would ensure that their products and systems are secure. In July, he also pushed the Biden administration to keep working with AI companies to expand the scope of the voluntary commitments.

Additionally, Sen. Warner wrote to Google last week to raise concerns about its testing of new AI technology in real medical settings. Separately, he urged the CEOs of several AI companies to address a concerning report that generative chatbots were producing instructions on how to exacerbate an eating disorder. He has also introduced several pieces of legislation aimed at making tech safer and more humane, including the RESTRICT Act, which would comprehensively address the ongoing threat posed by technology from foreign adversaries; the SAFE TECH Act, which would reform Section 230 and allow social media companies to be held accountable for enabling cyber-stalking, online harassment, and discrimination on social media platforms; and the Honest Ads Act, which would require online political advertisements to adhere to the same disclaimer requirements as TV, radio, and print ads.

Copies of each of the letters can be found here.

###


