
NIST Launches Generative AI Working Group


Even as security companies continue releasing products and features that leverage advanced artificial intelligence (AI), researchers keep warning about the security holes and dangers such technology creates. To help formulate guidance on how generative AI, in particular, can be implemented more safely, the National Institute of Standards and Technology (NIST) announced the formation of a new working group.

Following January’s release of the AI Risk Management Framework (AI RMF 1.0) and the March debut of the Trustworthy and Responsible AI Resource Center, NIST launched the Public Working Group on Generative AI on June 22 to address how to apply the framework to new systems and applications. The group will begin its work by developing a profile for generative AI use cases, then move on to testing generative AI systems, and finish by evaluating how the technology can be applied to global challenges in health, climate change, and the environment.

Generative AI has been a source of experimentation, concern, and intense business interest lately, especially since the launch of ChatGPT in November 2022 brought the state of the art into the public eye. To ensure that the working group takes the current temperature of the developer and security communities, NIST said it will join the AI Village at DEF CON 2023 in Las Vegas on Aug. 11.

More information on the NIST generative AI working group is available on its website, including a series of video conversations with industry figures. To read the National Artificial Intelligence Advisory Committee’s new Year 1 Report in full, visit the NAIAC site.
