


British Intelligence Agency Warns Of Dangers Posed By AI Chatbots

The UK intelligence agency has warned users about the risks of AI chatbots.

The latest technological innovations, such as ChatGPT and competing chatbots, are making people curious and wary at the same time. Like any other technology, they bring benefits as well as security risks.

The UK's leading security body, the National Cyber Security Centre (NCSC), has highlighted the harm these chatbots can cause and cautioned users not to enter personal or sensitive information into the software, to avoid the potential hazards.

Two of the agency's technical directors, David C and Paul J, discussed the primary causes for concern (privacy leaks and use by cybercriminals) on the National Cyber Security Centre's blog.

The experts said in the blog that "large language models (LLMs) are undoubtedly impressive for their ability to generate a huge range of convincing content in multiple human and computer languages. However, they're not magic, they're not artificial general intelligence, and they contain some serious flaws."

According to them, the tools can get things wrong and "hallucinate" incorrect facts; they can also be biased and are often gullible.

"They require huge compute resources and vast data to train from scratch. They can be coaxed into creating toxic content and are prone to 'injection attacks'," wrote the technical directors.

For instance, the NCSC team states: "A question might be sensitive because of data included in the query, or because of who is asking the question (and when). Examples of the latter might be if a CEO is discovered to have asked 'how best to lay off an employee?', or somebody asks revealing health or relationship questions.


“Also bear in mind the aggregation of information across multiple queries using the same login.”


