
Samsung Bans ChatGPT Use for Company Amid Security Concerns


Samsung has joined the list of organizations that have banned the use of ChatGPT, a foreign AI technology, citing the risk that sensitive information could be leaked through the service. The decision follows reports that the chatbot may harm companies and their clients by exposing confidential information. The Verge reported that Samsung has warned its employees not to use AI-powered assistants for business purposes.

“We take data security and privacy very seriously,” said a Samsung representative. “After conducting a thorough review, we have decided to ban the use of the AI-based tool within our organization to ensure the protection of our sensitive information.”

The decision, announced on May 3, 2023, takes effect immediately. Other major tech companies, including Apple and Google, have also prohibited the use of AI-powered assistants within their organizations over similar security concerns.

“While AI-based tools have the potential to revolutionize the way companies operate and create content, it is critical to balance the benefits with the risks,” said a well-known tech expert. Companies are now more conscious of the possible risks associated with foreign AI technologies and are taking steps to reduce them.

Despite ChatGPT's potential benefits for marketing campaigns and content creation, many organizations consider the risk of sensitive-information leaks too great to overlook. As the technology continues to advance, similar issues are likely to arise, and each enterprise will need to decide how best to protect sensitive information while maintaining client trust.
