
ChatGPT bug that shared conversation titles with other users has been fixed


A ChatGPT bug allowed some people to see the titles of other users’ searches (Picture: Rex)

ChatGPT owner OpenAI said it has fixed a bug that caused a ‘significant issue’, allowing a small set of users to see the titles of other users’ conversation history with the viral chatbot.

As a result of the fix, users will not be able to access their chat history from between 8am and 5pm GMT on March 20, chief executive Sam Altman said in a tweet.

ChatGPT has seen meteoric growth since its launch late last year, as people worldwide got creative with prompts that the conversational chatbot uses to create everything from poems and novels to jokes and film scripts.

Last week, Microsoft Corp-backed OpenAI launched its artificial intelligence model GPT-4, an upgrade from GPT-3.5, which was made available to users in November.

The latest version has proven significantly more powerful than its predecessor, quickly executing complex prompts, including completing tax returns and generating complex sentence structures.

In its launch demonstration, OpenAI also showcased GPT-4’s ability to read images, creating a website simply from a hand-drawn sketch of the design.

The integration of OpenAI’s GPT technology into Microsoft’s Bing has driven people to the little-used search engine, according to data from analytics firm Similarweb.

Earlier this month, tech directors from the National Cyber Security Centre warned users not to share personal or sensitive data with ChatGPT and other large language models, not only because of the risk of malicious or accidental leaks, but also because of the potential for developers to use chats when training the models in future.


MORE : Here are the jobs most at risk of being replaced by ChatGPT – is your job safe?


MORE : Google rolls out ChatGPT rival, Bard, across the UK

