Samsung: Samsung may be ‘limiting’ use of ChatGPT for employees, here’s why – Times of India

There have been some use cases of ChatGPT where users have been advised to tread with caution. There is no denying that ChatGPT is a useful tool for getting a lot of work done in the office. However, it appears that three Samsung employees ended up leaking confidential information to the chatbot.
According to The Economist Korea, Samsung employees "accidentally" ended up sharing trade secrets with the chatbot. The report states that engineers in Samsung's semiconductor division were allowed to use the chatbot to check source code.
What did the employees do?
As per the report, one employee asked ChatGPT to look for errors in confidential source code. A second employee requested code optimisation from ChatGPT and shared code with it. The report notes that a third employee shared a recording of a company meeting because they wanted ChatGPT to turn it into notes for a presentation. All of this information is now with ChatGPT and is considered sensitive. The ChatGPT model retains the information it receives and then trains on it to become smarter.
What has Samsung’s response been?
According to the report, Samsung is limiting the use of ChatGPT by employees. While a blanket ban has not been enforced, the company is restricting the length of prompts, or questions, that employees can ask to 1,024 bytes per person. The company is also conducting an investigation into the employees who were involved in the leak.
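The 1,024-byte cap reported above is a limit on the encoded size of a prompt, not its character count: in UTF-8, non-ASCII text such as Korean takes several bytes per character, so the same byte budget allows far fewer Korean characters than English ones. A minimal sketch of how such a limit could be checked (the function name and constant are illustrative assumptions, not Samsung's actual tooling):

```python
# Illustrative sketch of a byte-based prompt limit. The names here are
# hypothetical; this is not Samsung's actual implementation.

MAX_PROMPT_BYTES = 1024  # the per-person cap reported by The Economist Korea

def within_prompt_limit(prompt: str) -> bool:
    """Return True if the UTF-8 encoding of the prompt fits the cap."""
    return len(prompt.encode("utf-8")) <= MAX_PROMPT_BYTES

# A byte limit is stricter than a character limit for non-ASCII text:
# each Hangul syllable is 3 bytes in UTF-8, so roughly 341 Korean
# characters already exceed 1,024 bytes.
print(within_prompt_limit("a" * 1024))  # 1024 ASCII chars = 1024 bytes -> True
print(within_prompt_limit("한" * 400))  # 400 * 3 = 1200 bytes -> False
```

This is why a byte cap, rather than a character cap, is the natural way to bound how much data each request can carry to an external service.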
As for ChatGPT itself, OpenAI has made it very clear that users should not share any confidential information with the chatbot. OpenAI says that it is not able to delete specific prompts from your history. "Please don't share any sensitive information in your conversations," the company categorically states. This is because, OpenAI says, users' conversations may be reviewed by its AI trainers to improve its systems.
