Samsung Employees Caught Leaking Data to ChatGPT

Jakarta – The sophistication of ChatGPT can also have a downside, as Samsung employees have learned: three of them were reportedly caught leaking company data to ChatGPT.

As Telset reported, citing Engadget on Saturday (08/04/2023), the case began when Samsung’s semiconductor division started allowing its engineers to use ChatGPT.

Rather than simply using the chatbot to help with their work, the three Samsung employees ended up leaking company data to it, each in a different way.

The first asked the chatbot to check sensitive database source code for errors, the second requested code optimization from ChatGPT, and the third fed a recording of an internal meeting into it.

Samsung quickly discovered these actions and treated them as data leaks, since the employees had indirectly shared the company’s sensitive data with an outside application, ChatGPT.


After learning of the security lapse, Samsung attempted to limit future indiscretions by restricting the length of employees’ ChatGPT prompts to one kilobyte, or 1,024 characters of text.

The company is also investigating the three employees and building its own internal chatbot to prevent similar incidents. Meanwhile, OpenAI, the maker of ChatGPT, has long advised users not to share confidential information with the chatbot.

That is because the chatbot cannot delete specific prompts from your history. The only way to remove personally identifiable information from ChatGPT is to delete your account, a process that can take up to four weeks.


The Samsung story is another example of why caution is necessary when using a chatbot, just as with any other application, because you never really know where your data will end up.

Hopefully this case serves as a reminder that data is worth protecting, and that ultimately only we can keep our own data secure.