OpenAI, the company behind ChatGPT, has launched a “Bug Bounty Program.” With it, the company wants to bring security researchers everywhere into the process of securing its systems.
With the “Bug Bounty Program” announced here, OpenAI, the developer of ChatGPT, is taking a crucial step toward making its systems more secure. Security researchers who find vulnerabilities in OpenAI’s systems within the scope of the program will be able to receive rewards ranging from 200 to 20 thousand dollars. In its statement on the subject, the company said: “OpenAI’s mission is to create AI systems that benefit everyone. To that end, we invest heavily in research and engineering to ensure our AI systems are safe and secure. However, as with any complex technology, we recognize that vulnerabilities and flaws can arise. We believe transparency and collaboration are crucial in addressing this situation. That’s why we invite the global community of security researchers, ethical hackers and tech geeks to help us detect and fix vulnerabilities in our systems.”

The company is following a genuinely rational path here. Technological systems, and artificial intelligence in particular, have become so complex that even the people who build them can unknowingly introduce security vulnerabilities that only outsiders end up finding.
The Bug Bounty Program comes on the heels of a genuinely serious incident. For those who missed it, ChatGPT went down for a while last month, and users who opened the site during that period could see other people’s chat histories. The details of those conversations were not visible, but their titles were clearly displayed. This understandably caused quite a stir, because it was a real breach of privacy. People are aware that their conversations with ChatGPT can, in a sense, be seen by others: there are people working behind these systems, and they use the conversations to develop the “GPT” language model on which the AI is built. But having even a small portion of those conversations exposed to ordinary users, rather than to developers, did not go over well.
Sam Altman, CEO of the developer company OpenAI, commented on the matter: “We had a major issue with ChatGPT due to a bug in an open source library. The fix for this issue has been released. Due to this problem, a small percentage of users have seen the threads of other users’ conversation history. We are very sorry about that.” Following that, an official statement came from the company, quoted directly: “We shut down ChatGPT on Monday to fix an open-source library bug that was causing some users to see threads in other users’ chat history. Our research during this period revealed that 1.2 percent of ChatGPT Plus users may have had their personal data exposed to other users. We believe the number of users whose data is exposed to others is extremely low and we are contacting people who may have been affected. We take this matter very seriously and continue to investigate.”
We took ChatGPT offline Monday to fix a bug in an open source library that allowed some users to see titles from other users’ chat history. Our investigation has also found that 1.2% of ChatGPT Plus users might have had personal data revealed to another user. 1/2
— OpenAI (@OpenAI) March 24, 2023