X accused of using data without permission to train Grok

X, the platform formerly known as Twitter that Elon Musk purchased, recently took a new step toward training Grok.

Last month, X automatically enabled a setting that allows Grok to be trained on user posts. The setting, which is on by default, can now be turned off: in the settings menu, open the relevant page and disable the option labeled "Allow your posts as well as your interactions, inputs, and results with Grok to be used for training and fine-tuning." Explaining the setting, X stated: "We may use your X posts as well as your interactions, inputs, and results with Grok for training and fine-tuning to continually improve your experience. This means your interactions, inputs, and results may also be shared with our service provider xAI for these purposes."

The issue came to the fore again today because the move was not well received in the EU. The Irish Data Protection Commission (DPC) said it has taken X to court over the unauthorized use of user data to train Grok; the company could face serious penalties and be forced to make significant changes to the process.

Before this, Elon Musk's xAI had made headlines with its 100,000 liquid-cooled Nvidia H100s for Grok. Last week, xAI, with the support of X, Nvidia, and several other companies, commissioned a server center that brings together exactly 100,000 liquid-cooled Nvidia H100 GPUs (very powerful processors focused on AI training).

Elon Musk called this center "the world's most powerful artificial intelligence training system" and said it will allow Grok to become much more advanced and intelligent in a short time. Musk also said, "The infrastructure will provide a significant advantage for training the world's strongest AI by any metric by December of this year." The center cost the company a great deal of money, but it gives xAI a much stronger hand in training large language models (LLMs), which are at the heart of generative artificial intelligence.

Musk had previously said the following on this matter: "The reason we decided to build the 100,000-H100 system and the next big system in-house is that our competitiveness depends on being faster than other AI companies. That's the only way to catch up. If our fate depends on being the fastest by far, we should be the ones with our hands on the steering wheel, not a backseat passenger."
