Striking claim: ‘Apple spends millions of dollars every day, the reason is…’

It is not yet possible to say for certain whether, or exactly when, Apple's artificial intelligence will arrive, but claims on the subject have been circulating for a long time. Rumors that Apple may launch a competitor to ChatGPT have spread recently, although the company has made no official statement so far. Even so, Apple CEO Tim Cook has kept talk of a generative AI effort from Apple on the agenda by calling generative artificial intelligence "very interesting". Now new claims have emerged on the subject, and they are quite remarkable…

CLAIM: IT SPENDS MILLIONS OF DOLLARS EVERY DAY! THE AREA IT FOCUSES ON IS…

According to a report from Apple Insider, Apple has increased its AI development budget to focus on building chatbot features for Siri. The company also allegedly spends millions of dollars every day on research and development (R&D).

Four years ago, Apple's head of artificial intelligence, John Giannandrea, is said to have formed a team to work on large language models (LLMs), the technology underlying generative AI chatbots such as ChatGPT.

A CONVERSATIONAL AI TEAM IS SAID TO EXIST: EVEN THOUGH IT HAS ONLY 16 MEMBERS…

Apple's conversational AI team is reportedly called the "Foundational Models team" and is led by Ruoming Pang, who previously worked at Google for 15 years. The team allegedly has a significant budget and spends millions of dollars a day training advanced LLMs.

Despite having only 16 members, the team's progress is said to rival that of OpenAI, which has spent more than $100 million training a comparable LLM.

“AT LEAST TWO MORE TEAMS…”

At least two more teams at Apple are working on language and image models, The Information claims. One group allegedly focuses on visual intelligence, producing images, videos and 3D scenes, while the other works on multimodal AI that can process text, images and videos.

According to another claim, Apple's current plan is to integrate LLMs into its voice assistant Siri. This would let users automate complex tasks using natural language, similar to Google's efforts to improve its own voice assistant.

It is also claimed that Apple believes its advanced language model, called Ajax GPT, is better than OpenAI's GPT-3.5.
