Internet giant Google has long been running tests aimed at making search results more social through notes.
In November 2023, Google began testing its "Notes" infrastructure with a limited group of users. The goal was to include reader comments directly in search results: for some web pages in Google Search results, a story-like format displayed public comments, or "notes," as the company preferred to call them. In the first stage, the feature worked only in the Google mobile application and used the stylish interface you can see in the video above. With this system, people could, for example, share notes about the quality of a recipe that appeared in the search results; other people who came across the same recipe could see those notes and like the ones they found useful. Users could also attach a photo of the dish they cooked using the recipe. However, the Notes feature was not successful: after months of testing, Google announced today that it has been canceled. The company, which did not get the results it wanted from these trials but says it learned important lessons, is now focusing on an artificial intelligence-based search infrastructure, and things are not going well there either.
The internet giant wants to bring generative artificial intelligence directly into search results, and to that end it has been running tests under the name "Search Generative Experience (SGE)." The idea behind these tests is that, instead of simply listing site results for a search term as it does now, the search engine produces direct answers through generative artificial intelligence.
This infrastructure produces detailed answers about the topics people are searching for; in addition to content published on websites, it relies on a specially trained large language model. The system cites the sources of the information it pulls from websites and even suggests possible follow-up questions.
So how well does this system actually perform? The answers it produces are not always good, and sometimes genuinely "incredibly bad" results emerge. For example, in response to the search "cheese does not stick to pizza," the system drew on an 11-year-old nonsensical Reddit post and suggested using glue to fix the problem.
This response caused a huge stir when it landed on X, and it was not the system's only mistake; one user was told it would be fine to eat one stone a day. The system, which keeps making noise with wrong answers like the ones you can see here, is being cleaned up manually, one bad result at a time: Google monitors the posts shared on social media and immediately removes the nonsensical responses, but it has not yet made the infrastructure fully reliable. The system reportedly still recommends using glue on pizza; while other generative AI systems say this should not be done, Google's has still not learned the truth.
Google had previously acknowledged its mistakes and promised to make the system better. The company stated that many of the screenshots circulating were fake and says it has implemented several measures to improve results. These include better detection mechanisms for "nonsense" queries, limits on the inclusion of humorous content in results, and restrictions on the use of user-generated content in misleading answers. The company is also introducing advanced protection systems for critical topics such as health to ensure higher accuracy and reliability. Despite all the negative feedback, Google will not shut the system down and will continue to develop it.