A report raises the alarm about age ratings on the App Store. Some apps rated as suitable for children aged four and over expose the youngest users to “inappropriate and risky” content. Caution is required.


Security is one of Apple’s main arguments for convincing consumers to buy its products and join its ecosystem. Because the Cupertino company has absolute control over its devices and the services they access, it can, in principle, ensure that its application store, the App Store, offers only quality, properly vetted content.

That is the theory, at least. While it is true that the App Store hosts fewer malicious applications than the Google Play Store, that does not mean users can browse it with their eyes closed, especially minors. Children are not safe from stumbling upon problematic applications: until recently, for example, Apple’s store hosted Rencontres Ados, which turned out to be a genuine hunting ground for pedophiles and other sexual predators (see our article).

The App Store classifies apps into four age categories: 4+, 9+, 12+ and 17+. According to the company, parents should not have to worry about content inappropriate for their children, since this sorting prevents such apps from being offered to them.

Yet a report by two child-safety NGOs, Heat Initiative and ParentsTogether Action, warns that numerous applications downloadable from Apple’s app store are presented as appropriate for children when they are not at all. The associations identified a sample of 200 “unsafe or inappropriate” applications presenting “potential risks similar to those of social media, such as sexual exploitation, eating disorders and harassment.” Most of these are chat, beauty, diet or weight-loss, internet-access and gaming apps.

App Store: problematic “children’s” apps

The two NGOs examined 800 applications on the App Store and found that 200 of them were inappropriate for minors despite a rating stating the opposite. Together, these apps have been downloaded more than 550 million times. Some are apps for chatting with strangers, such as Random Chat, or with an AI, such as AI Girlfriend, a virtual girlfriend simulator.

The JustTalk Messenger Kids app particularly caught the NGOs’ attention: although it is supposed to let children chat securely with family members or friends “without being exposed to inappropriate content or interference from strangers”, it turns out in reality to be frequented almost exclusively by pedophiles. With this type of messaging, young users stand a high chance of coming face to face with inappropriate content, or even malicious people. And AI chatbots are hardly better, since those conversations can quickly get out of hand…

© Heat Initiative and ParentsTogether Action

Others are tools that bypass restrictions and parental controls, or apps offering explicit content, such as sexual games (running around naked outdoors, simulating suggestive photo shoots, etc.) or content promoting violence.

The associations also flagged beauty- and body-related apps that promote unrealistic beauty standards, via photo editing or body-analysis tools, and dangerous eating practices, with some encouraging users to fast 20 hours a day or setting calorie targets tantamount to starvation. And that is only a sample of the thousands of apps on the App Store. In short, the findings are hardly flattering for Apple…

App Store: apps not verified due to financial interests

We are therefore a long way from Apple’s marketing promises. The Cupertino company nonetheless claims that “The App Store is a safe and trusted place to discover and download apps” and assures parents that it is “easy to make sure their kids are interacting with age-appropriate content.” In reality, it delegates legal responsibility for age ratings to developers without carrying out serious checks.

Apple nevertheless claims to take multiple precautions. The App Store has “more than 500 specialists around the world” responsible for reviewing “more than 100,000 apps” per week, the company told the Wall Street Journal, adding that “more than a million apps have been rejected due to offensive, harmful, dangerous or illegal content.” That works out to 40 to 50 “verified” apps per person per day, which leaves little time for in-depth examination…
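For the record, the arithmetic behind that estimate (assuming a five-day working week) is straightforward: 100,000 apps per week ÷ 500 reviewers = 200 apps per reviewer per week, i.e. 200 ÷ 5 ≈ 40 apps per reviewer per working day, or barely ten minutes per app.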

© Heat Initiative and ParentsTogether Action

According to the report, the situation is in fact linked to financial motivations: laxer controls favor downloads, on which Apple earns commissions. As long as those responsible for rating applications have a financial interest in making them accessible to as many people as possible, the problem will persist.

The two NGOs therefore recommend that Apple call on independent experts to rate applications by age. “Just as with films, television programs and video games, these experts would assess the risk to children and assign an age rating in the interest of children, not that of Apple or app development companies,” they argue. Alternatively, the company could scale back its safety promises, specifying that its checks, which rely partly on user reports, reduce the risk of problematic applications but are no guarantee of absolute security.

Note, however, that responsibility does not rest with Apple alone. The report highlights the crucial role of parents and guardians in protecting children from inappropriate content, which means activating parental controls and raising awareness of the dangers of the internet. One may also ask whether it is reasonable, on any level, to put devices as expensive as iPhones or iPads into the hands of very young children. But as we well know, Apple remains a status symbol, including among the younger generations…
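As a practical reminder, on an iPhone or iPad these controls live in Screen Time: under Settings > Screen Time > Content & Privacy Restrictions, a parent can limit which apps are allowed by age rating (4+, 9+, 12+ or 17+) and block explicit content, which at least filters out apps that are honestly rated.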
