
EU opens formal investigation into Facebook and Instagram over child safety, citing concerns about their addictive design.


The European Union is formally looking into Facebook and Instagram due to concerns about child safety, the Commission stated on Thursday. Since the bloc’s Digital Services Act (DSA) internet governance framework began to apply in August of last year, parent company Meta has received a plethora of information demands.

The move is significant because formal proceedings give EU enforcers broader investigative powers, such as the ability to carry out office inspections or impose interim measures. Any confirmed DSA violations could result in fines of up to 6% of Meta’s annual worldwide revenue.

The DSA classifies Meta’s two social networks as very large online platforms (VLOPs). This implies that the business must evaluate and reduce systemic risks on Facebook and Instagram, especially in areas like children’s mental health, in accordance with an additional set of EU-supervised regulations.


Senior Commission officials said in a briefing for the media that they believe Meta is not adequately evaluating and reducing threats to minors.

They raised particular concerns about the social networks’ addictive design and what they called a “rabbit hole effect,” in which the platforms’ algorithmic content recommendation engines may steer a child who views one video toward more and more similar material.

Commission officials cited examples of content that could harm children’s mental health, such as material related to depression or material that promotes an unhealthy body image.

They also worry that children may find it too easy to circumvent the age-assurance techniques Meta employs.

“One of the underlying questions of all of these grievances is: how can we be sure who accesses the service and how effective are the age gates—particularly for avoiding that underage users access the service?” a senior Commission official addressing the media today on background stated. “This is now part of our investigation to see how the steps Meta has implemented in this regard work as well.”

All in all, the EU believes Meta violated DSA Articles 28, 34, and 35. The Commission will now conduct comprehensive examinations of the two platforms’ child protection strategies.

The EU launched a similar investigation into addictive design issues on the video-sharing social network TikTok last month.

The Commission has also already launched two DSA probes into Meta’s social networks: it said last month that it would look into separate concerns about Facebook and Instagram’s handling of election integrity.

When contacted for comment on the latest EU probe, a Meta representative sent us an email saying, “We have spent ten years creating over fifty tools and procedures to ensure that young people enjoy safe, age-appropriate online experiences. We look forward to providing the European Commission with specifics of our efforts, since this is a problem that the whole industry is confronting.”

The company also told us that its method of verifying users’ ages on the two social networks combines self-declared age with AI-based assessments intended to identify minors who may be misrepresenting their age. It said it trains its content reviewers to identify accounts potentially used by minors, and it also allows users to report suspected underage accounts.

Users under the age of eighteen who attempt to change their age on Meta’s platforms are required to select and successfully complete an age verification test, although the company did not specify which age verification tests it offers. It said, however, that internal evaluations of its approach show it is effective at keeping minors within age-appropriate experiences. Since instituting the checks, Meta claims to have prevented 96% of teenagers from changing their Instagram birthdays from under 18 to over 18.
