
Instagram owner Meta fined €405 million over handling of teens’ data: Report

The Irish data watchdog fined Instagram owner Meta €405 million (£349 million) for allowing teenagers to create accounts that publicly displayed their phone numbers and email addresses. The penalty was confirmed by the Data Protection Commission after a two-year investigation into potential violations of the European Union’s general data protection regulation (GDPR).

Instagram had previously allowed users aged 13 to 17 to operate business accounts on the platform, which displayed the users’ phone numbers and email addresses. The DPC also found that the platform’s registration system set 13-to-17-year-old users’ accounts to ‘public’ by default. Because Meta’s European headquarters are in Ireland, the DPC regulates the company on behalf of the entire EU.

The penalty is the highest imposed by the watchdog on Meta, following a €225 million fine in September 2021 for ‘severe’ and ‘serious’ GDPR violations at WhatsApp and a €17 million fine in March this year. The fine is the second-largest under GDPR, trailing only Amazon’s €746 million fine in July 2021. ‘We adopted our final decision last Friday and it does contain a fine of €405 million,’ a DPC spokesperson said. The decision’s full details will be published next week.

Caroline Carruthers, the owner of a UK data consultancy, said that Instagram did not consider its privacy responsibilities when allowing teenagers to set up business accounts and showed an ‘obvious lack of care’ in its user privacy settings. ‘The GDPR includes special provisions to ensure that any service aimed at children meets a high standard of transparency. Instagram fell foul of this when child accounts were set to open by default rather than private.’ Following revelations about Instagram’s impact on teen mental health, Meta halted work on a version of the app for children last year.

Instagram said it had ‘paused’ work to address concerns raised by parents, experts, and regulators. The move came after a whistleblower, Frances Haugen, revealed that Facebook’s own research showed Instagram could affect girls’ mental health on issues such as body image and self-esteem. Instagram stated that prior to September 2019, it had displayed user contact information on business accounts and informed users of this during the setup process. When under-18s join the platform, their account is now automatically set to private.

‘This was a major breach with significant safeguarding implications and the potential to cause real harm to children using Instagram,’ said Andy Burrows, head of child safety online policy at the NSPCC. ‘The ruling demonstrates how effective enforcement can protect children on social media while also highlighting how regulation is already making children safer online’.

According to a Meta spokesperson, ‘this inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private. When anyone under the age of 18 joins Instagram, their account is automatically set to private, so only people they know can see what they post, and adults cannot message teens who do not follow them. While we cooperated fully with the DPC throughout their investigation, we disagree with how this fine was calculated and intend to appeal it. We are still carefully reviewing the rest of the decision’.
