Meta Platforms, the parent company of Facebook and Instagram, allegedly knew its social platforms were designed to attract and engage young users, yet did not disclose that it had received millions of complaints about underage users on Instagram. Meta disabled only a small fraction of those accounts.

This information comes from a recently unsealed legal complaint, as The Wall Street Journal (WSJ) and The New York Times reported. Initially made public with certain parts redacted, the complaint marked the beginning of a lawsuit filed in late October by the attorneys general of 33 states.

According to the complaint, several Meta officials acknowledged the company intentionally designed its products to exploit aspects of young people's psychology, including impulsive behaviour, susceptibility to peer pressure and the tendency to underestimate risks, the news outlets reported, citing company documents. Some Meta employees also acknowledged in the documents that Facebook and Instagram were popular among children under the age of 13, even though company policy prohibits their use.

In a statement to The Associated Press, Meta said the complaint misrepresents its efforts over the past decade to make teenagers' online experiences safer, noting that it offers "over 30 tools to support them and their parents." Meta argued that preventing younger users from accessing its services is difficult, calling age verification a "complex industry challenge." The company has preferred to shift responsibility for monitoring underage usage to app stores and parents.
It has specifically voiced support for US federal legislation that would require app stores to obtain parental consent before youths under the age of 16 download apps.

According to the WSJ report, a Facebook safety executive suggested in a 2019 email that taking stricter measures against younger users could hurt the company's business. The WSJ reported the same executive expressed frustration a year later, noting that while Facebook had actively studied the behaviour of underage users for business purposes, it had shown far less enthusiasm for finding ways to identify younger children and remove them from its platforms.

According to the newspaper reports, the complaint said Meta at times has a backlog of as many as 2.5 million accounts belonging to younger children that are "awaiting action."