
Did Facebook and Instagram use “aggressive tactics” in targeting children?

According to a lawsuit alleging that children have been harmed by Facebook and Instagram, Meta deliberately used “aggressive tactics” that included getting children hooked on social media “in the name of growth”.

A Meta software engineer said it is “no secret” how Facebook and Instagram use their algorithms to encourage repetitive and compulsive use among minors, regardless of whether or not the content is harmful, calling the practice “absolutely deplorable”.

The redacted findings appear in a lawsuit against Meta seen by the Daily Mail.

While CEO Mark Zuckerberg has publicly said that claims his company prioritizes profit over safety and well-being are false, the filings describe child sexual exploitation on both platforms and allege that Meta’s recommendation algorithm promoted extremist content to drive more engagement.

The document states that 20% of nine- to 13-year-old users on Facebook and Instagram have had a sexual experience with an adult on the sites.

That is despite Meta’s “zero-tolerance policies prohibiting abuse like child exploitation”.

The Daily Mail contacted Meta, which did not comment on specific questions.

A spokesperson for the plaintiffs’ court-appointed lead lawyers told the Daily Mail: “These never-before-seen documents show that social media companies treat the youth mental health crisis as a public relations issue rather than an urgent societal problem brought on by their products. This includes burying internal research documenting these harms, blocking safety measures because they decrease ‘engagement’, and defunding teams dedicated to protecting young people’s mental health.”

The lawsuit, filed February 14 in California on behalf of the parents involved, says more than a third of 13- to 17-year-olds reported using one of the defendants’ apps “almost constantly” and admitted this was “too much”.

The complaints, which were later consolidated into several class-action lawsuits, claimed that Meta’s social media platforms were designed to be dangerously addictive, leading children and teenagers to consume content that increases the risk of sleep disorders, eating disorders, depression and suicide.

The case also contends that children and young people are especially vulnerable to the negative effects of social media.

An unredacted version of the complaint was released on March 10.

It says Thorn, an international anti-trafficking organization, published a report in 2021 detailing sexual exploitation issues on Facebook and Instagram and “shared these insights with Meta”.

The Thorn report found that “blocking or reporting [offenders] does not protect minors from continued harassment”, and that 55% of the report’s respondents who had blocked or reported someone said the person recontacted them online.

The unsealed complaint also claims that 80% of “violating adult/minor connections” on Facebook resulted from the platform’s “People You May Know” feature.

An internal study conducted in or around June 2020 found that 500,000 underage Instagram accounts “receive IIC”, which stands for “inappropriate interactions with children”, on a daily basis, according to a redacted statement on pages 135-136 of the document.

Meta has since developed tools aimed at reducing inappropriate interactions between adults and young people.

The company has developed technology that allows it to find accounts with potentially suspicious behavior and prevent those accounts from interacting with youth accounts.

Meta says it does not show teen accounts to these adults when they browse the list of people who liked a post or view an account’s followers or following list.

But these changes were made after 2020.

The complaint also states that Meta considered making teen user profiles “private by default” in July 2020, but decided not to after weighing the “growth impact” against “safety, privacy and policy wins”.

In a redacted section on page 135, the lawsuit says that allowing adults to contact children on Instagram “enraged Apple to the point of threatening to remove us from the App Store”, and that the company had no timeline “for when we will stop adults from messaging minors on IG Direct”.

However, Meta moved in November 2022 to make teen user accounts private by default.

A Meta spokesperson told the Daily Mail: “The allegation that we defunded work to support people’s well-being is false.”

The redacted version of the complaint states: “Instead of ‘taking it seriously’ and ‘launching new tools’ to protect children, Meta did the opposite.”

By late 2019, the complaint continues, Meta’s mental health team “stopped doing things”, was “defunded” and “disappeared”. As noted, Meta allowed safety tools it knew were broken to be held out as fixes.

A Meta spokesperson said the company has in fact increased funding, as this is a top priority for the company, demonstrated by the more than 30 tools it offers to support young people and families; today, hundreds of employees work across the company to build features for this purpose.

Meta acknowledges that Instagram users at risk of suicide or self-harm are “more likely to encounter more harmful suicide and self-harm content” through the platform’s discovery and follow recommendations.

The lawsuit alleges that Meta’s public statements about the importance of children’s safety were never taken seriously within the company.

Source: Daily Mail

Source: RT Arabic
