Wired: Facebook algorithms serve negative content to new moms

In a recent article, the left-leaning tech publication Wired describes how the Facebook-owned Instagram's algorithm repeatedly showed a young mother videos and photos of sick, dying, or dead children. The constant stream of negative content on social media platforms is particularly harmful to young women and teenage girls, and China's TikTok only adds to the negativity.

In an article titled "Instagram Shows Me Childhood Tragedies," Wired reported that a young mother was regularly shown videos and photos of sick, dying, or dead babies after Instagram's algorithm, which had been tracking her online activity, learned that she had recently had a baby. After months of the ordinary worries new parents have about pregnancy and their children, Instagram's algorithm picked up on those concerns and amplified them.

Mark Zuckerberg discusses Instagram (AFP/Getty)

Shou Zi Chew, CEO of TikTok Inc. (Photographer: Christopher Goodney/Bloomberg)

The Wired author writes:

But there was something on my screen in my first year of motherhood that continued to shock and frighten me. In sleep-deprived scrolling through my feeds, I found myself spellbound by stories of babies and children who were sick, dying, or dead. Between recipe breakdowns and home-repair clips on TikTok, there were videos of mothers mourning the untimely deaths of their children. My Instagram Explore page often included accounts dedicated to, or memorializing, children with serious health problems and birth defects. I looked up from my phone crying over kids I didn't know so often that my husband (kindly, soberly) suggested a social media break.

Despite the heartbreak they cause, these videos keep appearing on my screen for a reason: because I watch them. Avidly. I remember the names and conditions of these at-risk children, living with Sanfilippo syndrome or undergoing chemotherapy, or recently dead of myocarditis or SIDS. I remember their siblings and their favorite things. I look for them. If they have died, I check in on their parents. As a tourist in this country of sick children, I came to understand the jarring jargon of digitally mediated death, such as "so-and-so gained their wings" and the chilling "happy birthday in heaven!" All social platforms inherently require participation, and I was participating in spades.

Am I consuming content about sick and dead children for entertainment, the way one might watch a horror movie? I think there is an overlap between my behavior and the habits of true-crime fans, who so eagerly devour gruesome accounts of real violence, including child abduction, and seek out content on all things murder and blood related. One theory holds that the popularity of true crime, especially among women, is tied to their fear of becoming victims of crime. Watching it can provide a controlled release, a chance to vent pent-up worries. This is undoubtedly related to my own anxiety.

Instagram is also toxic to teenage girls, according to Facebook's own internal research. The Wall Street Journal published the company's internal studies outlining these findings as part of its "Facebook Files" series.

According to a slide presentation posted in March 2020 on an internal Facebook message board reviewed by The Wall Street Journal, "Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse." "Comparisons on Instagram can change how young women view and describe themselves."


"We make body image issues worse for one in three teen girls," said a 2019 slide summarizing research on teenage girls who experience these issues.

"Teens blame Instagram for increases in the rate of anxiety and depression," said another slide. "This reaction was unprompted and consistent across all groups."

These issues exist on other platforms as well, such as the wildly popular TikTok. In 2021, Breitbart News reported that the popular Chinese video app TikTok used its algorithm to serve videos featuring illegal drug use and explicit sexual content to minors.

At the time, the report told the story of a 13-year-old TikTok user who searched the app for "OnlyFans," the name of a subscription website primarily used to host pornographic content. The underage user watched several videos, including two advertising access to pornography.

Returning to the TikTok "For You" feed, which shows users content based on their interests and the videos they have previously watched, the 13-year-old was served several videos tagged with the hashtag "Sex." These included role-playing videos in which people pretend to be in caregiver relationships. In one video, the WSJ said, a male voice-over pointed to a woman wearing a latex dress and said, "Don't be afraid to cry. You know that's your daddy's favorite."

The Wall Street Journal noted that the 13-year-old user TikTok served these videos to does not actually exist; the account was one of dozens of automated accounts the WSJ created to study the TikTok algorithm.

The WSJ authors write:

TikTok served an account registered as a 13-year-old at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia. Hundreds of similar videos appeared in the feeds of the Journal's other minor accounts.

TikTok also showed the Journal's young users more than 100 videos from accounts recommending paid pornography sites and sex shops. Thousands of others were from creators who labeled their content as for adults only.

You can read more about TikTok on Breitbart News here, and more about Instagram at Wired here.

Source: Breitbart
