My parents love to tell the story of a couple they met who vowed their kids would watch only PBS if they could help it. This was the 1970s, before mobile phones, video game systems, or much of anything beyond the three major networks, PBS, and small independent stations broadcasting reruns of “Our Miss Brooks” and B-movies with titles like “Disembodied Venusian Alien Hands Assault.” Keeping the vow should have been a piece of cake, if it were possible at all.
Imagine my parents’ surprise when they arrived one Saturday afternoon to find their friends’ 6-year-old twins watching “Starsky & Hutch” alongside their defeated parents.
My father said, “I had no idea ‘Starsky & Hutch’ was on PBS now.” The girls were perplexed, the parents were agitated, and a story was born.
There’s a larger point here: even with the best of intentions, it’s difficult for parents to monitor their children’s media use. This is particularly true in 2021, when the stakes are a little higher than your children witnessing a drug dealer murder one of Starsky’s informants or wondering what Huggy Bear does for a living.
With social media and the internet, your child can be exposed to material and people that aren’t just obscene or repellent but genuinely dangerous; parents in 2021 must make sure their children can’t wander into the internet’s seedier corners, especially where predators may be lurking.
Given all of this, it’s surprising that Mark Zuckerberg’s Facebook is pitching a version of its Instagram platform for children under the age of 13 as a tool parents can use to combat this, insisting that the product won’t be a serious target for sexual predators. The Menlo Park, California-based social media giant claims the platform will provide “parents with visibility and control over what their children are doing.”
Forty-four state attorneys general are likely to share your skepticism.
According to The Associated Press, the 44 top state law officials cited Facebook’s “checkered record” in shielding minors from dangers such as cyberbullying and pedophilia in a letter sent to the company on Monday.
“It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account,” reads the letter, which was spearheaded by Massachusetts Attorney General Maura Healey.
“The attorneys general urge Facebook to abandon its plans to launch this new platform.”
Children under the age of 13 are not normally allowed on Instagram, and there does not seem to be widespread demand for such a product, at least among parents, who will likely be more stringent about a young child’s media diet than they would be with an older adolescent’s.
Though Healey spearheaded the letter, bear in mind that it was signed by a bipartisan group of attorneys general. It brought together New York Attorney General Letitia James, who is currently attempting to sue the NRA out of existence, and Texas Attorney General Ken Paxton, one of the most outspoken critics of the Democrats’ latest push for strict gun control legislation. If you can get those two to agree on something, it’s a sign that this is a terrible idea.
According to the letter, Facebook has a bad track record when it comes to keeping its users secure.
“Facebook has a record of failing to protect the safety and privacy of children on its platform, despite claims that its products have strict privacy controls. Reports from 2019 showed that Facebook’s Messenger Kids app, intended for kids between the ages of six and 12, contained a significant design flaw that allowed children to circumvent restrictions on online interactions and join group chats with strangers that were not previously approved by the children’s parents,” the letter said.
“Just recently, a ‘mistake’ with Instagram’s algorithm promoted diet content to users with eating disorders, where the app’s search function recommended terms including ‘appetite suppressants’ and ‘fasting’ to vulnerable people who were at risk of relapsing.”
The letter also cited research that “increasingly shows that social media can be detrimental to children’s physical, emotional, and mental well-being,” including multiple studies that found, among other things, that viewing selfies resulted in “lower self-esteem” and “lower life satisfaction.”
According to another report cited in the letter, “Instagram… exploits young people’s fear of missing out and desire for peer approval to encourage children and teens to constantly check their devices and share photos with their followers.” The same report says, “The platform’s unwavering emphasis on appearance, self-presentation, and branding poses a threat to adolescents’ privacy and well-being.”
According to the letter, Facebook told a congressional hearing in March that “the evidence we’ve seen is that using social media to communicate with other people may have health benefits.”
In addition, the letter argues, those under the age of 13 are “simply too young to navigate the nuances of what they experience online, including pornographic content and online relationships where other users, including predators, can cloak their identities using the anonymity of the internet.”
“One report found an increase of 200% in recorded instances in the use of Instagram to target and abuse children over a six-month period in 2018, and UK police reports documented more cases of sexual grooming on Instagram than any other platform.”
Bear in mind, that was on a platform that isn’t even designed for children under the age of 13.
The project has yet to go live. BuzzFeed News first broke the story in March after obtaining leaked Facebook memos.
“I’m excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list,” Vishal Shah, Instagram’s vice president of product, allegedly said in a post on an employee message board.
“We will be building a new youth pillar within the Community Product Group to focus on two things: (a) accelerating our integrity and privacy work to ensure the safest possible experience for teens and (b) building a version of Instagram that allows people under the age of 13 to safely use Instagram for the first time.”
The aforementioned Facebook Messenger Kids was the last major product the social media giant pitched to children under the age of 13.
The company claimed that allowing such young children to use digital messaging platforms would have no negative consequences. According to Wired, many of the “experts” consulted by Facebook when creating the kids’ messaging app were also paid by the social media giant.
Facebook seems to be taking the same approach this time around, though it’s unclear whether the experts it consults will have similar conflicts of interest.
According to CNBC, a Facebook spokesperson said, “We agree that any experience we develop must prioritize their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it,” adding, “We also look forward to working with legislators and regulators, including the nation’s attorneys general.”
Later, Facebook issued an amended statement, claiming that the product was targeted at children who were already online: “We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing. We are developing these experiences in consultation with experts in child development, child safety and mental health, and privacy advocates.”
It’s worth remembering that the app isn’t yet live, so Facebook hasn’t announced any concrete steps to give parents more influence over their children’s social media use or to ensure that predators won’t use the platform to groom and assault their children.
They’ve done just one thing: assured parents, “Don’t worry, we’ve got it covered.”
That won’t cut it, particularly considering the prevalence of predatory behavior and cyberbullying on other similar platforms.
This isn’t your child watching Starsky and Hutch bust a drug dealer. This is an entirely new game, and it’s not one that social media firms stand to benefit from.
The risks are just too high, and the profit is negligible.
Sources: westernjournal.com, apnews.com, mass.gov, buzzfeed.com, wired.com, cnbc.com, commonsensemedia.com