
Instagram for Kids Isn’t a Bad Idea. It’s a Terrible Idea

I’m going on the record: “Instagram for Kids” might be the worst idea I’ve ever heard, and here’s why. A platform like that will have consequences, and they’re easy to anticipate. Facebook, as a company, makes very intentional decisions, and the negative impacts of this one are no mystery; I’ve even listed them out below. Just as surely as Instagram for Kids is a bad idea, Facebook will spin its inevitable problems as “unintended consequences.” As a parent and the founder of a tech company dedicated to improving technology for our children, I see those consequences as obvious, not “unintended.”

A Platform Built on Comparison & Competition
Instagram is the poster child for striving for perfection. I’ve written before that social validation is the number one thing I worry about as a parent, especially in the context of rising depression and anxiety rates among youth. Unlike many, I don’t place the blame solely on smartphones or social media, and in general I’m pretty pro-technology. We could argue correlation versus causation all day, but I’ve seen enough anecdotal evidence to change the way I view technology and how I parent. The fact is that likes and followers matter to kids, and many measure their self-worth by them. We’ve seen social validation mechanisms like this show up in kids’ apps like PopJam, but Instagram takes them to a whole new level.

An Easy Target for Online Predators
This feels like stating the obvious, but a platform where children post pictures of themselves, their friends and their lives is ripe for online predation. Cases of children being groomed and abused via adult social media platforms are already well documented. A particularly disturbing documentary from Bark Technologies demonstrated how quickly it can happen. Their team collaborated with law enforcement to create fictional profiles of teens and tweens to see how quickly predators would reach out. Within one hour of posting a profile for a fake 15-year-old girl on Instagram, seven adult men had attempted to contact her. After nine days, 92 potential predators had made contact. The team then launched an 11-year-old persona, and within minutes, multiple would-be abusers reached out. The dangers are real.

Usually, platforms designed for kids need to verify that an adult is an adult, but “Instagram for Kids” poses the opposite problem: it could be difficult to prevent predators from posing as children to gain access and follow young users. The last thing I want to do is instill panic in parents, but the stats are grim: from January to September 2020, the National Center for Missing and Exploited Children received 30,236 reports of possible online enticement. And those are just the cases that were reported. I can’t help but feel that “Instagram for Kids” would be enticing to predators, and while that clearly isn’t Facebook’s intention, it’s certainly easy to predict.

The Trouble with Locking It Down
Kids want to feel empowered. If Facebook severely limits what kids can do on “Instagram for Kids,” they’ll hate it. Just look at YouTube: it’s the most-loved brand among kids, blowing YouTube Kids out of the water. Any parent will confirm that kids do NOT want to feel like babies. And adult platforms simply don’t retrofit easily to serve children. They’re built with very different goals, and it’s hard to secure them in a way that’s appropriate for kids. YouTube Kids surfaced videos with sexual content and suicide instructions. Facebook Messenger Kids had a design flaw that allowed kids to connect with strangers. Retrofitting just doesn’t work.

Kidfluencers Version 2.0
Unless Instagram for Kids is a closed platform, I think we’ll see a rise in kidfluencers. Perhaps the most notable example of this phenomenon is Ryan Kaji, the kid behind the highest-earning YouTube channel in 2018 and 2019. As a platform for kids under 13, “Instagram for Kids” might restrict ads, but how will it manage influencer deals? These contracts are made outside the platform, compensating individuals for featuring or mentioning certain products or services in videos, photos or comments. Even adults can’t always tell when they’re being sold to, and I suspect it will be all the more difficult for children. There’s precedent for this kind of thing: Walmart, Staples and Mattel have bankrolled endorsement deals for kids and tweens in the past. And while kids who star in television and movies are protected by legislation requiring that their earnings be placed in a trust, there is nothing protecting the income generated by kidfluencers, leaving kids potentially exposed to exploitation.

Should We Create Kids’ Cigarettes While We’re at It?
For all the reasons listed above, “Instagram for Kids” is a “hard no” for me. But you often hear people argue that kids are using the platform anyhow, so why not create a separate platform with a few more parental controls? To me, this argument is fundamentally flawed. After all, kids are often attracted to things that aren’t safe or healthy for them. Many are intrigued by smoking, drinking and drugs, but there’s a reason we don’t just lower the drinking and smoking age. As a society, we’ve agreed that some things are best left until kids grow up a bit—and I think Instagram is one of them.

The answer to children using Instagram isn’t to put up a few guardrails; that doesn’t address the root problem at all. The long-term effects of these platforms on youth are still largely unknown, but the anecdotal evidence suggests they do more harm than good. Using our kids as guinea pigs in a real-life experiment isn’t the answer. Kids are the fastest-growing group of internet users, and they have unique needs. They deserve to be protected, not exploited by Big Tech.