Head of Instagram Adam Mosseri confirms that a version of the popular photo-sharing app for children under 13 is in the works. The Facebook-owned company knows a lot of kids want to use Instagram, Mosseri said, but there isn’t a “detailed plan yet.”
“But part of the solution is to create a version of Instagram for young people or kids where parents have transparency or control,” Mosseri said. “It’s one of the things we’re exploring.” Instagram’s current policy bars children under 13 from the platform.
“Increasingly kids are asking their parents if they can join apps that help them keep up with their friends,” Joe Osborne, a Facebook spokesperson, said in an email to The Verge. “Right now there aren’t many options for parents, so we’re working on building additional products, as we did with Messenger Kids, that are suitable for kids, managed by parents. We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more.”
Targeting online products at children under 13 is fraught with privacy concerns and legal issues. In September 2019, the Federal Trade Commission fined Google $170 million for tracking children’s viewing histories to serve ads to them on YouTube, a violation of the Children’s Online Privacy Protection Act (COPPA). TikTok precursor Musical.ly was fined $5.7 million for violating COPPA in February 2019.
Facebook launched an ad-free version of its Messenger chat platform for kids in 2017, intended for children between the ages of 6 and 12. Children’s health advocates criticized it as harmful and urged CEO Mark Zuckerberg to discontinue it. Then in 2019, a bug in Messenger Kids allowed children to join groups with strangers, leaving thousands of kids in chats with unauthorized users. Facebook quietly closed those unauthorized chats, which it said affected “a small number” of users.