With a version of Instagram for kids, Facebook Inc. says it can build a safer social-media haven for users under 13.
Yet the company faces hurdles from lawmakers who want the tech giant to keep its distance from kids — and the fact that plenty of children under 13 are already using the regular Instagram app.
The issue was brought into sharp relief this week by a letter from Democratic lawmakers that raises fresh questions about Facebook’s recently announced plans for an Instagram-branded product designed for children under 13. Users under 13 are currently prohibited from joining any of the company’s platforms.
The letter follows a congressional hearing last month in which lawmakers from both parties bombarded Facebook chief executive Mark Zuckerberg with questions about the plan, alleging that heavy usage of the company’s platforms is unhealthy for young people.
“Facebook and Instagram are in the crosshairs, and they should be,” said Jim Steyer, founder of Common Sense Media, an advocacy group promoting safe technology use for families.
He cited research findings that heavy use of social media can undermine teenagers’ mental health in some circumstances. “Given the privacy and safety track record of Facebook and Instagram, would you let them be your kids’ babysitter?”
Mr. Zuckerberg said at the hearing that the company’s products should be used with supervision, and can help young people maintain connections with friends.
He acknowledged that plenty of children lie about their age in order to sign up for Instagram.
Instagram head Adam Mosseri said young users lying about their ages are a problem across the industry, and the company wants to provide solutions.
“We know people under 13 want to use the internet, want to use Instagram, and we think one of the more responsible things we can do is build a product where parents consent and have control,” he said in an interview, emphasizing that the new service is still early in its development and has no set launch date.
Instagram’s version for kids, if launched, would likely give parents tools to monitor their children’s social media accounts, rather than place filters on what content young users can see and how they can interact on the platform, Mr. Mosseri said.
A kids’ product would be entirely free of ads, he added.
“The product has to be compelling enough that it’s not going to give people a reason to lie about their age,” Mr. Mosseri said.
Federal law forbids sites from collecting personal information on users under 13 without parental consent. Google’s YouTube and ByteDance Ltd.’s TikTok have paid penalties in recent years for allegedly collecting such data.
YouTube in 2015 launched an app called YouTube Kids, which doesn’t include personalized ads, though the children’s app is far less popular than the main YouTube platform.
The company last year announced new measures to limit the collection of data on videos designed for children, and banned pop-ups that suggest more content to watch.
Social media usage is common for kids under 13, research finds. In that demographic, roughly 30% use TikTok, while 22% use Snap Inc.’s Snapchat and 11% use Instagram, according to a survey last summer by investment research firm Piper Sandler.
Social-media companies compete fiercely for young people in general, since they are potentially longtime users, as well as tastemakers who help platforms remain relevant in pop culture. Advertisers also tend to prioritize spending on platforms with younger users.
An ongoing class-action lawsuit against Facebook alleges that the company knew for years that it was giving advertisers inflated user counts for the 18-to-34-year-old demographic, with counts that in some regions exceeded the actual population of that age group, partly as a result of users lying about their ages.
“When the self-reporting data was so different than the census, we knew we had to address it,” chief operating officer Sheryl Sandberg wrote in a 2017 email to colleagues, which was included in court filings in the lawsuit. “I believe we still do.”
Facebook is fighting the lawsuit and says that no advertisers were harmed by the alleged overstatement of its young-adult user base.
Facebook faces increasing competition for younger users, even as lawmakers threaten harsher, though mostly unspecified, limits on the industry, in part because of what they say are the negative effects of social media on children.
At the hearing last month, Rep. Cathy McMorris Rodgers (R., Wash.) accused Facebook and Instagram — along with Twitter and YouTube — of harming the mental health of young users, including her own children.
“I’ve monitored where your algorithms lead them,” she said, citing research associating heavy screen usage with anxiety, depression and self-harm. “It’s frightening.”
In the letter sent Monday, Democratic lawmakers voiced concerns about a potential Instagram kids product, calling children a “uniquely vulnerable population online.”
The letter was signed by Sens. Edward Markey (D., Mass.) and Richard Blumenthal (D., Conn.), and Reps. Lori Trahan (D., Mass.) and Kathy Castor (D., Fla.).
The letter requested assurances from the company about how it would protect young users not just from safety threats and sensitive content but also from subtler potential harms involving diminished self-esteem and addictive use.
Facebook responded by citing 2017 research from the National Parent Teacher Association suggesting social media was already nearly ubiquitous among preteens, with 81% of parents reporting their child began using social media between the ages of 8 and 13.
“If we can encourage kids to use an experience that is age-appropriate and managed by parents, we think that’s far better than kids using apps that weren’t designed for them,” a spokeswoman said.
Mr. Mosseri said the company is investing heavily in algorithmic tools designed to detect underage users. Instagram recently announced new controls meant to prevent adults from contacting teens unless they have a pre-existing relationship.