CEOs of Meta, TikTok, X and other social media companies testify in Senate hearing
Mark Zuckerberg's opening statement
Chairman Durbin, Ranking Member Graham, and members of the committee:
Every day, teens and young people do amazing things on our services. They use our apps to create new things, express themselves, explore the world around them, and feel more connected to the people they care about. Overall, teens tell us that this is a positive part of their lives. But some face challenges online, so we work hard to provide parents and teens support and controls to reduce potential harms. Being a parent is one of the hardest jobs in the world. Technology gives us new ways to communicate with our kids and feel connected to their lives, but it can also make parenting more complicated. It's important to me that our services are positive for everyone who uses them. We are on the side of parents everywhere working hard to raise their kids. Over the last eight years, we've built more than 30 different tools, resources, and features so that parents can set time limits for their teens using our apps, see who they're following, or see if they report someone for bullying. For teens, we've added nudges to remind them when they've been using Instagram for a while, or if it's getting late and they should go to sleep, as well as ways to hide words or people without those people finding out. We put special restrictions on teen accounts on Instagram by default. Accounts for under-16s are set to private, have the most restrictive content settings, and can't be messaged by adults they don't follow or people they aren't connected to.
With so much of our lives spent on mobile devices and social media, it's important to look into the effects on teen mental health and well-being. I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. A recent National Academies of Sciences report evaluated over 300 studies and found that the research, quote, "did not support the conclusion that social media causes changes in adolescent mental health at the population level," end quote. It also suggested that social media can provide significant positive benefits when young people use it to express themselves, explore, and connect with others. Still, we're going to continue to monitor the research and use it to inform our roadmap.

Keeping young people safe online has been a challenge since the internet began, and as criminals evolve their tactics, we have to evolve our defenses too. We work closely with law enforcement to find bad actors and help bring them to justice. But the difficult reality is that no matter how much we invest or how effective our tools are, there's always more to learn and more improvements to make. We remain ready to work with members of this committee, industry, and parents to make the internet safer for everyone. I'm proud of the work that our teams do to improve online child safety on our services and across the entire internet. We have around 40,000 people overall working on safety and security, and we've invested more than $20 billion in this since 2016, including around $5 billion in the last year alone. We have many teams dedicated to child safety and teen well-being, and we lead the industry in a lot of the areas we're discussing today. We build technology to tackle the worst online risks and share it to help our whole industry get better, like Project Lantern, which helps companies share data about people who break child safety rules. And we're founding members of Take It Down, a platform that helps young people prevent their nude images from being spread online. We also go beyond legal requirements and use sophisticated technology to proactively discover abusive material. As a result, we find and report more inappropriate content than anyone else in the industry. As the National Center for Missing and Exploited Children put it this week, Meta goes, quote, "above and beyond to make sure that there are no portions of their network where this type of activity occurs," end quote. I hope we can have a substantive discussion today that drives improvements across the industry, including legislation that delivers what parents say they want.
A clear system for age verification, and control over what apps their kids are using. Three out of four parents want app store age verification, and four out of five want parental approval whenever teens download apps. We support this. Parents should have the final say on what apps are appropriate for their children, and they shouldn't have to upload their ID every time. That's what app stores are for. We also support setting industry standards on age-appropriate content and limiting signals for advertising to teens to age and location, not behavior.
At the end of the day, we want everyone who uses our services to have safe and positive experiences. Before I wrap up, I want to recognize the families who are here today who have lost a loved one or lived through some terrible things that no family should have to endure. These issues are important for every parent and every platform. I'm committed to continuing to work in these areas, and I hope we can make progress today.