Harassment, mental health… What exactly are social networks doing to protect us from all of that?
Brut. sat down with Antigone Davis, Vice President and Global Head of Safety at Meta (Facebook, Instagram, WhatsApp…), to hear what goes on behind the scenes of her job.
Transcription
00:00 Take a teen: you're at school, people are making fun of your outfit all day long.
00:05 You come home, someone hits you up on one of our apps, and they say, "Nice outfit."
00:10 We can't know that that's bullying, but you do, and it hurts.
00:15 So we have a filter, called Hidden Words,
00:18 where you can put in words that are particular to what someone might be doing
00:22 that might be harmful, or that you might feel sensitive about,
00:26 and those comments will be filtered out of your experience.
00:29 So what we want to do is create the right rules, but give you the tools
00:33 so you can personalize that experience and make it a fun, positive, and safe place for you.
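Meta doesn't publish how Hidden Words is implemented, but the idea described here (screening incoming comments against a word list the user chooses) can be sketched minimally in Python; the function name and sample words below are hypothetical:

```python
import re

def build_hidden_words_filter(hidden_words: list[str]):
    """Compile a user's hidden-word list into a reusable comment check.

    Hypothetical sketch: matches whole words, case-insensitively, so a
    comment like "nice outfit" is caught once the user hides "outfit".
    """
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(w) for w in hidden_words) + r")\b",
        re.IGNORECASE,
    )

    def is_hidden(comment: str) -> bool:
        return bool(pattern.search(comment))

    return is_hidden

# Comments matching the list are simply dropped from the user's view.
is_hidden = build_hidden_words_filter(["outfit", "loser"])
comments = ["nice outfit", "great photo!"]
print([c for c in comments if not is_hidden(c)])  # ['great photo!']
```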
00:38 How does it work behind the scenes?
00:40 Beyond AI, are there people, humans, who sit behind screens
00:45 and sift through these reports, or has it all become automated?
00:49 No, it's a combination of both artificial intelligence and technology,
00:55 and human review, to moderate the platform and create a safe space.
01:00 Some content is obviously going to violate our policies;
01:06 some content may not be as obvious.
01:09 Artificial intelligence helps us find the content that clearly violates our policies
01:14 and remove it, but there are instances where it's not clear,
01:17 where the only way to know for sure is for a human to review it and take a little more time.
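As a rough illustration of that split, with confident model scores handled automatically and an uncertain middle band escalated to people, here is a minimal routing sketch; the thresholds and names are invented, not Meta's actual pipeline:

```python
from dataclasses import dataclass

# Hypothetical thresholds; Meta does not publish its real values.
AUTO_REMOVE_THRESHOLD = 0.95   # model is confident the content violates policy
HUMAN_REVIEW_THRESHOLD = 0.60  # uncertain band: escalate to a human moderator

@dataclass
class Decision:
    action: str   # "remove", "human_review", or "allow"
    score: float

def route_content(violation_score: float) -> Decision:
    """Route one piece of content by its estimated probability of violating policy."""
    if violation_score >= AUTO_REMOVE_THRESHOLD:
        return Decision("remove", violation_score)        # clear-cut: removed at scale by AI
    if violation_score >= HUMAN_REVIEW_THRESHOLD:
        return Decision("human_review", violation_score)  # not obvious: a person takes more time
    return Decision("allow", violation_score)

print(route_content(0.99))  # Decision(action='remove', score=0.99)
print(route_content(0.70))  # Decision(action='human_review', score=0.7)
```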
01:22 Correct me if I'm wrong, but I think sometimes we can have the impression
01:26 that Meta and other platforms act in response to things that are happening
01:31 rather than in a proactive way.
01:34 Would you agree or disagree, and how is Meta trying to proactively tackle any issues
01:41 that users might have on its platforms?
01:44 What I would say is that we are constantly looking at our platform
01:48 to see how we can create a safe and positive experience.
01:52 We also take feedback and respond to people's experience,
01:57 and their experiences are evolving as the apps themselves are evolving,
02:01 and so, as a result, we evolve.
02:04 Sometimes that can take time, which is why I think people sometimes think
02:07 that we may be being reactive.
02:10 When we build a tool, we do a lot to build that tool.
02:14 We talk to experts. We talk to users.
02:17 We may test a tool to see what makes the most sense or works the easiest.
02:21 How does the testing work?
02:23 I'll give you an example.
02:24 When we were building our parental supervision tools,
02:28 we sat down with parents and with teens, and we wanted to understand from parents
02:32 what things would be useful to them.
02:35 And we heard things like: I want to be able to manage my teen's time.
02:39 I want to have a sense of who they're engaging with, who's following them.
02:44 So that was the guidance that we heard from them.
02:46 We also talked to experts about how to build these in a way that teens
02:50 are going to want to use them and not push them away.
02:54 How do we create the right balance?
02:56 Taking all of that information in, we then started developing our tools.
03:01 And what we often do is launch our tools to a smaller group
03:05 of the population, test them, see how people respond,
03:08 and maybe make some changes to ensure that we get those tools
03:13 to a place where people find them valuable.
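Launching a tool to a smaller slice of the population before widening it is a standard staged rollout. A minimal sketch of one common way to do the split, hashing a user id together with the feature name so assignment is stable and independent per feature, might look like this (names and percentages are illustrative, not Meta's system):

```python
import hashlib

def in_test_cohort(user_id: str, feature: str, rollout_percent: float) -> bool:
    """Deterministically assign a user to a feature's test group.

    Hashing "feature:user_id" gives each feature its own stable split,
    so the same user always gets the same answer for the same feature.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64  # uniform in [0, 1)
    return bucket < rollout_percent / 100

# e.g. expose hypothetical "parental_supervision" tools to 5% of users first,
# gather feedback, then raise the percentage.
print(in_test_cohort("user-123", "parental_supervision", 5.0))
```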
03:15 How can people find out about all of these tools?
03:19 Where, concretely, can they go on the platforms to find out what they can use
03:24 to limit their time or block out certain kinds of content?
03:28 If you can sum it up in two sentences,
03:31 where can they go to find the tools they need to stay safe?
03:36 First and foremost, go to your settings.
03:38 You're going to find a lot there.
03:41 The second place that I would say to go is the Family Center.
03:45 Particularly for parents, you will find tips there.
03:48 You'll find explanations for all our tools there.
03:51 So I would say: go to your settings, go to our Family Center,
03:56 and you can always go to our Safety Center.
03:59 Those three should really give you a full picture of the things
04:02 that are out there.
04:03 How, then, do you explain that, despite all of these tools,
04:06 there are still so many people who feel like they are being harassed,
04:11 or can be harassed, on Meta's platforms?
04:14 The Internet is a very big place, right?
04:18 And just like in the offline world, you are going to have people
04:22 who are going to try to do bad things on the platform.
04:26 There's no chance, just like in the offline world,
04:32 that we're going to get to a place where there's zero opportunity
04:35 for somebody to do something that would bother
04:39 or harm another person.
04:42 That said, we are using artificial intelligence to find things at scale.
04:47 We're using human review to find things that are specific.
04:50 We're giving people controls so they can personalize their experience
04:53 where we may not have the information.
04:56 We're fundamentally committed to creating that safe and positive experience,
05:01 and to learning from our users where we can be doing better and evolving.
05:05 Many studies have shown that spending too much time online
05:08 can have a negative impact on our mental health, on self-esteem, etc.
05:13 So would Meta be in favor of users spending less time on social networks?
05:20 What is in it for Meta to encourage users to spend less time on its platforms?
05:25 Yeah, it's one of the, I think, bigger misconceptions
05:28 about what we want for our users.
05:32 We want people to be having a positive experience when they're on the app,
05:39 but when they walk away from the app, to feel like they had a positive experience,
05:42 that they haven't spent too much time on the app,
05:44 to be able to manage their time.
05:46 We have different tools for that.
05:48 Take a Break, for example: this is something we'll use
05:51 where, if someone's been on for a while, a teen scrolling,
05:54 we'll say, hey, would you like to take a break?
05:57 We have another tool called Quiet Mode,
05:59 to really encourage teens to turn it off if it's nighttime.
06:07 It allows them to send an auto-reply to people,
06:10 letting them know that they're taking a break
06:12 and they are not on their device,
06:15 but enables them, when they go back, to see who may have been in touch,
06:19 so they don't have this fear of missing out on what was happening online.
06:25 We've done that knowing that it would reduce time,
06:28 and in many cases it has reduced the amount of time
06:30 that people are spending on the platform.
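The Quiet Mode behavior described above (an auto-reply during off hours, with messages held so nothing is missed) reduces to a small amount of logic. A hypothetical sketch; the schedule and messages are invented for illustration:

```python
from datetime import datetime, time

# Assumed example window; in the real feature the teen picks the schedule.
QUIET_START = time(21, 0)  # 9 pm
QUIET_END = time(7, 0)     # 7 am

def in_quiet_hours(now: datetime) -> bool:
    t = now.time()
    # The window spans midnight, so it's "after start OR before end".
    return t >= QUIET_START or t < QUIET_END

def handle_message(sender: str, text: str, now: datetime, inbox: list) -> str | None:
    """Store every message; auto-reply only during quiet hours."""
    inbox.append((sender, text))  # held so the teen can catch up later
    if in_quiet_hours(now):
        return "I'm taking a break and not on my device right now."
    return None

inbox: list = []
print(handle_message("friend", "you up?", datetime(2024, 1, 1, 23, 30), inbox))
print(inbox)  # the message is waiting when Quiet Mode ends, so no fear of missing out
```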
06:32 The idea is not to keep you here for as much time as is humanly possible;
06:36 it's actually: have a positive time, have a positive relationship with our app.
06:41 Enjoy the time that you're on, build the community,
06:43 discover the things that interest you, but walk away, go explore,
06:47 take that out into the offline world too.
06:50 We really want to give people that opportunity to manage,
06:53 and feel good about, their time online.