Trust & Safety Manager at Roblox (ex - Google and Facebook) describes working in Trust & Safety
In this post, we catch up with Jessica Dees, Senior Manager at Roblox and former Trust & Safety Manager at Google and Facebook, on what it’s like to work in this space. What kind of problems do you work on? What kind of challenges do you face? How do you assess a good opportunity when recruiting? Find out about all of this in our discussion with Jessica below.
LED: Can you share a summary of your background and your journey leading up to working in Trust and Safety?
Jessica: I totally fell into Tech and Trust & Safety (T&S) by accident. I entered college as a theatre performance major, and I had grand plans of opening up my own theatre company one day for underprivileged youth. I paid my own way through college, so I ended up having to take on two jobs to pay for tuition and bills, and one of those jobs just so happened to be at an Apple store. I transferred to a few different stores, and I moved up the retail ladder doing customer training, eventually finding myself doing market training, and that paved the way for me to join a startup doing Learning & Development (L&D) Program Management. I did L&D Program Management up until Google, when I switched to Trust & Safety.
I didn't know T&S existed as a career until I happened to see a Child Safety role and, on a whim, applied for it. The role really spoke to my social justice side - at the time, I was volunteering as a crisis counselor, and I wanted to make something like that my full-time gig. I had this wonderful opportunity present itself, so I jumped on the T&S bandwagon and haven't looked back since. I definitely don't miss working in L&D :)
LED: Tell us more about what prompted you to apply for the Child Safety role at Google.
Jessica: I had been working in L&D for a lot of different companies by that time - SurveyMonkey, Google, Uber, etc. After doing a manager training for the umpteenth time, I was annoyed and feeling adrift. I wasn't sure where I was going, and I wasn't enjoying what I was doing either. At the time, I was also part of a program to help women in Peru and women in tech. It helped me realize that I wanted to be in social justice, and that what I was doing wasn't going to fulfill me.
So I applied, and I was hire number two on the team, joining as a program manager. Over time, I became a manager.
LED: What does someone in Trust and Safety work on?
Jessica: In Trust & Safety, the core job is keeping good users safe from bad users and from disturbing content. It sounds really simple, but in practice, it's like playing a game of whack-a-mole with some seriously bad folks - think pedophiles or neo-Nazis - and some really uncomfortable topics, like hate speech or suicidal content.
When describing T&S, you'll sometimes hear the sanitized version such as "anti-abuse strategies", but that doesn't capture the actual work. Engineers, Ops, Product Managers, Program Managers, Data Scientists, Analysts, etc. are all working towards making the internet safer for everyone. Where I sit is a hybrid between Operations, Program development, and Research. Since T&S is so nascent, you have an opportunity to wear a lot of different hats.
There are a few different facets to this:
Keeping bad content off
Keeping bad users away from good users
Stopping abuse in real time
A little more on point number 3, i.e., stopping abuse in real time - I worked in Operations, so my team and I were doing this in real time. You're trying to stop the spread of child pornography, domestic abuse, or things like that as they happen. And then, once the content is off, the problem is only partially solved - for example, that person may still have access to children and might be abusing them. So we have to report them, which we do by working with NCMEC (the National Center for Missing and Exploited Children). Of course, this is all done in a way so as not to violate anyone's privacy.
In terms of roles - a Product Manager would work more on products that can flag suspicious content at scale. They wouldn't do any of the investigative work of stopping real-time abuse, or any monitoring - that work falls to the Operations team.
Another part of the job is setting up programs to define the strategy for handling this type of content. For example, at Google I set up a program called "Kids and Families" to define the strategy for protecting kids against abusers on new products Google launched. I ran "future-proofing" programs, like finding ways to prevent online grooming of kids for sexual abuse, or rooting out abusers in novel ways. I also served as our Ops external partner for nonprofits like the National Center for Missing and Exploited Children and the Technology Coalition, and organized co-sponsored events like a mental health awareness summit. That summit was one of my favorite things I've ever done.
LED: What’s a typical day like?
Jessica: Here's an example of something that tends to happen - I'd get an alert from the legal team needing information on an account within the next two hours because of an imminent threat. You have to take action immediately.
Removing beheading videos, child abuse videos, and the like is par for the course - there's so much of this content, and we have to remove it on an ongoing basis and report it to the right agencies where legal recourse is available. We usually refer cases to a clearinghouse or to law enforcement agencies.
There are other kinds of content that fall under Trust & Safety too: terrorism, child safety, abuse, election-related content such as civil unrest, and identity theft. There's really a lot of terrible content out there.
A quick note here: looking at this type of content, day in and day out, is not easy. A good company usually won't have you look at multiple types of violent extremist content in one go - if one group handles child safety, then terrorism content will be handed to a different group, not the same one. Because the work is so psychologically damaging, managing the mental health of employees in these roles is extremely important.
LED: What do you think are the core skills needed to succeed in this job?
Jessica: In my opinion, the number one predictor of success for T&S is the ability to compartmentalize your job from the rest of your life. I called my team "Digital First Responders" because if you're close to the work, the content you look at can be absolutely gruesome. There really is no sugarcoating it, child sexual abuse material, terrorist beheadings, people committing suicide - these aren't things average people look at. It definitely takes a certain kind of person to do the work, but it is so, so worth it when you hear about a child you saved in the news.
This is a very nascent field that has come up with the rise of social media and user-generated content. If it becomes too much, you can rotate off to give your brain a break. Usually, though, that doesn't work, because the people doing this work really care about it - you know this stuff is still happening while you're taking a break. Moving to a different type of content, say from child safety to beheadings, doesn't help either. We are routinely told how our tips have helped make arrests and have a real impact. After working on something like this, I just cannot imagine working on how to get more ad clicks.
LED: What aspects do you really like about this job?
Jessica: You make a real impact, and not just for a company, but for real people in imminent danger. T&S is one of the few non-compete areas in tech, and it just inherently feels good. It's difficult for other areas of a business to compete with missions like "keeping kids safe online". I love seeing the articles where I know I actively helped bring someone to justice, and I love being in the trenches with people who care about the bigger societal picture.
LED: What do you not like about this job?
Jessica: The material for sure. You also can't really talk about what you do at a cocktail party - either people get uncomfortable or they get a little too comfortable, and ask all sorts of inappropriate or awkward questions.
LED: How do you get through?
Jessica: We use gallows humor. I came from HR, so we joke about HR violations and the like. And we're in the trenches together. We once went to a rage room for an offsite - we broke VCRs and glass and just let out the pent-up rage. We vented and smashed things for an hour.
LED: If you were to help someone evaluate whether they would enjoy this job or not - what would you tell them?
Jessica: You need to have a lot of passion for the work. This isn't a rest-and-vest type of job, there are real lives on the line. Like any other first responder job, you need to want to help, and feel compelled to protect vulnerable people. A sense of justice and righteousness also helps.
Apart from your personal mental capacity, you really need a solid foundation and a strong support network. That will help you cope.
When hiring, we show candidates non-illegal content, such as examples of chats freely available online, and see how they handle it. We filter out over 90% of applicants at this stage.
LED: How can one identify a good Trust and Safety role from the outside? What should you look for?
Jessica: There are critical things to consider, like is the role a full time position at a reputable company? Does the posting clearly articulate the scope of work required with a "may have to look at sensitive content" warning? Does it specify if on call is needed? If it's written in vague language, I wouldn't trust it.
Unfortunately, there's a lot of bait and switch with these jobs. It's a difficult job, there's a lot of churn, and people leave, so it's easier for companies to make this a contract role. However, if the job opening is a contract role, don't take it. You don't get the benefits that full-time employees have, like mental health coverage, which is critical for this work - everyone in this line of work should be seeing a therapist. You may not even get health insurance in a contract position, and you likely won't get stock, even though you're doing really important work. A contract position is also a signal that the company doesn't take this work seriously.
If these things aren't mentioned in the job description, make sure you ask the recruiter.
LED: Are there any important questions candidates should ask during interviews or informationals?
Jessica: Culture is critical for T&S teams. I've seen many teams fail because bosses wouldn't look at content with them, or created internal stigmas around their roles. So it's very important to ask questions like: How does the team think about mental health? How does the company approach safety? Privacy? From a technical standpoint, probe about what they think the future of abuse looks like - it's a weird question, but I've qualified myself out of roles before when leaders were unable to answer that question.
I had a boss who told me that he respected my role but couldn't look at the content himself. He was there to manage my career, but wasn't willing to get into the weeds with me. To me, that signals that you think you're special and don't have to look at the content, which is not good for team culture. A manager needs to be in the trenches with their team.
LED: Do you recommend any resources?
Jessica: Yes! If you're going into Child Safety, do your homework with Thorn.org or the National Center for Missing and Exploited Children. Read some of their studies, and look at the work they're doing. For other T&S areas, read up on Section 230, and look up articles on content moderation.
LED: Any parting advice?
Jessica: Trust and Safety is a uniquely 21st century space, so it's important to go in with eyes wide open. User generated content is still the wild west for most tech companies, so be ready to help build the foundation of the industry, and know that some nights you'll just need to watch cute cat videos with a tub of ice cream after work.
If you have any feedback to share, or if you have any questions for Jessica, drop us an email at [email protected] or tweet at us @LED_Curator