Product Policy Manager, Youth Safety and Wellbeing, EMEA - Trust & Safety

TikTok
London
10 months ago
Applications closed

TikTok is the leading destination for short-form mobile video. At TikTok, our mission is to inspire creativity and bring joy. TikTok's global headquarters are in Los Angeles and Singapore, and its offices include New York, London, Dublin, Paris, Berlin, Dubai, Jakarta, Seoul, and Tokyo.

Why Join Us
Creation is the core of TikTok's purpose. Our platform is built to help imaginations thrive. This is doubly true of the teams that make TikTok possible. Together, we inspire creativity and bring joy - a mission we all believe in and aim towards achieving every day. To us, every challenge, no matter how difficult, is an opportunity to learn, to innovate, and to grow as one team. Status quo? Never. Courage? Always. At TikTok, we create together and grow together. That's how we drive impact - for ourselves, our company, and the communities we serve. Join us.

Our Trust & Safety team's commitment is to keep our online community safe. We have invested heavily in human and machine-based moderation to remove harmful content quickly, often before it reaches our general community. The Trust & Safety Policy team develops and reviews our community policies to promote a positive and safe environment for all of our users and content creators to enjoy and express themselves.

As a Global Youth Safety & Wellbeing Policy Manager, you will contribute to TikTok's global Youth Safety & Wellbeing policy strategy, designing and consulting on the enforcement of TikTok's Trust & Safety policies that capture potential risks to youth, including but not limited to physical harm, psychological harm, developmental harm, and risk of exploitation. You will advocate for youth safety and wellbeing policies internally and externally through policy, user education, and product design. Your core responsibilities will be to develop and write policy standards that honor youth and adolescent wellbeing, which will include considering the complexity with which different global regions perceive appropriate youth entertainment experiences. You will adjust policies accordingly in collaboration with other issue-area subject matter experts. You will also work with Youth Safety Product counterparts to ensure TikTok products are risk-assessed for youth safety.

This role requires generalized knowledge of core themes relevant to technology policy, Trust & Safety, and the adolescent experience: adolescent identity development, youth gender and sexuality, peer and family relationships, teen risk-taking behavior, etc. This role reports to the Global Lead for Youth Safety & Wellbeing and can be based out of any of TikTok's UK or EU T&S offices.

It is possible that this role will be exposed to harmful content as part of the core role, as part of a project, in response to escalation requests, or by chance. This may occur in the form of images, video, and text related to everyday life, but it can also include (but is not limited to) bullying; hate speech; child safety; depictions of harm to self and others; and harm to animals.

What will I be doing?
- Contribute to the development of product policies that are relevant to TikTok's global community, with a focus on protecting youth from harmful content and potential exploitation
- Stay abreast of emerging trends, research findings, and legislative developments relevant to youth safety and wellbeing, and integrate this knowledge into policy development and program design
- Collaborate closely with your Issue Policy Manager peers to identify when policy themes affect youth and how best to capture the nuanced needs of youth compared to adults in policy and product design
- Engage in youth safety and wellbeing risk assessment processes related to upcoming products and features
- Support youth-specific user education initiatives, such as promoting digital literacy and awareness among youth populations on issues spanning mental health to civic integrity
- Consult with regional teams on high-profile escalations and provide information on how, if at all, content violated our community guidelines or policy standards
- Engage with Public Policy and Communications teams as needed, sometimes co-leading cross-functional projects that help translate the team's work and support our broader narrative
- Work with our Outreach & Partnerships team to identify and receive guidance from external Youth Safety & Wellbeing and Adolescent Wellness experts, including youth themselves, civil society groups, and academic institutions, to enhance our understanding of youth needs and concerns online

What should I bring with me?
- 4+ years working in Trust & Safety, Technology Policy/Advocacy, Child Advocacy, or another field with a connection to themes of youth wellbeing online
- Demonstrated success working in a fast-paced technology company or advocacy organization, with effective project management, complex cross-functional project collaboration, and strategic execution
- Proven experience developing and refining policies or strategies designed to scale across global populations
- Familiarity with technology policy, international child protection laws, and child rights advocacy
- Excellent verbal and written communication skills and an ability to educate a wide variety of business stakeholders, including Product Policy, Engineering, Public Policy, Legal, Communications, and Data Science, on the nuances of Youth Safety & Wellbeing topics
- Ability to work in a high-tempo environment and to adapt and respond to the day-to-day challenges of the role
- Resilience and a commitment to self-care to manage the emotional demands of the role

Preferred qualifications:
- Academic research and/or advocacy background focused on youth online safety or youth mental health
- Experience representing organizations in external forums and media engagements
- Experience working in non-western cultures, or knowledge of variations in culture and values across diverse populations
- Understanding of Trust & Safety positioning in the entertainment media technology sector, with comfort learning internal tools and product workflows

Trust & Safety recognises that keeping our platform safe for TikTok communities is no ordinary job: it can be both rewarding and, for some, psychologically demanding and emotionally taxing. This is why we share the potential hazards, risks, and implications of this unique line of work from the start, so our candidates are well informed before joining. We are committed to the wellbeing of all our employees and promise to provide comprehensive, evidence-based programs to promote and support physical and mental wellbeing throughout each employee's journey with us. We believe that wellbeing is a relationship in which everyone has a part to play, so we work in collaboration and consultation with our employees and across our functions to ensure a truly person-centred, innovative, and integrated approach.
