How do tech platforms develop clear content policies while balancing user freedom, regulatory requirements, and cultural contexts? What does it take to scale trust and safety efforts for billions of users in a rapidly changing digital landscape? Navigating these challenges requires foresight, transparency, and a deep understanding of user behavior.
In today’s episode of Click to Trust, we are joined by Cathryn Weems, Head of Content Policy at Character.AI, to take on the intricacies of building Trust and Safety policies. Cathryn shares her extensive experience shaping content policies at some of the world’s largest tech platforms, from crafting transparency reports to addressing complex government takedown requests. She offers unique insights into balancing global scalability with localized approaches and why clear, enforceable transparency reports are key to fostering trust.
In this episode, you’ll learn:
Jump into the conversation:
(00:00) Meet Cathryn Weems
(01:10) The evolution of Trust & Safety as a career path
(05:30) Tackling the complexities of content moderation at scale
(10:15) Crafting content policies for gray areas and new challenges
(14:40) Transparency reporting: Building trust through accountability
(20:05) Addressing government takedown requests and censorship concerns
(25:25) Balancing cultural context and global scalability in policy enforcement
(30:10) The impact of AI on content moderation and policy enforcement
(35:45) Cathryn’s journey as a female leader in Trust & Safety
(40:30) Fostering trust and improving safety on digital platforms
18 episodes