Former head of trust and safety at Twitter speaks on promises and perils of decentralized social media
- by dailyUW.com
- Oct 31, 2024
Yoel Roth speaking at the human centered design & engineering distinguished lecture in the HUB in Seattle, Wash., on Oct. 15, 2024.
Sarah Hemminger
Yoel Roth, vice president of trust and safety at Match Group, the parent company of Tinder and Hinge, and former head of trust and safety at Twitter (now X), spoke at a distinguished lecture series hosted by the UW department of human centered design & engineering (HCDE) on Oct. 23.
Roth's speech included information regarding federated social media sites such as Mastodon and Bluesky, and the evolution of content moderation on social media sites. Roth spoke in front of an audience of approximately 70 people, including UW HCDE professors Julie Kientz and Kate Starbird.
Roth began the lecture by discussing the early age of internet moderation, and how sites like LambdaMOO, a multi-user domain, were completely unprepared to deal with digital hate crimes. Roth also noted that even after the community decided to ban an offender, that decision had no actual bearing on the removal itself, which came from the moderators of the domain.
"Figuring out that the harms caused by online speech need to be mitigated, but that the mitigation is in the hands of somebody who isn't necessarily a participant in the community at all, in my view, showed us the shape of things to come," Roth said.
Roth went on to discuss the relative simplicity of early content moderation on social media.
"Facebook's policies for many years were 'if it makes you feel bad in your gut, then go ahead and take it down,'" Roth said. "This approach to content moderation, let's call it moderation by vibes, has some advantages: It's simple … it also generally is going to protect people from the worst of the worst things."
Roth then discussed the growth of content moderation and the continuing challenges posed by its subjective nature. He brought up the case of the Napalm Girl photo, a Pulitzer Prize-winning photo of children fleeing napalm attacks in Vietnam that also contains nudity and sensitive imagery.
"Inevitably, even as these processes have continued to mature … platforms continue to struggle with actually doing the work of content moderation," Roth said. "I used Google to search for the Napalm Girl photo … and found that not only had Google decided that the Pulitzer Prize-winning photo was sensitive warranting a safe-search blur, I also found that they seemed to be unable, even in the span of four search results, to consistently apply their policy."
These failures of content moderation by tech giants like Meta, Google, and X have fueled interest in decentralized content moderation, which, Roth noted, promises less moderation than those media giants impose. However, Roth emphasized that decentralized social media sites face glaring gaps in their own content moderation tooling, specifically the lack of a hash-and-match function to block known abusive material, and of a heuristics engine that bans users based on a set of rules rather than on a case-by-case basis.
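To illustrate the hash-and-match idea Roth referenced, here is a minimal sketch of the technique: an uploaded file is hashed and checked against a database of hashes of known abusive content before it is ever published. The function names and the hash database are hypothetical, and production systems (e.g., PhotoDNA or PDQ) use perceptual hashes that also catch near-duplicates, whereas the cryptographic hash below only matches exact copies.

```python
import hashlib

# Hypothetical database of hashes of known abusive images.
# Real deployments use perceptual hashes shared across platforms;
# SHA-256 here is purely illustrative and matches exact bytes only.
KNOWN_BAD_HASHES: set[str] = set()

def hash_and_match(content: bytes, bad_hashes: set[str]) -> bool:
    """Return True if the content's hash appears in the known-bad set."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in bad_hashes

def handle_upload(content: bytes, bad_hashes: set[str]) -> str:
    """Block matching uploads before publication; publish the rest."""
    if hash_and_match(content, bad_hashes):
        return "blocked"
    return "published"
```

The point of the design is that no human ever has to review a repeat upload: once a hash is in the shared database, every platform with access to it can block that content automatically at upload time.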