Why Has Clubhouse Been Plagued by Trust and Safety Issues? - a podcast by Patrick O'Keefe
from 2021-01-11T10:30
If you were building a community product, how would you start? Who would you choose as your first hire? What efforts would you make to ensure that the product is inclusive, safe, and well-moderated?
In this episode of Community Signal, we’re joined by Danielle Maveal for a deep-dive on audio-first platforms and, specifically, Clubhouse. While every platform and community has moderation issues to work through, Clubhouse has made headlines and done the rounds on Twitter for lax moderation that has brought anti-Semitism, misogyny, and misinformation to the “stage” on the app. Danielle and Patrick discuss how other audio-first platforms have approached trust and safety and what steps they would take to scale the teams, communities, and norms that power them. And while they acknowledge that not every conversation or connection that happens on the platform is bad, they offer a reminder that we can all do something to hold platforms accountable.
The members and the content that we allow on our platforms dictate the culture that permeates our communities. If there’s one thing that Clubhouse proves, it’s that there is still room for platforms that are built with safety and inclusivity in mind from day one.
Danielle and Patrick discuss:
- The current landscape of audio-first communities
- How they would scale a team and membership base of a community product
- Why community guidelines, enforcement, and tools matter from day one
Our Podcast is Made Possible By…
If you enjoy our show, please know that it’s only possible with the generous support of our sponsors: Vanilla, a one-stop shop for online community, and Localist, a platform to plan, promote, and measure events for your community.
Big Quotes
Community governance influences the culture of our communities (8:05): “I haven’t heard about anyone being removed from [Clubhouse]. I’m sure there have been some, but there’s no transparency. When is someone banned, or when are they muted? What if they are a repeat offender? What happens then? There’s no discussion of that happening publicly so [Clubhouse] feels like a brand new territory for these scammers to go and chat.” –@daniellexo
Not all audio-first platforms are terrible (14:55): “[At Clubhouse], I didn’t see a key community or trust and safety hire very early on to set the norms. I’m on two other audio apps with absolutely no problems. [Space and Quilt]. They’re smaller, they’re growing at a smaller rate, but they have key community hires there. The social norms that are being developed are just completely different.” –@daniellexo
How do we make people care about trust and safety issues? (17:29): “I’ll have a conversation with someone who is a very reasonable person, and we’ll talk about Clubhouse, and all the issues that have been raised. Then I’ll see them ask for an invite. My mind is blown. We are not learning from lessons of the past. How do we make people care?” –@daniellexo
The value that Clubhouse members bring to the stage (24:35): “[Clubhouse] creators who are now put into these moderator and facilitator roles; they’re going to make the founders and investors rich. [Clubhouse] invites are positioned like a gift, but really [users are] creating the experiences that draw in hundreds of listeners, thousands in some cases, and they get absolutely nothing for it.” –@daniellexo
Scaling your moderation efforts with your community is a must (34:30): “Being able to shut down a bad thing while it’s happening [on an audio-first platform] is important and if you can’t do that, maybe it’s a clue that you shouldn’t be doing this. … Taking care of a call two days later is not going to be a workable solution to stopping bad things.” –@patrickokeefe
Building a community for the long haul, not for short-term virality (38:00): “One thing I would focus on is really keeping the testing pool super small. Trusted people only, no invites, or very few and I would know who’s coming in the door. I wouldn’t be allowing more people in the door until I was ready to be responsive to live reports, so that I can come out very strongly against bad behavior.” –@patrickokeefe
About Danielle Maveal
Danielle Maveal has 15 years of experience launching, growing, and supporting brand and marketplace communities. She was on the founding team at Etsy and BarkBox and has since worked at Airbnb and Lyft. She’s now moving from building community teams and programs to building community products.
Related Links
- Sponsor: Localist, plan, promote, and measure events for your community
- Sponsor: Vanilla, a one-stop-shop for online community
- Clubhouse
- Clubhouse on Twitter
- Danielle Maveal’s website and Substack, Community Feelings
- Exclusive Social Media App ‘Clubhouse’ Had an Anti-Semitic Meltdown Over Yom Kippur, by Yair Rosenberg
- Jewish “Control” of the Federal Reserve: A Classic Antisemitic Myth, via ADL
- “You become hostage to their worldview”: The murky world of moderation on Clubhouse, a playground for the elite, via Vanity Fair
- Kia Richards, product compliance manager at Square, on Clubhouse’s disinformation problem
- Taylor Lorenz, journalist at the New York Times, documents Clubhouse’s moderation issues
- Other audio-first platforms referenced in this show include Airtime, Space, and Quilt
- Heather Merrick, director of customer support at Airtime, on Community Signal
- Tracy Chou, CEO at Block Party
- Wesley Faulkner on Community Signal
- Ustream
Transcript
Your Thoughts
If you have any thoughts on this episode that you’d like to share, please leave me a comment, send me an email or a tweet. If you enjoy the show, we would be so grateful if you spread the word and supported Community Signal on Patreon.