Practical insights for compliance and ethics professionals and commentary on the intersection of compliance and culture.

Round-up on compliance issues with online platforms: Reddit

This is the final post in a series of six on compliance issues with various online platforms.  The first post was about YouTube, the second about Facebook, the third about Instagram, and the fourth about Twitter.  Last week’s post covered Snapchat.  Today’s post is about Reddit.

Reddit is a web-based forum where users share links to news, photos, and videos, as well as engage in social media-style discussion threads.  Founded in 2005, Reddit has become one of the most visited websites in the world.  The platform is set up as a variety of user-generated community boards called “subreddits.”  These subreddits cover a wide variety of popular culture, current event, and special interest subjects.

Users submit posts and comments on the subreddits, which other members reward as valuable contributions to the conversation with “up-votes” or flag as irrelevant or unhelpful with “down-votes.”  The site is intended to be self-regulating, with strict anti-harassment rules in place.  However, abusive and inappropriate content and user conduct are an ongoing issue on Reddit, leading to sometimes heavy-handed moderation practices and controversy over the balance between restraints on expression and community protection.

Hopefully this series on compliance and online platforms has been interesting and informative.  Check back in the future for further posts addressing this same rich and complex area of risk management in the digital era.
