This is the first in a series of six posts on compliance issues with various online platforms. Today’s post is about YouTube. Next Thursday’s post will be about Facebook, followed by Instagram on March 22, Twitter on March 29, Snapchat on April 5, and, in the sixth and final post on April 12, Reddit.
The video hosting and sharing service YouTube was created in 2005 and is now owned by Google. YouTube hosts content from both individuals and media corporation partners, ranging from short clips to entire television shows and films, as well as music videos, video blogs, live streams, and educational presentations. YouTube also uses the Google AdSense advertising program to place targeted ads on its content; most videos on YouTube are free to view, but ad content may appear before, during, and/or after a video plays.
In addition to all of the above, YouTube is also a social media platform, offering commenting, sharing, rating, and numerous other forms of user interaction and expression. The combination of these social features with the platform’s content and advertising businesses has frequently produced controversial or challenging issues that are of interest to study from a corporate compliance perspective.
- This is a fascinating and important deep-dive into YouTube’s use of algorithms to delete videos depicting the ongoing Syrian War. The algorithms, introduced in 2017, are intended to use machine learning to flag propaganda videos by extremist and terrorist groups for removal, but they have also flagged content and channels created by activists, including videos that could form a substantial historical record of the conflict. YouTube and its parent Google remain opaque about the exact parameters of the algorithm, and restoration of wrongly removed content is often delayed or incomplete: Erasing History: YouTube’s Deletion Of Syria War Videos Concerns Human Rights Groups
- In response to problems with incorrect flagging and deletion by algorithmic monitoring, YouTube has committed to hiring thousands of human moderators to review videos uploaded to the platform for abusive, offensive, or fake content. While the hope is that people will be “smarter” than machine learning at picking out the nuances of content flagged for removal, human error, whether from inexperienced new hires or inconsistent judgment, remains an issue: YouTube Stumbles Trying to Deal with Misinformation
- Monetization of YouTube content allows individuals who upload videos to earn money through the Google AdSense program. This ad revenue collided with NCAA rules last year when University of Central Florida kicker Donald De La Haye lost his football scholarship for violating those rules by earning ad revenue from his YouTube videos. The NCAA offered De La Haye a compromise, which he found unacceptable, and the university removed him from the team in order to preserve its own eligibility and season record. De La Haye is now suing UCF for violation of his civil rights: Commentary: Disgruntled YouTuber kicker should have stayed at UCF instead of suing knights
- Taking an entirely different angle on the current Russia controversies surrounding social media, YouTube has recently run afoul of Russia’s internet censor by hosting content that a Russian court has ruled to be a privacy violation. The head of the organization that posted the content claims it is evidence of corruption, and therefore politically significant, and cannot be removed. The Russian internet watchdog, Roskomnadzor, wants the content removed for privacy reasons; if it is not, Russian ISPs, which lack the ability to filter individual content for blocking, may be forced to cut all local access to YouTube in order to address the issue: YouTube and Instagram face Russian bans
- Widespread use of YouTube by young children, and their exposure to inappropriate or abusive content, is a pervasive issue that often creates massive controversy around the platform’s content monitoring and filtering activities and capabilities. YouTube star Logan Paul, who is popular with very young children, was all over the news for posting a video that showed a dead body. His case has driven parents and content providers to take a second look at the addictive and inappropriate aspects of kids’ YouTube consumption and to consider ways in which the impact could be curbed or mitigated: Parents Are Trying To Control Their Kids’ YouTube Habits After Logan Paul’s Suicide Forest Video. For a similar look at troubling content on YouTube that kids can come across, with insufficient controls to prevent this or protect them from it, check out this Medium article from November 2017: Something is wrong on the internet
Check back next Thursday for the second post in this series, which will discuss major current compliance issues with the online platform Facebook.