YouTube is a BIG DEAL with today’s youth. Allowing users to share YouTube videos within your application increases user engagement and retention. However, consider what might happen if inappropriate content such as sex, violence, or personally identifiable information (PII) is shared. In this post, I will walk you through how to provide a safe environment that parents will trust while saving precious moderator hours by building a list of already-approved and already-rejected videos.
Clip of Inversoft's Sean Bryant's puppy "Bodhi"
Approval Process: YouTube ID White & Black List
Allowing kids to post and share videos that have not first been reviewed by a moderator is risky, to say the least. Even if you allow users to report inappropriate content, it would take only one angry parent seeing pornography on your site to ruin your reputation for good. Therefore you must watch every video before it is made public in your community.
Kids love to share videos, particularly the same video over and over. Your moderation efforts will be taxed if a single video is reviewed 700 times! However, you can reduce the burden on moderators by employing a simple whitelist/blacklist technique on YouTube videos. Each YouTube video has a unique ID that can be saved after a moderator watches it. The next time the same video is posted, you can check the whitelist/blacklist to determine if it has already been moderated.
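The check described above can be sketched in a few lines. This is a minimal illustration, not a full implementation: the list names (APPROVED, REJECTED), the helper functions, and the example IDs are all hypothetical, and in a real application the lists would be backed by a database rather than in-memory sets.

```python
from urllib.parse import urlparse, parse_qs

def extract_video_id(url: str):
    """Pull the unique video ID out of common YouTube URL formats."""
    parsed = urlparse(url)
    if parsed.hostname in ("www.youtube.com", "youtube.com", "m.youtube.com"):
        # Standard watch URLs keep the ID in the "v" query parameter.
        return parse_qs(parsed.query).get("v", [None])[0]
    if parsed.hostname == "youtu.be":
        # Short links carry the ID as the URL path.
        return parsed.path.lstrip("/") or None
    return None

# Hypothetical moderation lists, populated as moderators watch videos.
APPROVED = {"dQw4w9WgXcQ"}
REJECTED = {"badVideoID1"}

def moderation_status(url: str) -> str:
    """Decide what to do with a posted video without re-watching it."""
    video_id = extract_video_id(url)
    if video_id is None:
        return "invalid"    # not a recognizable YouTube URL
    if video_id in APPROVED:
        return "approved"   # already whitelisted: publish immediately
    if video_id in REJECTED:
        return "rejected"   # already blacklisted: block immediately
    return "pending"        # new ID: queue for a moderator to watch

print(moderation_status("https://www.youtube.com/watch?v=dQw4w9WgXcQ"))  # approved
print(moderation_status("https://youtu.be/someNewID01"))                 # pending
```

With this in place, the 700th post of the same video costs nothing but a set lookup; only genuinely new IDs land in the moderation queue.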