Best Practices for Moderating User-Generated Content in Video Games

Idowu Odesanmi

When you build a game of any sort, users are bound to generate content. It can arise from their activity and progress, or play a bigger role as playable content, such as maps and items. In any case, you need to moderate the content that users create and put into your game. This guide shares some best practices for moderating and managing user-generated content (UGC).

In its most basic form, user-generated content (UGC) refers to any content, including audio, video, GIFs, text, 3D models, images, and code, that is created and published on a platform by its users. UGC has been around since well before the term was coined. In fact, the Oxford English Dictionary was compiled from user-generated content from all over the English-speaking world.

The video game industry has experimented with UGC for the past few decades with concepts like in-game modifications, or mods. Nowadays, big industry players like the teams behind Roblox, Fortnite Creative, and Minecraft have embraced UGC as a business model and are reaping the rewards of increased consumer participation in their creative process.

Your biggest challenge as a developer, creative talent, or game company is to hold consumer attention and attempt to satisfy the continuous appetite of player communities for new content. It’s unrealistic to develop strategies to meet this market demand and remain competitive without expanding your platform to allow users to freely contribute to content creation. With the right UGC setup, you can expand in-game experiences for your users, reduce the cost of marketing your product, and improve player retention and overall interaction.

Like any other social platform that gives unbridled user freedom and access, though, your platform is vulnerable to abuse if UGC is not adequately moderated. In this post, you’ll learn some best practices for moderating and managing UGC in your application or platform.

Note that the terms platform, forum, and application are used interchangeably throughout this article to refer to video games and their communities.

Why Is It Important to Moderate User-Generated Content?

When your platform relies solely on content generated within your organization, you have complete control over the type of content, how it’s used, and what’s accessible to users. Conversely, accepting UGC on your forum means you are outsourcing this control to users. The anonymity that the digital environment provides can encourage users to abuse the privilege of making UGC. This is why regulating content that’s created by users and shared on your forum is important.

UGC moderation involves screening and filtering images, text, and other types of content to make sure it meets your platform’s standards.

There are several types of problematic content you need to guard against when you allow UGC in your online application:

  • Age-sensitive content: Findings from a study carried out by the UK’s National Society for the Prevention of Cruelty to Children revealed that about 56 percent of children between ages 11 and 16 have come across explicit content online. As children become increasingly active on the internet, there is a need to filter out explicit and sexually provocative content, posts promoting self-harm, and other age-inappropriate content that some users are likely to share.

  • Spam: Some users create bots to spam your site, executing automated repetitive tasks to spread scams and malware.

  • Irrelevant promotion: Users can exploit the access granted by UGC to promote irrelevant content that redirects traffic to other platforms, runs unpaid advertisements, promotes offensive beliefs, or even distributes propaganda.

  • Bullying/harassment: UNICEF explains cyberbullying as the spreading of lies, misinformation, or embarrassing content about someone, sending threatening or abusive content targeted at someone or a group of people, or impersonating someone and harassing others. Users of all ages and across all platforms on the internet are exposed to bullying every day, and it is your responsibility as a game developer or moderator to impose measures that will curb it.

  • Incompatible content: Occasionally users will try to upload or render files to your server that are too big or in unsupported formats, which can slow down your server or even cause it to crash. A minimal validation sketch follows this list.

  • Copyright infringement: Screengrabs, memes, GIFs, and similar content have made it easy for users to duplicate copyrighted materials and share them without approval. The damage to your reputation and potential legal ramifications for your gaming platform as a result of stolen content, even with disclaimer clauses protecting your interests, can prove fatal.
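To make the incompatible-content point concrete, here is a minimal sketch of server-side upload validation in Python. The size cap, allowed extensions, and function name are illustrative assumptions; a real implementation would also inspect file contents rather than trusting filenames.

```python
import os

# Hypothetical limits -- tune these to your own infrastructure.
MAX_UPLOAD_BYTES = 10 * 1024 * 1024           # 10 MB cap per file
ALLOWED_EXTENSIONS = {".png", ".jpg", ".gif", ".ogg", ".json"}

def validate_upload(filename: str, size_bytes: int) -> tuple[bool, str]:
    """Reject files that are too large or in unsupported formats
    before they reach your game servers."""
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        return False, f"unsupported format: {ext or 'none'}"
    if size_bytes > MAX_UPLOAD_BYTES:
        return False, f"file too large: {size_bytes} bytes"
    return True, "ok"

# Example: a 25 MB executable is rejected.
print(validate_upload("mod_installer.exe", 25 * 1024 * 1024))
```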

For all of these reasons and more, it is in your best interest to prioritize UGC management if you accept user content in your video game application.

What Are Best Practices for Moderating User-Generated Content?

The following are actions you can take to hold UGC to an appropriate standard.

Maintain Public Community Guidelines

The first and probably most important task you need to undertake is to create a set of community moderation guidelines. A clear, concise, and accessible set of guidelines will communicate to established community members and new users how they are expected to interact with others and with the tools on the platform. Your guidelines should capture the beliefs and personality of the community without being too forceful or restrictive. They should also spell out required safety measures, privacy and security concerns, and penalties for violations.

You should keep a few things in mind when drafting your community guidelines:

  • Take care with your tone and choice of words
  • Explain how to report violations
  • Explain how the guidelines will be enforced
  • Allow the guidelines to evolve as your platform evolves

For reference, the Roblox Community Standards are a good example of guidelines from a popular platform that accepts UGC.

Reward Positive Behavior

When there are well-defined rewards for positive behavior, users are usually a lot more inclined to play by the rules. Incentives can be given in several forms and can be structured to encourage positive attitudes that benefit your platform. Research has shown that rewards motivate people to undertake tasks that have little intrinsic value, and they make people feel competent and autonomous.

When it comes to rewards, you have plenty of options. You can incentivize particular forms of content over others, or even use rewards to attract new users. You can create a variety of reward types to nudge users into complying with your community guidelines.

Reward systems have the additional advantage of improving player retention and dedication, especially when users get real-life value out of their more positive attitudes.

Below are some of the most common forms of incentives and reward systems used in the gaming industry:

  • Score systems in which players are ranked on public leaderboards by total points accrued
  • Experience point reward systems that increase avatar attributes and add skills in-game
  • Item-granting systems that give exclusive access to in-game merchandise and tools
  • Rewards in the form of unlocked mechanics, such as new game levels and environments
  • Recognition with real-life merchandise
  • Off-game awards and monetary prizes

Incentives offer you direct and indirect avenues for moderating UGC on your platform.
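To illustrate, here is a minimal sketch of a score system that credits players for positive behavior and ranks them on a public leaderboard. The behaviors and point values are hypothetical placeholders you would map to your own community guidelines.

```python
from collections import defaultdict

# Hypothetical point values for behaviors worth encouraging.
POINTS = {
    "ugc_approved": 50,      # a submission passed moderation
    "report_confirmed": 20,  # a player flagged content that was removed
    "helpful_review": 5,     # a player left a constructive rating
}

scores: dict[str, int] = defaultdict(int)

def award(player_id: str, behavior: str) -> int:
    """Credit a player for a positive action and return their new total."""
    scores[player_id] += POINTS.get(behavior, 0)
    return scores[player_id]

def leaderboard(top_n: int = 10) -> list[tuple[str, int]]:
    """Rank players by total points accrued, highest first."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

award("player_42", "ugc_approved")
award("player_42", "report_confirmed")
award("player_7", "helpful_review")
print(leaderboard())  # [('player_42', 70), ('player_7', 5)]
```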

Monitor Content in All Languages

No matter how effective your content management technology is, undesirable content can still get through your filters if it's shared in a language your system or moderators cannot interpret. Advances in natural language processing (NLP) technology have been a big help to global technology companies, but such tools are still limited in their capacity to handle the nuances of human language, such as slang, sarcasm, and regional idioms.

This is why you need a plan for monitoring and managing multilingual UGC.
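A practical starting point is to detect the language of each submission and route it to a queue your team can actually handle. Below is a minimal sketch assuming the open-source langdetect package; the queue names and supported languages are illustrative assumptions.

```python
from langdetect import DetectorFactory, detect
from langdetect.lang_detect_exception import LangDetectException

DetectorFactory.seed = 0  # make detection deterministic across runs

# Hypothetical queues: the languages your moderators or models cover.
SUPPORTED_QUEUES = {"en", "es", "pt", "de"}

def route_for_review(text: str) -> str:
    """Return the moderation queue a piece of UGC should be sent to."""
    try:
        lang = detect(text)
    except LangDetectException:
        return "queue:manual-review"  # emoji-only or gibberish input
    if lang in SUPPORTED_QUEUES:
        return f"queue:{lang}"
    return "queue:manual-review"      # no coverage for this language

print(route_for_review("This sword skin looks amazing!"))  # queue:en
```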

Adopt Moderation Methods That Suit Your Needs

The methods that you choose for moderating UGC should reflect the challenges your application will face. Take factors such as the volume of content you expect, the age of your audience, whether content must be reviewed before or after it goes live, and your budget for human review into account when making your choice.

Improve Accountability through Transparency

Fairness and equity are crucial to user content moderation on public forums. You must apply the rules equitably, subjecting every user to the same level of moderation. Many leading companies that allow UGC, like Google, publish periodic transparency reports explaining how their content moderation systems and sanctions work.

What Should You Moderate?

  • Chat: Live chat is a great feature of modern online gaming. It enables your users to interact and socialize while they enjoy your application. Most chat rooms allow users to share media in several formats, including audio, video, and images. If your product has a chat feature, you should moderate exchanges between users and filter out inappropriate or offensive content.

  • Usernames: As innocuous as you might think usernames are, they likely reach a wider audience on your platform than any other UGC. Be sure to regulate what words or phrases users can choose as usernames; a minimal filtering sketch follows this list.

  • Forum discussions: Like chats, conversations on your forums should be monitored for potentially offensive or harassing content.

  • In-game characters and profiles: Your game might allow users to build in-game characters or models, create backstories, and share them with other players. The character designs, images, or written backstories that players share could be problematic or offensive.

The bottom line: you should moderate every form of UGC that can be shared in-game.
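To make the username point concrete, here is a minimal sketch of a blocklist-based filter. The format rules, blocked terms, and substitution map are illustrative assumptions; a production system would need far larger lists and fuzzier matching.

```python
import re

# Hypothetical blocklist -- a real one would be much larger and would
# be maintained alongside your community guidelines.
BLOCKED_TERMS = {"admin", "moderator", "badword"}

# Normalize common look-alike substitutions so "b4dw0rd" is caught too.
LEET_MAP = str.maketrans("013457$@", "oleastsa")

def is_username_allowed(name: str) -> bool:
    """Enforce format rules, then reject blocklisted terms."""
    if not re.fullmatch(r"[A-Za-z0-9_]{3,20}", name):
        return False  # length and character-set rules
    normalized = name.lower().translate(LEET_MAP)
    return not any(term in normalized for term in BLOCKED_TERMS)

print(is_username_allowed("sword_fan_99"))  # True
print(is_username_allowed("b4dw0rd_xx"))    # False
```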

Overseeing Games for Kids

The stakes are much higher if you operate a platform meant for children. Your community guidelines should have sections for parents and guardians as well as for children. Stricter moderation strategies must be put into place to protect children from exposure to inappropriate content. Consider requiring content reviews by a trained human moderator before the content is released. You should also take special care to authenticate new and existing users signing on to your platform so that predators and other nefarious characters can’t gain access.

If You Need Help

It can feel overwhelming at first to properly implement content moderation. Communities like the Fair Play Alliance have made it easier for game developers and companies to rely on one another and get guidance on global industry best practices. You and other members of your organization can register to join.

AI vs. Human Moderation

Human-moderated UGC is content that's screened and filtered out by human beings; armies of moderators are trained to serve as the firewall between users and inappropriate online content. With AI-based UGC moderation, an algorithm does most of the heavy lifting and filtering.

Both techniques have advantages and disadvantages in key areas:

  • Ethical issues: Evidence shows that human moderators who are constantly exposed to harmful content suffer psychologically. This is not a problem with AI monitoring.

  • Cost: Real-time manual moderation is expensive and almost impossible to set up and run at the scale of a popular game.

  • Thoroughness: If properly trained, AI won’t make the kind of mistakes people make due to lapses in concentration or fatigue.

  • Context: AI models, however, are terrible at detecting contextual usage of content. This is where a trained human moderator excels.

Combining the gains of human moderation and the advantages of AI seems like the most effective strategy for handling problematic content generated in real time. The algorithm handles most of the real-time traffic: it automatically filters out content that crosses a predefined confidence threshold and routes content requiring human judgment to human moderators.
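Here is a minimal sketch of that routing logic. The thresholds are hypothetical, and classify() is a stand-in for whatever model or moderation service you actually use.

```python
# Hypothetical confidence thresholds for automatic decisions.
REJECT_THRESHOLD = 0.90   # confidently harmful -> remove automatically
APPROVE_THRESHOLD = 0.10  # confidently clean -> publish automatically

human_review_queue: list[str] = []

def classify(content: str) -> float:
    """Stand-in for a real model; returns the probability that
    a piece of content is harmful."""
    return 0.5  # placeholder score

def moderate(content: str) -> str:
    score = classify(content)
    if score >= REJECT_THRESHOLD:
        return "removed"    # the algorithm handles clear violations
    if score <= APPROVE_THRESHOLD:
        return "published"  # and clearly safe content
    human_review_queue.append(content)
    return "queued for human judgment"

print(moderate("some borderline chat message"))  # queued for human judgment
```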

In addition to reducing the workload for staff, this combination of human and machine is far more productive, less prone to bias, and cheaper than constantly relying on human moderators.

Conclusion

As you learned in this post, you have multiple options and methods for moderating UGC on your platform. Whichever approach you take and whatever tools you use, you should establish guidelines for your community and create a transparent system around these guidelines to ensure participation.

The potential problems of UGC are not new, but they have expanded greatly over the past few years. For you to reap the many benefits of UGC on your platform, you must go the extra mile to ensure that effective content moderation techniques are in place.