Online Community: 3 Types of User Communication

Mike King

As User Generated Content (UGC) becomes more prevalent, it is useful to explore the different types of user communication in online communities. Users can communicate in one of three online settings: the public area, groups, and private chat. Each form of communication can enhance the user experience, but each also presents unique challenges for the community moderator. We’ll look at each type of communication, identify some of its benefits and challenges, and suggest strategies for effective community moderation.

Types of User Communication

Public Areas

Many online communities have a public area where all members can interact. This environment is much like the lobby of a building, where people congregate to share news, meet up for an activity, ask questions, and develop a sense of community with other members. In a game environment, players might meet in the lobby to trade goods, discuss strategies and tactics, or plan to enter the game together.

Since the lobby is the segment of the community where the most participants congregate at any one time, it is the perfect place for those with nefarious intent to target and approach other members. Spammers can blast the public area with offers and promotions. Online predators can identify potential victims, groom them with small talk, then invite them into private chat to continue their overtures unobserved. Trolls can badger other players.

Community moderators must observe the public area and take action against bad actors to preserve the mood of the community.  Fortunately, filtering and moderation software like CleanSpeak can help minimize the workload and alert moderators to questionable content.  The community moderator uses this tool to view alerts and make decisions based on current and historical user information.
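
To make that workflow concrete, here is a minimal sketch of how an alert queue might pair a flagged message with the sender’s history so the moderator can decide in context. The names, fields, and thresholds are hypothetical illustrations, not CleanSpeak’s actual API.

```python
from dataclasses import dataclass

@dataclass
class UserHistory:
    """Hypothetical summary of a user's past moderation events."""
    user_id: str
    prior_flags: int = 0
    prior_bans: int = 0

@dataclass
class Alert:
    """A flagged public-area message awaiting a moderator's decision."""
    message: str
    reason: str
    history: UserHistory

def suggest_action(alert: Alert) -> str:
    """Suggest an action from current and historical user information."""
    if alert.history.prior_bans > 0:
        return "escalate"    # repeat offender: escalate immediately
    if alert.history.prior_flags >= 3:
        return "warn"        # a pattern is forming: issue a warning
    return "review"          # first offense: leave to moderator judgment

queue = [Alert("cheap gold, follow this link", "spam",
               UserHistory("user42", prior_flags=4))]
for alert in queue:
    print(alert.reason, "->", suggest_action(alert))  # spam -> warn
```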

Groups

Groups let people with shared interests or friendships communicate among themselves in a smaller setting. Some conduct may be tolerated within a group that would not be in the public area, since members must be invited and opt in. Spammers and trolls are typically prevented from joining or are quickly removed by group owners. One mechanism available here is community moderation, which gives users the ability to report someone who violates the rules or to use the ‘ignore’ feature to block all content from another user. Members also have a simple remedy: if they don’t like the conduct of a group, they can leave it and join another.
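
As a simple illustration of the ‘ignore’ mechanism, a client might filter each incoming group message against the viewer’s personal block list before displaying it. This is a generic sketch with invented names, not any particular product’s implementation.

```python
# Hypothetical per-user ignore lists: alice has ignored troll99.
ignore_lists = {"alice": {"troll99"}}

def visible_messages(viewer, messages):
    """Drop messages from senders the viewer has chosen to ignore."""
    blocked = ignore_lists.get(viewer, set())
    return [text for sender, text in messages if sender not in blocked]

feed = [("bob", "anyone up for a raid tonight?"),
        ("troll99", "you all play terribly")]
print(visible_messages("alice", feed))  # ['anyone up for a raid tonight?']
```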

Sometimes, though, groups form around harmful activities like bullying and hate speech, so the community moderator must use moderation tools to identify the participants and their group affiliations. The moderator can configure severity settings that trigger different responses depending on the severity assigned to words or phrases. The response might be an automatic pop-up that informs or warns a participant when their conduct is flagged, or the system can auto-ban users for high-severity violations. Using these tools, moderators can monitor the activity of group members and work within the group to cultivate more appropriate behavior through warnings, bans, or even shutting down the group.
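
Here is a rough sketch of how severity settings might drive those responses, assuming a simple term-to-severity table; the terms, levels, and thresholds are invented for illustration.

```python
# Hypothetical severity table: each flagged term carries a severity level.
SEVERITY = {"mild insult": 1, "slur": 3, "threat": 3}

def respond(flagged_term):
    """Map a flagged term's severity to an automated response."""
    level = SEVERITY.get(flagged_term, 0)
    if level >= 3:
        return "auto-ban"        # high severity: remove the user automatically
    if level >= 1:
        return "warning pop-up"  # lower severity: inform or warn the user
    return "allow"

for term in ("mild insult", "threat", "good game"):
    print(term, "->", respond(term))
```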

Private Chat

Private chat lets community members speak freely with one other person. It allows people to form closer relationships, and as each member cultivates more one-to-one friendships, it raises the level of intimacy within the community as a whole. This is good.

The challenge with private chat is that groomers, trolls, and online predators may entice members they targeted in the public or group areas into the private channel, where oversight of their behavior is greatly reduced. It is in this channel that intelligent profanity filters like CleanSpeak from Inversoft are invaluable to the community moderator. Carter Pham, the Online Community Manager for Animal Jam, a virtual world for children that is heavily moderated, says his team uses a different protocol for each chat paradigm. He points out, “While we monitor all the paradigms closely, we consider content in the private dens more carefully because the chat that occurs there has the potential to escalate due to the player’s perception that no one is watching what they do in their private den.”

A good filter blocks profanity, the sharing of Personally Identifiable Information (PII), and any other words or phrases the community moderator deems inappropriate. There are many levers within the filter and moderation tool set that let the moderator actively monitor and control behavior within the community. Pham says, “Sometimes when the filter flags questionable content in a certain category, we can use the opportunity as a teachable moment to help players understand the rules and proper etiquette for our online community.”
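
As an example of how PII blocking might work at the simplest level, a filter could pattern-match obvious phone numbers and email addresses before a message is delivered. Real filters are far more sophisticated and catch deliberate evasions; this regex sketch only illustrates the idea.

```python
import re

# Hypothetical patterns for two common kinds of PII. Production filters
# also handle evasions (spelled-out digits, odd spacing) that these miss.
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),  # US-style phone number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),       # email address
]

def contains_pii(message):
    """Return True if the message appears to contain PII."""
    return any(p.search(message) for p in PII_PATTERNS)

print(contains_pii("call me at 555-123-4567"))  # True
print(contains_pii("meet me by the fountain"))  # False
```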

Just as in the real world, the setting affects the nature of the activity. People tend to refrain from the worst behavior in public settings and speak and act more freely as the group gets smaller. Community moderators must strike a delicate balance between giving members the freedom to communicate openly and controlling content to keep the mood of the community positive and healthy.

Further Reading:

Types of Profanity Filters for Online Safety

Prevent Users from Sharing PII & Account Information

Online Communities: Approval Processes 
