Cyber bullying, or online bullying, continues to be a significant problem for teen and child-focused online communities. What are the steps a site owner can take to prevent or minimize this type of behavior in their online community?
First, let’s give a quick overview of cyber bullying. Cyber bullying is similar to “regular” bullying, but is carried out through electronic means (e.g., cell phone, computer, tablet). There are many avenues for cyber bullying: social media sites, text messages, forums, and chat rooms among them. According to the Cyberbullying Research Center, cited by the US Department of Health and Human Services, a stunning 52% of students reported being cyber bullied. It may be difficult for adults to relate to cyber bullying because its sheer speed and scale are much greater than the bullying we grew up with. No longer is bullying done only face-to-face. Being able to disparage someone online provides not only anonymity but also an extremely wide audience.
Communities are no longer restricted by walls or boundaries. People from all over the world can congregate and share their thoughts and opinions at the click of a button. A site owner has an inherent responsibility to protect users and prevent unwanted content. The chat filter is your first line of defense, but when multiple languages find their way into the community, it can get confused and create false positives. Filtering multiple languages at the same time can quickly turn your leading advocates into antagonists.
1. Word Collision
Word collisions occur when multiple languages are filtered from a single, central blacklist. A word in English does not necessarily mean the same thing in German or Spanish. Filtering words and phrases in multiple languages within one community will create false positives. As an example, the English word pupil (the center of the eye) is harmless. When an “a” is placed at the end, “pupila,” it becomes derogatory in Spanish slang. The same sequence of letters can mean something harmless in one language and be profane in another. Be aware of the users in your online community and refine your filter based on the languages most commonly seen.
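To make the collision concrete, here is a minimal sketch of the problem in Python. The word lists are illustrative assumptions, not any real product’s configuration: “bite” is an ordinary English word but a vulgar term in French slang, so merging every language’s blacklist into one central list flags innocent English chat.

```python
# Hypothetical per-language blacklists (illustration only).
BLACKLISTS = {
    "fr": {"bite"},      # vulgar in French slang, harmless in English
    "en": {"someslur"},  # placeholder for real English entries
}

def flag_merged(message):
    """Central approach: one blacklist merged from every language."""
    merged = set().union(*BLACKLISTS.values())
    return [w for w in message.lower().split() if w in merged]

def flag_per_language(message, lang):
    """Refined approach: check only the list for the message's language."""
    blacklist = BLACKLISTS.get(lang, set())
    return [w for w in message.lower().split() if w in blacklist]

msg = "careful, that snake might bite"
print(flag_merged(msg))               # ['bite'] -- a false positive
print(flag_per_language(msg, "en"))   # []       -- correctly allowed
```

The per-language lookup is the code-level version of the advice above: scope each blacklist to the languages your community actually uses instead of filtering everything everywhere.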
As User Generated Content (UGC) becomes more prevalent, it is useful to explore the different types of user communications in online communities. Users can communicate in one of three online settings: public area, group, and private chat. Each form of communication can enhance user experience but can also present unique challenges for the community moderator. We’ll look at each of these types of communication, identify some benefits and challenges of each and suggest strategies for effective community moderation.
Many online communities have a public area where all members can interact. This environment is much like the lobby in a building where people can congregate to share news, meet up for an activity, ask questions and develop a sense of community with other members. In a game environment, players might meet in the lobby to trade goods or discuss game strategies and tactics, or plan to enter the game together.
Since the lobby is the segment of the community where the most participants congregate at any one time, it is the perfect place for those with nefarious intent to target and approach other members. Spammers can blast the public area with offers and promotions. Online predators can identify potential victims, groom them with small talk, then invite them into private chat to continue their overtures unobserved. Trolls can badger other players.
Community moderators must observe the public area and take action against bad actors to preserve the mood of the community. Fortunately, filtering and moderation software like CleanSpeak can help minimize the workload and alert moderators to questionable content. The community moderator uses this tool to view alerts and make decisions based on current and historical user information.
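As a rough illustration of that alert-driven workflow, here is a toy sketch in Python. The severity labels, threshold, and violation history are hypothetical stand-ins, not CleanSpeak’s actual API: the idea is simply that the filter blocks the worst content automatically and surfaces borderline content from repeat offenders for a human decision.

```python
# Hypothetical record of prior offenses per user (illustration only).
VIOLATION_HISTORY = {"user42": 3, "user7": 0}

def classify(message):
    """Toy severity classifier standing in for a real chat filter."""
    text = message.lower()
    if "kill yourself" in text:
        return "severe"
    if "stupid" in text:
        return "mild"
    return "clean"

def route(user, message, alerts):
    """Block severe content outright; flag mild content from repeat
    offenders for moderator review; let clean content through."""
    severity = classify(message)
    if severity == "severe":
        alerts.append((user, message, "blocked"))
        return False  # never delivered
    if severity == "mild" and VIOLATION_HISTORY.get(user, 0) >= 2:
        alerts.append((user, message, "review"))  # delivered, but queued
    return True

alerts = []
print(route("user7", "good game everyone", alerts))  # True, no alert
print(route("user42", "you are stupid", alerts))     # True, queued for review
print(len(alerts))                                   # 1
```

Combining the current message with historical user information, as in the `review` branch, is what lets a moderator distinguish a one-off slip from a pattern of abuse.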
When you decide to create an online community, you, or someone you hire, will want to moderate the content your users submit. Approval queues are an efficient way to maintain a healthy, proactive online community when working with persistent content. If you have content that you want to approve before it becomes viewable in the community, or if you want the ability to remove unwanted content later, read on.
Types of Content
Two types of content exist: transient and persistent. Both have a place, depending on the type of community involved. One accompanies real-time interaction, while the other is best suited for approval processes.
Transient Content
Transient content lasts only for a short time; it is impermanent. Chat rooms, online games, or any other application that uses real-time chat should not use approval processes. Preventing users from engaging with others in real-time online environments will frustrate your community, decrease retention, and deter others from joining. It is best to use an intelligent chat filtering solution rather than approval queues in these environments.
Persistent Content
Persistent content sticks around for a while: a review on an Amazon product, an insight shared in a forum, a tweet linking a relevant article, an image posted on a friend’s wall, or a blog post published for the rest of the world to see. It is persistent because the moment it is submitted and posted, it stays in one place. Approval queues are best for managing persistent content.
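The approval-queue flow for persistent content can be sketched in a few lines of Python. The class and method names here are hypothetical, not any specific moderation product’s API; the point is that submitted content is held invisible until a moderator approves or rejects it.

```python
from collections import deque

class ApprovalQueue:
    """Toy pre-moderation queue for persistent content (illustration)."""

    def __init__(self):
        self.pending = deque()  # submitted, not yet visible
        self.approved = []      # visible to the community
        self.rejected = []      # removed, never shown

    def submit(self, author, content):
        """Content enters the queue; it is NOT visible yet."""
        self.pending.append({"author": author, "content": content})

    def review_next(self, approve):
        """A moderator approves or rejects the oldest pending item."""
        item = self.pending.popleft()
        (self.approved if approve else self.rejected).append(item)
        return item

queue = ApprovalQueue()
queue.submit("alice", "Great product, works as described.")
queue.submit("bob", "spam spam buy now")
queue.review_next(approve=True)    # alice's review goes live
queue.review_next(approve=False)   # bob's spam never appears
print([i["author"] for i in queue.approved])  # ['alice']
```

Because nothing is published until `review_next` approves it, this pattern suits reviews, forum posts, and images; applying it to real-time chat would introduce exactly the delay the transient-content section warns against.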
A few weeks ago, my 8-year-old son came to me and said, “Dad, can you help me with something?” I said, “Absolutely.” He led me over to the family computer and explained that while he was playing Roblox, this “other guy” was being really uncool. I asked him to tell me what the player was doing, and he said, “Let me show you.”
For the next 20 minutes, the two of us watched as this player proceeded to ruin the game for everyone else. My son explained that in this level, there was a special item that afforded players a lot of power. Because this “other guy” player had this special item, he had the ability to fly, create anything, or destroy anything. He was using this power to ruin the experience of other players, including my son.