In online communities, members often want to create Internet personas that let them express themselves and establish an online social identity. By allowing users to choose unique public display names to represent those personas, you encourage repeat interaction and engagement with the community. While it is vital to encourage this, it is also important to ensure public usernames remain appropriate for your environment. The first step is to identify your target audience. You will most likely want to block profanity in usernames, and for under-13 communities you may also need to block personally identifiable information (PII) to comply with COPPA. To enforce these rules, you can implement an automated profanity filter, employ human moderation, or combine the two. This post covers the limitations, overhead, and risk profile of each approach.
Profanity Filter Challenge
Implementing an automated profanity filter to monitor username creation has several benefits. A filter can block obvious profanity and prevent members from using their real names. (Refer to the following link for more on blocking PII.) However, usernames are analogous to personalized license plates: sometimes the meaning of a letter/number combination jumps out at you immediately, and other times it's not so obvious. Consider the following examples (say the words out loud if the meanings are not obvious):
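The license-plate problem is why naive substring matching is not enough: users swap letters for look-alike digits and symbols. Below is a minimal sketch of one countermeasure, normalizing common substitutions before checking a blocklist. The substitution map, function name, and word list are illustrative assumptions, not a production filter:

```python
import re

# Hypothetical substitution map and blocklist for illustration only.
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})
BLOCKED_WORDS = {"badword", "profane"}  # placeholder entries

def is_username_allowed(username: str) -> bool:
    """Normalize common letter/number substitutions, then check the blocklist."""
    normalized = username.lower().translate(LEET_MAP)
    # Strip any remaining non-letters so "b.a.d-word" still collapses.
    collapsed = re.sub(r"[^a-z]", "", normalized)
    return not any(word in collapsed for word in BLOCKED_WORDS)

print(is_username_allowed("Sunny_Gamer"))  # True
print(is_username_allowed("B4dw0rd99"))    # False: normalizes to "badword99"
```

Even with normalization, creative spellings will slip through, which is one reason the human-moderation option discussed in this post remains valuable.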
The Federal Trade Commission announced yesterday that it will not delay the July 1, 2013 implementation date of the updated Children's Online Privacy Protection Rule (COPPA). More than 19 trade associations had requested an extension, claiming that companies need more time to transition and bring their business practices into compliance. Many companies fear the significant impact of the COPPA 2.0 changes.
Extension Response from Commission Letter
Due to the “proliferation” of the digital market (including mobile devices, online social environments, and virtual worlds) and children's access to these mediums, the Commission noted that the COPPA rule was on an “accelerated schedule.” 1.
The final amendments have been announced, and the Commission has issued its statement of basis and purpose (SBP), responding to the public comments of obligated parties. The statement notes the costs and burdens of compliance as well as the Commission's decision to keep the July 1, 2013 effective date (consistent with the original rule's timeline: issued November 3, 1999 and effective April 21, 2000). 2.
The Commission's response is that it has provided sufficient guidance on the obligations of the amended rule and the responsible parties, and that the effective date allows adequate time to comply.
Cyber bullying, or online bullying, continues to be a significant problem for teen- and child-focused online communities. What steps can a site owner take to prevent or minimize this type of behavior in their online community?
First, let’s give a quick overview of cyber bullying. Cyber bullying is similar to “regular” bullying but is done through electronic means (e.g., cell phone, computer, tablet). There are many avenues for cyber bullying: social media sites, text messages, forums, and chat rooms among them. According to the US Department of Health and Human Services Cyberbullying Research Center, a stunning 52% of students reported being cyber bullied. It may be difficult for adults to relate to cyber bullying because its sheer speed and scale far exceed the bullying we grew up with. Bullying is no longer done only face-to-face: disparaging someone online provides not only anonymity but also an extremely wide audience.
Communities are no longer restricted by walls or boundaries. People from all over the world can congregate and share their thoughts and opinions at the click of a button. A site owner has an inherent responsibility to protect users and prevent unwanted content. The chat filter is your first line of defense, but when multiple languages find their way into the community, it can get confused and create false positives. Filtering multiple languages at the same time can quickly turn your leading advocates into antagonists.
1. Word Collision
Word collisions occur when you filter multiple languages from a single central blacklist. A word that is harmless in English does not necessarily mean the same thing in German or Spanish, so filtering words and phrases from multiple languages within one community will create false positives. For example, the English word “pupil” (the center of the eye) is harmless, but add an “a” to the end and “pupila” becomes derogatory. The same sequence of letters can be harmless in one language and profane in another. Know the users in your online community and refine your filter based on the languages you most commonly see.
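One way to avoid collisions is to scope word lists per language and check messages only against the lists relevant to your community, rather than one central blacklist. A rough sketch of this idea follows; the list contents, language tags, and function name are made-up placeholders:

```python
# Illustrative per-language blacklists; entries are placeholders, not real lists.
BLACKLISTS = {
    "en": {"someenglishword"},
    "es": {"pupila"},  # flagged only when Spanish filtering is enabled
}

def flagged_terms(message, languages):
    """Check a message against only the blacklists for the given languages."""
    words = set(message.lower().split())
    hits = set()
    for lang in languages:
        hits |= words & BLACKLISTS.get(lang, set())
    return hits

# An English-only community never consults the Spanish list, so an innocent
# use of "pupila" (e.g., quoting a Spanish textbook) is not a false positive.
print(flagged_terms("la pupila del ojo", ["en"]))        # set()
print(flagged_terms("la pupila del ojo", ["en", "es"]))  # {'pupila'}
```

The design choice here is that language scope is a per-community configuration, which matches the advice above: enable only the languages you actually see.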
As User Generated Content (UGC) becomes more prevalent, it is useful to explore the different types of user communications in online communities. Users can communicate in one of three online settings: public areas, groups, and private chat. Each form of communication can enhance the user experience but can also present unique challenges for the community moderator. We’ll look at each of these types of communication, identify some benefits and challenges of each, and suggest strategies for effective community moderation.
Many online communities have a public area where all members can interact. This environment is much like the lobby in a building where people can congregate to share news, meet up for an activity, ask questions and develop a sense of community with other members. In a game environment, players might meet in the lobby to trade goods or discuss game strategies and tactics, or plan to enter the game together.
Since the lobby is the community segment where the most participants congregate at any one time, it is the perfect place for those with nefarious intent to target and approach other community members. Spammers can blast the public area with offers and promotions. Online predators can identify potential victims, groom them with small talk, and then invite them into private chat to continue their overtures unobserved. Trolls can badger other players.
Community moderators must observe the public area and take action against bad actors to preserve the mood of the community. Fortunately, filtering and moderation software like CleanSpeak can help minimize the workload and alert moderators to questionable content. The community moderator uses this tool to view alerts and make decisions based on current and historical user information.
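As a rough illustration of that workflow (the class, field names, and prioritization rule below are assumptions, not CleanSpeak's actual API), a filter might queue questionable content with each user's flag history attached, so moderators can weigh current behavior against past behavior:

```python
from collections import defaultdict

# Hypothetical alert queue sketch: questionable (not clearly profane) content
# is queued for human review rather than blocked automatically.
class AlertQueue:
    def __init__(self):
        self.history = defaultdict(int)  # user -> prior flag count
        self.alerts = []

    def flag(self, user, message, reason):
        """Record an alert along with the user's running flag count."""
        self.history[user] += 1
        self.alerts.append({
            "user": user,
            "message": message,
            "reason": reason,
            "prior_flags": self.history[user],
        })

    def next_alert(self):
        # Surface repeat offenders first so moderators see riskier cases sooner.
        self.alerts.sort(key=lambda a: a["prior_flags"], reverse=True)
        return self.alerts.pop(0)

queue = AlertQueue()
queue.flag("troll99", "badgering message", "harassment")
queue.flag("newbie01", "borderline phrase", "possible profanity")
queue.flag("troll99", "another badgering message", "harassment")
print(queue.next_alert()["user"])  # troll99
```

The point of the sketch is the division of labor described above: software triages and prioritizes, while a human makes the final call using historical context.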