You recognize the potential value of a profanity filter for keeping your community clean and productive, but you wonder: can it be trusted? Will the filter offer the efficiency you seek while maintaining the flexibility the ever-changing online environment demands? After all, even the most advanced language filtering techniques will produce false positives or let creative slang slip through the cracks. The good news is that there is a way to improve profanity filtering effectiveness: updating filter lists in real time. The ability to customize your profanity filter in real time is the key to gaining the security you seek, optimizing the user experience, and building a successful community.
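To make the idea concrete, here is a minimal sketch of a filter whose word list can be swapped at runtime. This is a hypothetical illustration, not CleanSpeak's actual API; the class name, methods, and example words are all assumptions made for the example.

```python
import re
import threading

class ProfanityFilter:
    """Minimal sketch of a profanity filter whose word list can be
    updated in real time. Hypothetical example, not CleanSpeak's API.
    A lock guards the compiled pattern so a moderator thread can push
    updates while chat threads keep filtering."""

    def __init__(self, banned_words):
        self._lock = threading.Lock()
        self._compile(banned_words)

    def _compile(self, banned_words):
        # Word boundaries keep matches out of longer, harmless words.
        pattern = r"\b(" + "|".join(map(re.escape, banned_words)) + r")\b"
        self._regex = re.compile(pattern, re.IGNORECASE)

    def update_words(self, banned_words):
        # Called whenever moderators spot new slang -- no restart needed.
        with self._lock:
            self._compile(banned_words)

    def censor(self, text):
        with self._lock:
            return self._regex.sub(lambda m: "*" * len(m.group()), text)


f = ProfanityFilter(["noob"])
print(f.censor("you noob"))        # you ****
f.update_words(["noob", "pwned"])  # new slang added on the fly
print(f.censor("you got pwned"))   # you got *****
```

The key design point is that the word list is mutable state behind a stable filtering interface: the community's vocabulary can evolve without redeploying anything.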
Protect Your Community Culture
Online communities commonly develop their own subculture, enhanced by new and creative vocabulary specific to that culture. Had you heard of the terms uber, leet, or ftw before the rise of virtual worlds? Site owners should embrace and support this phenomenon, as it strengthens the community by enhancing the online experience and increasing user engagement. However, new and fun ways to communicate within a community also open doors to new methods of harassment and bullying. So while it is important to allow community evolution and creativity, you must also protect your members. A powerful tool for doing so is the ability to update your profanity filter in real time, which provides the following benefits:
Communities are no longer restricted by walls or boundaries. People from all over the world can congregate and share their thoughts and opinions with the click of a button. A site owner has an inherent responsibility to protect users and prevent unwanted content. The chat filter is your first line of defense, but when multiple languages find their way into the community, it can get confused and create false positives. Filtering multiple languages at the same time can quickly turn your leading advocates into antagonists.
1. Word Collision
Word collisions occur when filtering multiple languages from a single central blacklist. A word in English does not necessarily mean the same thing in German or Spanish, so filtering words and phrases in multiple languages within one community will create false positives. As an example, the English word pupil (the center of the eye) is harmless, but add an "a" to the end and "pupila" becomes derogatory in Spanish slang. The same sequence of letters can be harmless in one language and profane in another. Be aware of the users in your online community and refine your filter based on the languages you see most often.
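One way to sidestep word collision is to keep a blacklist per language and check each message only against the sender's language. The sketch below illustrates the idea; the list contents and language tags are illustrative assumptions, not a real profanity dataset.

```python
# Sketch: per-language blacklists avoid cross-language "word collision".
# Words here are illustrative only.
BLACKLISTS = {
    "en": {"damn"},
    "es": {"pupila"},  # harmless-looking to English speakers, profane slang here
}

def contains_profanity(text, lang):
    """Check the text only against the sender's language list."""
    words = set(text.lower().split())
    return not words.isdisjoint(BLACKLISTS.get(lang, set()))

# Flagged in Spanish chat, but an English message using the same letters
# (say, quoting a medical term) is never tripped by the Spanish entry.
print(contains_profanity("mi pupila", "es"))          # True
print(contains_profanity("the pupila dilates", "en")) # False
```

With a single merged list, every entry applies to every user, so each language you add multiplies the false-positive surface; partitioning by language keeps each list scoped to the speakers it was written for.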
As User Generated Content (UGC) becomes more prevalent, it is useful to explore the different types of user communications in online communities. Users can communicate in one of three online settings: public area, group, and private chat. Each form of communication can enhance the user experience but can also present unique challenges for the community moderator. We'll look at each of these types of communication, identify some benefits and challenges of each, and suggest strategies for effective community moderation.
Many online communities have a public area where all members can interact. This environment is much like the lobby in a building where people can congregate to share news, meet up for an activity, ask questions and develop a sense of community with other members. In a game environment, players might meet in the lobby to trade goods or discuss game strategies and tactics, or plan to enter the game together.
Since the lobby is the community segment where the most participants congregate at any one time, it is the perfect place for those with nefarious intent to target and approach other community members. Spammers can blast the public area with offers and promotions. Online predators can identify potential victims, groom them with small talk, then invite them into private chat to continue their overtures unobserved. Trolls can badger other players.
Community moderators must observe the public area and take action against bad actors to preserve the mood of the community. Fortunately, filtering and moderation software like CleanSpeak can help minimize the workload and alert moderators to questionable content. The community moderator uses this tool to view alerts and make decisions based on current and historical user information.
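The flag-and-review flow described above can be sketched in a few lines: the filter holds questionable messages and queues an alert that carries the user's history, so the moderator decides with context. This is a simplified illustration under assumed names, not CleanSpeak's actual implementation.

```python
from collections import deque

# Illustrative word list and data structures -- assumptions for the sketch.
FLAGGED = {"spamlink", "scam"}
alert_queue = deque()   # what the moderator reviews
user_history = {}       # user -> number of times flagged

def submit_message(user, text):
    """Deliver clean messages; hold questionable ones and alert a moderator."""
    hits = FLAGGED.intersection(text.lower().split())
    if hits:
        user_history[user] = user_history.get(user, 0) + 1
        alert_queue.append({
            "user": user,
            "text": text,
            "matches": sorted(hits),
            "prior_flags": user_history[user] - 1,  # history informs the decision
        })
        return None   # held for review
    return text       # delivered immediately

submit_message("troll42", "click this scam now")
alert = alert_queue.popleft()
print(alert["user"], alert["matches"], alert["prior_flags"])  # troll42 ['scam'] 0
```

The point of carrying `prior_flags` in the alert is that a first offense and a tenth offense warrant different responses, which is exactly the current-plus-historical view the moderator needs.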
Consumer Privacy Education
This is a significant sign of progress: The National Association of Attorneys General (NAAG) is working with Facebook on consumer privacy education. We're still only in the first half of this decade, and in the second half of the last one, the state attorneys general were threatening legal action against a social media service – MySpace, the most popular one of that time. Now NAAG is actually co-branding a consumer-education campaign with this decade's biggest social media service. Today NAAG and Facebook launched "Safety and Privacy on Facebook." On the page, parents will find privacy tips, videos from Facebook's "Ask the Safety Team," and a public service announcement from Facebook COO Sheryl Sandberg and Maryland Attorney General Douglas Gansler, president of NAAG. [For more, see our Parents' Guide to Facebook.]
On behalf of the entire team here at Inversoft, we extend our hearts and sincerest thoughts to all those affected by the Boston Marathon Explosions.
It is times like these that we are reminded to take a moment and be thankful for the freedoms and luxuries we sometimes take for granted, and, most importantly, for the people we appreciate and hold close to our hearts.
"Make it a habit to tell people thank you. To express your appreciation, sincerely and without the expectation of anything in return. Truly appreciate those around you, and you'll soon find many others around you. Truly appreciate life, and you'll find that you have more of it."
- Ralph Marston
Inversoft is suspending all original content for this week.