As brands increasingly communicate and engage with consumers online, it is imperative that they transform how they support and protect their online customers. The brand experience now needs to be consistent across all touchpoints, including how passionate customer bases create and engage with branded content.
Southeast Asia has over 482 million active social media users creating content that engages audiences, drives conversation around a brand, inspires loyalty, and builds organic search traffic. So much content is created online that it has become a boardroom concern.
Gartner predicts that user-generated content moderation will be a C-suite priority for at least 30% of large organizations by 2024. It is a powerful tool, but one that comes with a fair share of risk. The onus is on companies to tightly manage their content so that it remains a resource that delivers value to the business and the consumer.
Yet a key challenge in content management is protecting online communities from offensive or fraudulent content that can damage a company's reputation and engagement over the long term. People want to feel safe, as do those who moderate and manage the content.
Content moderation protects customers, moderators, and the brand
Today, AI and other workflow-based tools are used to manage content so that those who consume it, along with community hosts and moderators, are not overexposed to content that is traumatic, horrific, or designed to defraud.
Content moderators are tasked with defending the brand's reputation. The job can be mentally demanding, and overexposure to negative and traumatic content takes a toll: content moderators for Facebook, for example, have developed post-traumatic stress disorder on the job.
To pre-empt this, companies should put in place a content moderation strategy that values its people while maximizing the opportunity to enhance the brand, supplemented by technological tools that filter out repetitive streams of harmful content.
User-generated content can also expose a brand's inadequacies, such as poor responsiveness, or give a platform to brand detractors. Often it is mishandled content, such as an unnoticed social engagement, that makes a company go viral for all the wrong reasons. A moderation strategy needs to cover how the company manages such situations, because in communities where users lead content generation, the brand's reputation is always at risk. Technology can help mitigate that risk.
Effective moderation leverages carbon and silicon
Modern content moderation strategies pair teams of well-trained human moderators with artificial intelligence (AI) and machine learning (ML) tools to limit risk, because the technology filters out risky content and can be scaled on demand. Purpose-built content moderation technology also helps streamline workflows and rapidly identify risks and issues. Carousell, the classifieds marketplace, recorded a 67% drop in fraud rates in 2020 after deploying a content moderation team and combining AI with digital fingerprinting technology to identify bad actors and prevent their return.
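The "digital fingerprinting" idea mentioned above can be illustrated with a minimal sketch: previously banned content is hashed, and re-uploads of the same material are caught by comparing fingerprints. This is not Carousell's implementation; real systems typically use perceptual hashes that tolerate small edits, whereas a plain SHA-256 of normalized text is used here purely for illustration.

```python
# Illustrative sketch of content fingerprinting (assumed design, not
# any specific vendor's system): hash banned content and flag
# re-uploads whose fingerprint matches.
import hashlib

banned_fingerprints: set[str] = set()

def fingerprint(content: str) -> str:
    """Hash normalized content so trivially re-posted items still match."""
    normalized = " ".join(content.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def ban(content: str) -> None:
    """Record the fingerprint of content removed by moderation."""
    banned_fingerprints.add(fingerprint(content))

def is_repeat_offense(content: str) -> bool:
    """Check whether new content matches previously banned material."""
    return fingerprint(content) in banned_fingerprints

ban("Buy cheap followers NOW!!!")
print(is_repeat_offense("buy   cheap followers now!!!"))  # True
print(is_repeat_offense("Genuine product review"))        # False
```

Normalizing case and whitespace before hashing means the laziest re-posts are caught automatically, while anything more heavily edited falls through to the AI and human layers described above.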
Algorithms weed out the clear-cut violations, leaving complex and uncertain cases for skilled humans to handle, which enables faster moderation and identification of issues before they escalate.
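The division of labor described above is essentially a confidence-threshold triage: the model acts on content it is sure about and escalates the ambiguous middle band to humans. A minimal sketch, assuming a hypothetical classifier that returns a risk score between 0 and 1 (the thresholds are illustrative, not an industry standard):

```python
# Minimal triage sketch: auto-handle near-certain cases, escalate
# uncertain ones to human moderators. Thresholds are assumptions.
AUTO_REMOVE_THRESHOLD = 0.95   # near-certain policy violation
AUTO_APPROVE_THRESHOLD = 0.05  # near-certain safe content

def triage(risk_score: float) -> str:
    """Route content based on a model's risk score in [0.0, 1.0]."""
    if risk_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"
    if risk_score <= AUTO_APPROVE_THRESHOLD:
        return "auto_approve"
    return "human_review"  # uncertain cases go to skilled moderators

# Only the ambiguous middle band reaches human reviewers.
print([triage(s) for s in (0.01, 0.50, 0.98)])
# ['auto_approve', 'human_review', 'auto_remove']
```

Tightening the thresholds trades moderator workload against risk: a wider middle band sends more content to humans but reduces both missed violations and wrongly removed posts.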
With the right service provider, a brand can unlock the value of user-generated content with minimal risk, in a way that enhances its reputation and customer engagement. Technology can also help avoid cultural missteps through multilingual capabilities designed to engage native-language moderators who operate within their own culture.
A modern content moderation strategy helps brands unpack the value of their content, unlock the data insights within it, and minimize the risks. This is the digital way to tell your company story.