The challenge: A flexible and agile solution for content moderation
Content moderation still requires an element of human intervention, as AI is not yet sophisticated enough to understand every nuance of language and therefore to judge what can stay and what must be removed. The client's Trust & Safety requirements span multiple global languages and every type of user-generated content (UGC), and we had to compete with highly mature outsourcing companies to deliver a content moderation team. The client wanted a flexible and agile solution for their Trust & Safety operations in Japanese and Korean, run from delivery centres in the respective countries.
How we helped: Building an end-to-end solution
In collaboration with our client, we created an end-to-end solution covering recruitment of the right profiles, onboarding and training of the teams, and continuous performance improvement. We also deployed a dedicated wellbeing approach to support and protect the moderators from the unique stressors induced by disturbing content.
We launched operations in Japan over eight years ago as a small, flexible placement solution. By demonstrating a flexible, value-added approach, we matured the programme into a full outsourcing engagement with measurable output. We now manage operations in two delivery centres as the client's outsourcing partner, delivering results governed by strict Service Level Agreements (SLAs).