Kinzen helps companies that host content improve their response to highly complex moderation challenges. We support the wide variety of teams inside these companies who promote trust and ensure safety within online communities. This includes (but is not limited to) policy, enforcement, threat assessment, risk response and curation experts, and the product teams who support them.

Kinzen also works with content moderation service providers engaged by large technology companies, and consults with public policy makers seeking to better understand and respond to harmful content.

Kinzen does not make decisions about what content to moderate. Instead, we support our partners as they develop and enforce the policies best suited to the communities they seek to protect.

Our goal is to help our clients make more precise and consistent decisions about evolving online threats to real-world safety. We do this by focusing on the harmful content with the greatest potential to incite violence, abuse or civil unrest, and by performing the following tasks:

  • Prioritise countries and languages in which content moderation is potentially a matter of life or death
  • Decode the cultural and linguistic nuances which distinguish harmful content from place to place
  • Prepare for events during which organized disinformation could undermine electoral integrity, provoke violence or promote conflict
  • Pre-empt the spread of international misinformation narratives which threaten public health, such as anti-vaccine campaigns
  • Analyse the evolution of persistent campaigns of hateful speech, such as antisemitism
  • Anticipate the emergence of campaigns of violent rhetoric based on identity

Technology plays a critical role at Kinzen, helping scale our work through a constantly improving feedback loop between human analysts and artificial intelligence. Our experts and analysts gather and label data, which is then used to train machine learning models; those models, in turn, help analysts improve the detection and understanding of harmful content across languages and platforms.
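
As a purely illustrative sketch (not a description of Kinzen's actual tooling), a feedback loop of this kind can be expressed as: analysts label examples, a model is trained on those labels, and the model's uncertain predictions are routed back to analysts, whose new labels feed the next round of training. The Python snippet below uses the scikit-learn library; the example texts, labels and confidence threshold are hypothetical placeholders.

```python
# Hypothetical sketch of a human-in-the-loop labelling cycle.
# All data, thresholds and routing rules here are illustrative assumptions,
# not a description of Kinzen's systems.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# 1. Analysts label a small set of posts (1 = harmful, 0 = benign).
labelled_texts = [
    "example of a violent threat",
    "example of an ordinary conversation",
    "example of a health misinformation claim",
    "example of a neutral news update",
]
labels = [1, 0, 1, 0]

# 2. A model is trained on the analyst-labelled data.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(labelled_texts, labels)

# 3. The model scores new content; low-confidence items are routed back to
#    analysts, whose decisions become labels for the next training round.
new_posts = ["example of an ambiguous post"]
scores = model.predict_proba(new_posts)[:, 1]
for post, score in zip(new_posts, scores):
    if 0.3 < score < 0.7:  # uncertain: send to a human analyst
        print(f"Send for analyst review: {post!r} (score={score:.2f})")
    else:                  # confident: handle automatically
        print(f"Automated decision: {post!r} (score={score:.2f})")
```
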

Our Mission

We empower the internet’s essential workers. We are the support team for the people who keep our online communities safe from harmful content.

Our Principles

Our principles start with the two core foundations of journalism: fairness and impartiality.
