YouTube announced Monday that 8.28 million videos were removed from its platform during Q4 2017, and that about 80% of those were initially flagged by automated AI systems. This is the first time YouTube has disclosed removal figures for a specific quarter, and the company plans to release quarterly reports going forward.
What this means for marketers: This is noteworthy for the industry, as Google begins to transparently address issues brands and advertisers have been raising for some time. Of the 8.28 million videos removed for violating platform content policies, just over 80% (6.7 million) were flagged by machines; of those, 76% were removed before receiving a single view. While AI content-policing systems are not the panacea Google (or Facebook) needs them to be, they are necessary. Current AI systems still struggle to read context the way humans can, but they will improve at detecting violative content over time. Even with their current flaws, deploying these systems is Google's best option given the scale and breadth of the content that needs policing. For brands and advertisers, while Google is moving toward greater transparency, continued pressure on the company to clean up its ecosystem remains necessary given its market dominance. It can be difficult to assess how aggressively the company is policing itself, but this is a first step in the right direction, and at least we now have a baseline.