YouTube removed about 5 million videos from its service for content policy violations in last year’s fourth quarter before any viewers watched them, it said in a new report that highlighted its response to pressure to better police its online community.
YouTube has been criticized by governments for not doing enough to delete extremist content, and by advertisers such as Procter & Gamble Co and Under Armour Inc, which briefly boycotted the service after unwittingly running ads alongside videos the companies deemed inappropriate.
YouTube said it still needed an in-house team of human reviewers to verify automated findings on an additional 1.6 million videos, which were removed only after some users had watched the clips.
The automated system failed to identify another 1.6 million videos that YouTube took down only after users, activist organizations, and governments reported them.
“They still have lots of work to do but they should be praised in the interim,” said Paul Barrett, deputy director of the New York University Stern Center for Business and Human Rights, who has followed YouTube’s moderation efforts.
Facebook reported that it had removed or put a warning label on 1.9 million pieces of extremist content related to Daesh or Al Qaeda in the first three months of the year, or about double the amount from the previous quarter.
Corralling problematic videos, whether through humans or machines, could help YouTube, a major driver of Google’s revenue, stave off regulation and a hit to sales. For now, analysts say demand for YouTube ads remains robust.