Google has said it will employ more than 10,000 staff as moderators to eliminate violent and abusive content from YouTube.
In March, the UK government pulled all its advertising from the video-sharing platform after concerns arose that it was being displayed alongside inappropriate content. Since then, YouTube has taken down more than 150,000 such videos, after Prime Minister Theresa May issued an ultimatum demanding that terrorist material be removed within two hours of upload.
Susan Wojcicki, CEO of YouTube, wrote in the Daily Telegraph that some users were exploiting the platform to “mislead, manipulate, harass or even harm”. According to her, Google’s algorithm tends to promote these kinds of extremist videos and needs tweaking.
According to Wojcicki, YouTube’s staff have reviewed nearly two million videos since June, in an effort to manually train the site’s machine-learning algorithm to identify inappropriate videos, enabling staff to remove such content at least five times faster than before. At the same time, the teams are working with child-safety organisations around the world to catch videos and accounts that exhibit predatory behaviour.