Google is planning to deploy 10,000 employees to moderate YouTube videos in 2018, following a recent outcry over the volume of extremist content and non-family-friendly videos and channels flooding the video-sharing site.

YouTube chief Susan Wojcicki said that some users were exploiting YouTube to “mislead, manipulate, harass or even harm”.

She said the website, owned by Google, had used “machine-learning” technology that could find extremist videos.

Ms Wojcicki also revealed that staff had reviewed nearly two million videos for violent extremist content, and that more than 150,000 of those videos had been removed since June.

She also disclosed that the company has begun training its algorithms to improve child safety on the platform and to better detect hate speech. To teach its algorithms which videos need to be removed and which can stay, though, it needs human help. That is why it aims to appoint as many as 10,000 people across Google to review content that might violate its policies.

Wojcicki said that the company was taking “aggressive action” on comments, using technology to help staff find and shut down hundreds of accounts and hundreds of thousands of comments.

Its teams will also “work closely with child safety organizations around the world to report predatory behaviour and accounts to the correct law enforcement agencies”.

In addition to enlisting the help of 10,000 Google employees, YouTube also plans to draw up stricter criteria for deciding which channels are eligible for advertising. At the moment, creators need at least 10,000 views to be able to earn ad money, but the platform will also expand its team of reviewers to vet channels and videos and “ensure ads are only running where they should.”

The video-sharing website also pledged to be far more transparent in 2018, publishing reports containing data on the flags it receives, along with the actions it takes to remove any video or comment that violates its policies.