YouTube Accelerates ‘Extreme Content’ Cleanup Efforts


Alphabet Inc’s YouTube said on Monday it plans to add more people next year to identify inappropriate content, as the company responds to criticism over extremist, violent and disturbing videos and comments.

YouTube has developed automated software to identify videos linked to extremism and now aims to do the same with clips that portray hate speech or are unsuitable for children. Uploaders whose videos are flagged by the software may become ineligible to earn advertising revenue.

But amid the stepped-up enforcement, the company has received complaints from video uploaders that the software is error-prone.

The goal is to bring the total number of people across Google working to address content that may violate its policies to over 10,000 in 2018, YouTube CEO Susan Wojcicki said in one of a pair of blog posts on Monday.

“We need an approach that does a better job determining which channels and videos should be eligible for advertising,” she said. “We’ve heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don’t demonetize videos by mistake.”

Wojcicki also said the company would take “aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether.”

The moves come as advertisers, regulators and advocacy groups express continuing concern over whether YouTube’s policing of its service is adequate.

YouTube recently updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap users in bubbles of misinformation and like-minded opinions.