It emerged that Google Europe publicly apologized for placing ads next to extremist content only after being “read the riot act” at a Whitehall summit.
Google’s representatives will meet Cabinet Office ministers again to set out further action to strengthen Google’s advertising policies and ensure that government ads do not appear next to extremist YouTube videos. In the meantime, government advertising remains suspended from YouTube until an action plan and a timetable to solve the problem are agreed.
The Commons home affairs select committee made clear that Google had still not committed to proactively searching its network for material from terrorist or illegal organizations. It pointed out that a recruitment video posted by a far-right group banned in the UK was still live on YouTube despite the MPs’ complaints. Moreover, the solicitor general said that a criminal offence of “recklessly disseminating this material” already existed in law.
Google had reportedly been read the riot act at a Downing Street meeting and told to draft an action plan and a timetable to ensure such content was removed within 24 hours. The committee is also considering the German approach of fining social media companies that host hate crime content.
The committee also argued that the government’s response was insufficient, because ministers must tell the company what action they want it to take, and that it should not be beyond the wit of Google to use its search engines or algorithms to remove this illegal content.
In the meantime, Twitter reported that it had suspended over 375,000 accounts for breaches associated with the promotion of terrorism during the previous six months. According to its Transparency Report, 636,000 accounts had been suspended since August 2015 for links to extremism, most of them shut down as a result of Twitter’s spam account-scanning technology.
Aside from the extremist-ads issue, the committee also heard evidence from the parliamentary authorities that tech giants were failing to react quickly or “sensibly” enough to reports of online abuse of MPs. In the UK, an “embedded team” was being put in place to monitor social media comments about MPs and to advise them when they were being targeted.