Twitter has found 49 accounts linked to a notorious Russian “troll factory” which were sending out messages about the Brexit vote during the 2016 referendum campaign.
The company’s UK head of public policy Nick Pickles told MPs that accounts linked to the St Petersburg-based Internet Research Agency amounted to less than 0.005 per cent of those tweeting about the referendum, and received “very low levels of engagement” from other users.
The announcement came in a hearing of the Commons Digital, Culture, Media and Sport Committee’s inquiry into fake news – taking evidence from YouTube, Facebook, Google and Twitter in Washington DC.
YouTube told the cross-party committee, chaired by Tory MP Damian Collins, that it had found no evidence of Russian sources using ads on its video-sharing service to attempt to interfere in the 2016 referendum.
Juniper Downs, YouTube’s global head of public policy, said that the company would be ready to help further investigations into possible Russian attempts to influence votes in Britain.
Meanwhile, Facebook said it had taken down “thousands” of fake accounts in the run-up to 2017 elections in the UK, France and Germany – although they were not necessarily aimed at spreading false information.
Pickles told the committee that Twitter had identified “a very small number of suspected Internet Research Agency-linked accounts”.
“Forty-nine such accounts were active during the referendum campaign, which represents less than 0.005 per cent of the total number of accounts that tweeted about the referendum,” he said.
“Those accounts collectively posted 942 tweets, representing less than 0.02 per cent of the total tweets posted about the referendum during the campaign. Those tweets collectively were retweeted 461 times and liked 637 times.”
This amounted to fewer than 10 retweets and 13 likes per account, which was “a very low level of engagement”, he said.
The cross-party panel of MPs was also using the trip across the Atlantic to meet senior senators who have been investigating allegations of Russian interference and collusion in the American presidential election won by Donald Trump.
YouTube has previously informed a US Senate committee of 18 channels it discovered which were linked to the Internet Research Agency “content farm”.
In September, Facebook bowed to pressure and provided the contents of 3,000 ads bought by a Russian agency to the US committee.
Asked whether any similar searches could be undertaken in relation to UK elections, Downs told the committee: “Absolutely, we are happy to co-operate with the UK Government’s investigations into whether there was any interference in elections in the UK.
“We have conducted a thorough investigation around the Brexit referendum and found no evidence of interference.
“We looked at all advertisements with any connection to Russia and we found no evidence of our services being used to interfere in the Brexit referendum and we are happy to co-operate with any further efforts.”
Facebook’s head of global policy management, Monika Bickert, told the MPs that the company had a strict policy requiring people to sign up under their real names, and that it took action to tackle fake profiles.
“In the run-up to the French election, the German election, the UK election we were using our technical tools to remove thousands of fake accounts,” she said.
“Not that those were necessarily related to spreading disinformation or to spreading information about the election, but they were fake accounts and we are using those technical tools to reduce the chance that they might be used to spread disinformation.”
Committee chairman Damian Collins last year criticised Facebook and Twitter over their replies to the committee’s investigation.
Under questioning from the MPs, Google vice-president of news Richard Gingras acknowledged that so-called “fake news” was harmful to users and to society.
He gave the example of bogus cancer cures found by patients searching the internet for information about their conditions.
Gingras said Google was “in the trust business” and felt “an extraordinary sense of responsibility” about the reliability of information highlighted by its search engine and news app.
“The loyalty of our users is based on their continued trust in us,” he said. “To the extent they don’t trust us, they will stop using our products and our business will collapse.
“We believe strongly in having an effective democracy, we believe strongly in supporting free expression and supporting a sustainable high-quality journalism eco-system to make sure that quality information is out there.”
Gingras acknowledged that Google’s auto-complete function, which suggests possible search phrases as users type, sometimes produces offensive suggestions in response to phrases like “Jews are…” – in part due to “malicious actors” seeking to game the system.
But he said Google was constantly working to correct them.
Downs acknowledged concerns over YouTube’s “up next” feature, which queues videos and plays them automatically; it has come under attack for suggesting inappropriate videos to users.
“We recognise that there is work to do on our recommendation engine in terms of making sure we are surfacing the right news content to our users,” she said.