Internet giant Google is making changes to its search engine designed to counter the spread of fake news.
The technology firm has revealed new guidelines for its testers, known as Raters, along with algorithm changes designed to identify more authoritative and recognised web pages and improve the quality of search rankings.
The firm has been criticised in recent months over “low-quality content” on its platform, including a high-profile incident in which a page denying the events of the Holocaust appeared at the top of search results on the subject.
New, in-depth feedback tools are also being added to the search engine’s auto-complete and featured snippets features so users can report offensive or inaccurate content.
Earlier this year the search engine had to remove a featured snippet that suggested former US president Barack Obama was planning a coup with China.
Ben Gomes, Google Search’s engineering vice president, said: “Search can always be improved.
“Today, in a world where tens of thousands of pages are coming online every minute of every day, there are new ways that people try to game the system.
“The most high profile of these issues is the phenomenon of ‘fake news’, where content on the web has contributed to the spread of blatantly misleading, low-quality, offensive or downright false information.
“While this problem is different from issues in the past, our goal remains the same – to provide people with access to relevant information from the most reliable sources available.
“And while we may not always get it right, we’re making good progress in tackling the problem. But in order to have long-term and impactful changes, more structural changes in Search are needed.”
The updated guidelines, introduced last month, “explicitly provide” examples of low-quality pages that should be flagged by Raters and demoted as a result when combined with the changes to Google’s algorithms.
Google uses what it calls Quality Raters – around 10,000 testers based around the world – who analyse and rate web pages to help Google’s algorithm better identify which pages to promote in a search.
The firm says it uses testers from all segments of its user base to avoid making political or biased decisions when determining ratings.
Gomes said: “We’ve adjusted our signals to help surface more authoritative pages and demote low-quality content, so that issues similar to the Holocaust denial results that we saw back in December are less likely to appear.”
Google said the direct feedback tools would enable users to give more detailed reasons why they flagged a web page as inaccurate or offensive.
Users would also be able to choose different categories to explain why they felt content was unhelpful, which would help Google improve its algorithms.
Gomes said Google was determined to stay “one step ahead” and that the changes put it “on a path to addressing this problem”.