Google has been accused of telling ‘mistruths’ by publishers who support a new standard for regulating how search engines access news websites.
World Association of Newspapers president Gavin O’Reilly, who chairs the publishers’ consortium behind the Automated Content Access Protocol (ACAP), called on Google to adopt the new standard after a Google executive told an industry conference last week that the existing standard was sufficient.
ACAP aims to extend the functionality of the existing standard, the Robots Exclusion Protocol, better known as robots.txt, by adding ways for publishers to embed terms and conditions for the use of their material in their websites.
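In practice, ACAP directives sit alongside conventional robots.txt rules in the same file. The fragment below is an illustrative sketch only — the directive names and paths are hypothetical examples of ACAP’s extended syntax, not a definitive specification:

```text
# Standard robots.txt rules, understood by all crawlers
User-agent: *
Disallow: /admin/

# Hypothetical ACAP extensions (illustrative names and paths):
# where robots.txt can only allow or block crawling, ACAP-style
# directives aim to express usage terms, e.g. permitting crawling
# of a section while disallowing indexing of it
ACAP-crawler: *
ACAP-allow-crawl: /news/
ACAP-disallow-index: /archive/
```

A crawler that does not recognise the ACAP lines simply ignores them, which is why publishers can add the directives without breaking existing robots.txt behaviour — and why adoption still depends on search engines pledging to honour them.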
O’Reilly said: ‘It’s rather strange for Google to be telling publishers what they should think about robots.txt, when publishers worldwide – across all sectors – have already and clearly told Google that they fundamentally disagree.
‘If Google’s reason for not (apparently) supporting ACAP is built on its own commercial self-interest, then it should say so, and not glibly throw mistruths about.’
O’Reilly’s comments came in response to those made by Google’s European head of content partnerships, Rob Jonas, who said last week that ‘the general view’ within the search company was that robots.txt was best for ‘what publishers need to do’.
Since officially launching in November, ACAP has been encouraging both publishers and search engines to adopt the new standard.
Both Times Online and The Independent’s website have incorporated ACAP commands into their robots.txt files.
Newspaper websites in 15 other countries – including Australia, Belgium, Denmark and Germany – have also adopted the standard.
So far none of the three major search engines – Google, Yahoo! and MSN – has pledged to honour the additional instructions that ACAP allows publishers to embed in their sites.