Technology.am (Apr. 7, 2009) — Once again, traditional media outlets The Wall Street Journal and The Associated Press are lashing out at Google and other sites that aggregate headlines from news sources and post them on their own pages.
“There is no doubt that certain Web sites are best described as parasites in the intestines of the Internet. It’s certainly true that readers have been socialized, wrongly I believe, that much content should be free. And there is no doubt that’s in the interest of aggregators like Google, which have profited from that mistaken perception. And they have little incentive to recognize the value they are trading on that’s created by others,” Robert Thomson, The Wall Street Journal’s editor, said on Monday.
William Dean Singleton, chairman of the AP, said, “We can no longer stand by and watch others walk off with our work under misguided legal theories. We are mad as hell, and we are not going to take it anymore.”
Google has long said that it gives news site owners a way to block the search engine from crawling their sites and indexing their headlines. “Those who publish on the Web have a lot of control over which pages should appear in search results,” Google said in a blog post. “The key is a simple file called robots.txt that has been an industry standard for many years. It lets a site owner control how search engines access their Web site.”
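For readers unfamiliar with the mechanism Google is describing, here is a minimal, hypothetical robots.txt a publisher might place at the root of its site to tell Google's crawler to stay out; the directory path is illustrative, not drawn from any actual publisher's configuration:

```
# Hypothetical example: ask Google's crawler not to index this site's news section
User-agent: Googlebot
Disallow: /news/

# All other crawlers remain unrestricted
User-agent: *
Disallow:
```

Compliance is voluntary on the crawler's part, though major search engines, including Google, have long honored the convention.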
Regardless, the statements from two stalwart print publications raise the question of whether Google will be forced to open a new front against another group of copyright owners. The search engine is currently defending itself against a copyright-infringement lawsuit filed in 2007 by Viacom, parent company of MTV and Paramount Pictures.
Image credit: Dvice.