Cho, Junghoo and Garcia-Molina, Hector (2003) Effective Page Refresh Policies for Web Crawlers. Technical Report. Stanford InfoLab.
In this paper we study how to keep local copies of remote data sources "fresh" when the source data is updated autonomously and independently. In particular, we study the problem of Web crawlers that maintain local copies of remote Web pages for Web search engines. In this context, remote data sources (Web sites) do not notify the copies (Web crawlers) of changes, so we must periodically poll the sources to keep the copies up-to-date. Since polling the sources takes significant time and resources, it is very difficult to keep the copies completely up-to-date. This paper proposes various refresh policies and studies their effectiveness. We first formalize the notion of "freshness" of copied data by defining two freshness metrics, and we propose a Poisson process as the change model of data sources. Based on this framework, we examine the effectiveness of the proposed refresh policies analytically and experimentally. We show that a Poisson process is a good model for describing changes to Web pages, and that our proposed refresh policies improve the "freshness" of data very significantly. In certain cases, we obtained orders-of-magnitude improvements over existing policies.
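To make the framework concrete, the sketch below simulates one local copy whose source changes at Poisson-distributed times with rate λ and is refreshed at a fixed interval, measuring the 0/1 "is the copy identical to the source" freshness averaged over time. This is a minimal illustration of the setup described in the abstract, not the paper's experiments; the function names, the fixed-interval policy, and the closed-form comparison are assumptions of this sketch.

```python
import math
import random

def simulate_freshness(change_rate, refresh_interval, horizon=1000.0, seed=0):
    """Estimate time-averaged freshness of one local copy.

    The source changes at Poisson times with rate `change_rate`; the
    copy is refreshed every `refresh_interval` time units. Freshness
    at time t is 1 if the copy matches the source, 0 otherwise
    (an illustrative 0/1 metric, not the paper's exact notation).
    """
    rng = random.Random(seed)
    t = 0.0
    next_change = rng.expovariate(change_rate)  # first change time
    fresh_time = 0.0
    while t < horizon:
        refresh_at = t + refresh_interval
        if next_change >= refresh_at:
            # no change this interval: copy stays fresh throughout
            fresh_time += refresh_interval
        else:
            # fresh only until the first change, then stale until refresh
            fresh_time += max(0.0, next_change - t)
            while next_change < refresh_at:
                next_change += rng.expovariate(change_rate)
        t = refresh_at  # refresh: copy matches the source again
    return fresh_time / horizon

def expected_freshness(change_rate, refresh_interval):
    """Closed-form expected freshness under a Poisson change model
    with periodic refresh: (1 - e^{-lambda*I}) / (lambda*I)."""
    x = change_rate * refresh_interval
    return (1.0 - math.exp(-x)) / x
```

Under the Poisson assumption the simulation should approach the closed form; e.g. with λ = 1 change per unit time and a refresh every unit, expected freshness is (1 − e⁻¹) ≈ 0.632, and faster-changing pages (larger λ) yield lower freshness for the same refresh rate.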
Item Type: Techreport (Technical Report)
Uncontrolled Keywords: Web Crawler, Page Refresh
Subjects: Computer Science > Databases and the Web
Deposited By: Import Account
Deposited On: 10 Jul 2003 17:00
Last Modified: 24 Dec 2008 09:09