People typically surf the web using search engines. A crawler (a component of a search engine) continuously traverses the web to keep the engine's collection as fresh as possible. An efficient crawler must address issues such as unnecessary load imposed on web servers, parallelism of the crawling process, freshness of the discovered web content, and the revisit frequency of web pages. When pages change very rapidly, the crawler needs to visit them as frequently as possible. Today, when the web has grown very large, these revisits not only engage network traffic for a longer time but the...