- Traversal crawlers:
A crawler is software that automatically visits web sites and collects web documents from them.
It follows the hyperlinks on each page to discover further relevant pages.
The web graph is usually traversed with a depth-first or breadth-first search strategy.
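The traversal described above can be sketched as a breadth-first search over a link graph. This is a minimal illustration, not a production crawler: the `site` dictionary and the `get_links` callback are hypothetical stand-ins for fetching a page over HTTP and parsing its `<a href>` tags.

```python
from collections import deque

def crawl_bfs(start_url, get_links, max_pages=100):
    """Breadth-first traversal: visit pages level by level.

    get_links(url) returns the URLs linked from that page; a real
    crawler would fetch the page and extract its anchor tags here.
    """
    visited = set()
    queue = deque([start_url])
    order = []
    while queue and len(order) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue          # skip pages already collected
        visited.add(url)
        order.append(url)
        queue.extend(get_links(url))  # enqueue outgoing links
    return order

# Hypothetical in-memory "web" used in place of live HTTP fetches.
site = {
    "/": ["/a", "/b"],
    "/a": ["/c"],
    "/b": ["/c"],
    "/c": [],
}
print(crawl_bfs("/", lambda u: site.get(u, [])))
# → ['/', '/a', '/b', '/c']
```

Swapping the `deque.popleft()` for a stack `pop()` would turn the same sketch into a depth-first traversal, the other strategy mentioned above.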