Glossary

Crawler

A web crawler, also known as a spider or spiderbot, is a computer program that traverses the web by downloading pages and extracting data from them. Crawlers are used for purposes such as building search engine indexes, gathering data, and monitoring websites for updates. A crawler is typically seeded with a set of starting URLs and then follows the links it discovers, harvesting information in an automated fashion.
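As a rough illustration of that seed-and-follow loop, here is a minimal sketch of a crawler using only Python's standard library. The seed URL, page budget, and class names are illustrative assumptions, not part of any particular crawler; real crawlers also respect robots.txt, rate limits, and politeness policies, which are omitted here.

```python
# Minimal breadth-first crawler sketch (illustrative only).
# Assumptions: seed URL and max_pages are arbitrary; no robots.txt
# handling or rate limiting is implemented.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a downloaded page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Download a page, extract its links, and queue unseen URLs
    until the page budget is exhausted (breadth-first traversal)."""
    queue = deque([seed_url])
    seen = {seed_url}
    while queue and max_pages > 0:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to download
        max_pages -= 1
        print(f"Fetched {url} ({len(html)} bytes)")
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)


crawl("https://example.com")
```

The `seen` set prevents the crawler from revisiting pages it has already queued, which is what keeps the traversal from looping forever on sites whose pages link to each other.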

Crawler on Wikipedia