The Generic Definition of “Crawler”
A crawler (also known in SEO jargon as a “spider” or “web crawler”) is software that automatically scans and analyzes documents stored in a collection, such as a database.
In the context of search engines, a crawler is a program that scans and retrieves accessible documents, typically those made publicly available on the internet.
The SEO Definition of “Crawler”
In SEO, a web crawler is the bot a search engine uses to explore the web; its main task is to discover and scan publicly accessible pages.
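To make the idea concrete, here is a minimal Python sketch of that crawling loop: fetch an accessible page, extract its links, and queue them for scanning. It is an illustration only, not how any actual search engine crawler works; the start URL and the page limit are placeholder assumptions.

```python
# Minimal illustrative crawler sketch (not a real search engine crawler):
# fetch a page, extract its links, and visit them breadth-first up to a limit.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href values of <a> tags found in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    """Visit up to max_pages publicly accessible pages, starting at start_url."""
    queue = deque([start_url])
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that cannot be retrieved
        visited.add(url)
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            queue.append(urljoin(url, link))  # resolve relative links
    return visited


if __name__ == "__main__":
    # "https://example.com" is a placeholder start URL for the sketch.
    print(crawl("https://example.com"))
```

Real crawlers add many layers on top of this loop, such as respecting robots.txt, rate limiting, and prioritizing which URLs to visit next, but the core cycle of fetching pages and following their links is the same.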