A crawler is an automated program that search engines use to visit websites. It reads each site's pages and the information on them to create entries for the search engine's index. A crawler revisits sites that webmasters flag as new or updated so the index stays current. Crawlers are also called spiders or bots. The name comes from the way the program crawls through a site one page at a time, following each link until every linked page has been read.
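Below is a minimal sketch of that crawl loop in Python, using only the standard library. It visits a page, records its contents as a stand-in for a real index entry, extracts the page's links, and queues any unseen ones until every linked page has been read. The start URL, the page cap, and the same-site restriction are illustrative assumptions, not details of any real search engine's crawler.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=50):
    """Breadth-first crawl: read a page, queue its links, repeat
    until every discovered page is read (or max_pages is hit)."""
    seen = {start_url}
    queue = deque([start_url])
    index = {}  # url -> page text; stands in for real index entries

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load

        index[url] = html  # "create an entry" for the index

        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # Stay on the same site (an assumption for this sketch)
            if (urlparse(absolute).netloc == urlparse(start_url).netloc
                    and absolute not in seen):
                seen.add(absolute)
                queue.append(absolute)

    return index

if __name__ == "__main__":
    pages = crawl("https://example.com")  # hypothetical start URL
    print(f"Indexed {len(pages)} pages")
```

A production crawler would add politeness features this sketch omits, such as honoring robots.txt and rate-limiting requests, but the visit-read-follow cycle is the same.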