What is the difference between crawlers, spiders and robots in SEO

Started by EthanHawke, 01-18-2016, 03:54:14


EthanHawke (Topic starter)

Hi

I was asked an interview question about the difference between crawlers, spiders and robots in SEO. I was not able to answer it, because as far as I know there is no real difference between the three: all of them are search engine crawlers that scan the keywords and links on a website for indexing. So if anyone knows a little bit about the difference, please reply.


RH-Calvin

They are all the same thing: automated search engine programs that read through a webpage's source code and store a cached copy of the page.


fix.97

Spider - a browser-like program that downloads web pages.

Crawler - a program that automatically follows all of the links on each web page.

Robots - automated computer programs that visit websites and perform predefined tasks. They are guided by search engine algorithms and can perform different tasks instead of just one crawling task. They can combine the tasks of a crawler and a spider and help in indexing and ranking websites on a particular search engine.
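To make the "follows all of the links" part concrete, here is a minimal sketch in Python of the link-discovery step a crawler performs: parse a page's HTML and collect every anchor href so it can be queued for the next fetch. The `LinkExtractor` class name and the sample HTML are made up for illustration; real crawlers add URL normalization, deduplication and politeness rules on top of this.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag -- the way a crawler
    discovers which pages to visit next. (Illustrative sketch.)"""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page content standing in for a fetched web page.
page = """
<html><body>
  <a href="/about.html">About</a>
  <a href="https://example.com/contact">Contact</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # the URLs a crawler would queue next
```

In a full crawler, each discovered URL would be fetched in turn and fed back through the same extractor, which is exactly the "automatically follows all of the links" behaviour described above.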

TomClarke

Spider - scans and judges your website content.
Crawler - crawls your content and indexes it in search results according to inbound links.
Both techniques help Google judge the importance of a website in terms of ranking.
Robots.txt - a file placed in a website's root directory that tells Google which pages not to crawl, so those pages stay hidden from Google's crawling.
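Since robots.txt came up, here is a small sketch of how a compliant robot interprets one, using Python's standard `urllib.robotparser`. The robots.txt content below is a hypothetical example, not from any real site; `Disallow` marks paths the crawler should skip.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an example site: block /private/,
# allow everything else, for every user agent.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/index.html"))      # True
print(rp.can_fetch("*", "https://example.com/private/x.html"))  # False
```

Note that robots.txt is only a request: well-behaved search engine robots honour it, but it does not technically prevent access to the pages.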

MVMInfotech18

Robots are programs that follow links on websites, gathering data for search engine indexes. They are considered robots rather than ordinary programs because they are somewhat independent: no programmer has told them which pages to find. Other names for these programs are spider or crawler (because they follow links on the Web), worm, wanderer, or gatherer.