What is the difference between crawlers, spiders and robots in SEO?


Offline EthanHawke (Topic starter)

I was asked in an interview what the difference is between crawlers, spiders and robots in SEO, but I could not answer it, because as far as I know there is no real difference between the three: all of them are search engine crawlers that scan the keywords and links on a website for indexing. If anyone knows a bit more about the difference, please reply.

Offline RH-Calvin

They are all essentially the same: automated search engine programs that read through a webpage's source code and store a cached copy of the page.

Offline fix.97

Spider - a browser-like program that downloads web pages.

Crawler - a program that automatically follows all of the links on each web page.

Robots - automated computer programs that visit websites and perform predefined tasks. They are guided by search engine algorithms and can perform several different tasks rather than a single crawling task. They can combine the jobs of a crawler and a spider, and they help with indexing and ranking websites on a particular search engine.
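To make the spider/crawler split concrete, here is a rough Python sketch (standard library only, and nothing like how a real search engine is built): fetch_page() plays the "spider" role of downloading one page, and crawl() plays the "crawler" role of following the links it finds. The seed URL and page limit are just placeholders.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def fetch_page(url):
    # "Spider" step: download the raw HTML of a single page.
    with urlopen(url, timeout=10) as response:
        return response.read().decode("utf-8", errors="replace")


def crawl(seed_url, max_pages=10):
    # "Crawler" step: keep following links found on the pages fetched so far.
    seen, queue = set(), [seed_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        page = fetch_page(url)
        extractor = LinkExtractor()
        extractor.feed(page)
        queue.extend(urljoin(url, link) for link in extractor.links)
    return seen


if __name__ == "__main__":
    # Placeholder seed URL; a polite crawler would also respect robots.txt.
    print(crawl("https://example.com"))
```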

Offline ShreyaKoushik

Crawling - Crawling takes place when unique URIs, discovered through valid links on other web pages, are successfully fetched.
Indexing - Indexing takes place after the crawled URIs are processed. Note that many URIs may be crawled, but fewer of them will have their content processed through indexing.

Spider - A spider is a program run by a search engine to build a summary of a website's content (a content index). Spiders create a text-based summary of the content and an address (URL) for each webpage.
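As a rough illustration of the crawled-versus-indexed point (and of the text-based summary a spider keeps), here is a small Python sketch. The HTML snippets and the "has visible text" filter are made up; they only stand in for the far more complex rules a real search engine applies when deciding what to index.

```python
import re

# Pages that have been crawled (fetched); the second one has no visible text.
crawled = {
    "https://example.com/a": "<html><body><h1>Blue widgets</h1>"
                             "<p>Buy blue widgets here.</p></body></html>",
    "https://example.com/b": "<html><body></body></html>",
}


def text_summary(html, limit=80):
    """Strip tags and return the first `limit` characters of visible text."""
    text = re.sub(r"<[^>]+>", " ", html)
    return " ".join(text.split())[:limit]


index = {}
for url, html in crawled.items():
    summary = text_summary(html)
    if summary:  # only pages with some visible text make it into the toy index
        index[url] = summary

# Fewer pages end up indexed than were crawled.
print(f"crawled: {len(crawled)} pages, indexed: {len(index)} pages")
print(index)
```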

Offline TomClarke

Spider - scans and judges your website content.
Crawler - crawls your content and indexes it in search results according to inbound links.
Both help Google judge the importance of a website in terms of ranking.
Robots.txt - a plain text file placed at the root of a website that tells Google which pages not to crawl, i.e. it is used to keep certain pages out of Google's crawl.
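For the robots.txt part, here is a small example using Python's standard urllib.robotparser. The rules and URLs are made up; on a real site the file lives at the root, e.g. https://example.com/robots.txt, and a well-behaved crawler checks it before fetching anything.

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt: block everything under /private/, allow the rest.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/public-page"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
```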

