Google Spider

Started by vedwati01, 02-09-2015, 04:58:09


vedwati01 (Topic starter)

Hello,

What is the Google spider? How does it work?

Thanks


Worldeyeglasses

Each major search engine has one or more spiders. These show up in server logs as unique user agents. "Polite" spiders identify their parent search engine; for example, the Google spider identifies itself as "Googlebot".
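To make that concrete, here is a small illustrative sketch of matching a crawler's user agent string, as it would appear in a server log. The crawler tokens ("Googlebot", "bingbot") are the real identifiers those bots use, but the function name and the log line are made up for this example.

```python
# Hypothetical sketch: naming the search engine behind a user agent string.
# "Googlebot" and "bingbot" are the real tokens; everything else is invented.
KNOWN_CRAWLERS = {
    "googlebot": "Google",
    "bingbot": "Bing",
}

def identify_crawler(user_agent):
    """Return the parent search engine if the user agent names a known crawler."""
    ua = user_agent.lower()
    for token, engine in KNOWN_CRAWLERS.items():
        if token in ua:
            return engine
    return None

# A typical Googlebot user agent as logged by a web server:
ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
print(identify_crawler(ua))  # Google
```

Note that user agent strings can be spoofed; verifying a visitor really is Googlebot takes a reverse DNS lookup, which this sketch does not attempt.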


kavyasharma561

Web spiders, or web crawlers, are automated programs popularized by search engines such as Google. They visit the pages on your site, then follow the links found on those pages to discover and visit still more pages.

qx_1789


Search engine "bots" index websites based on their content and links. If you change the content of a web page, the bot will detect the changes on its next crawl and re-index the page. Examples include Googlebot and Bingbot.

morganlong

Googlebot is known as the Google spider. Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. The program that does the fetching is called a spider, robot, bot, or, in Google's case, Googlebot.


RH-Calvin

The Google spider is a search engine program that reads through your web pages and indexes them in the search engine's database.

Talukdar

A spider is a program that visits websites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the web all have such a program, which is also known as a "crawler" or a "bot". Spiders are typically programmed to visit sites that have been submitted by their owners as new or updated. Entire sites or specific pages can be selectively visited and indexed. Spiders are called spiders because they usually visit many sites in parallel, their "legs" spanning a large area of the "web". Spiders can crawl through a site's pages in several ways. One way is to follow all the hypertext links in each page until all the pages have been read.
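The "follow all the hypertext links" idea above can be sketched with nothing but Python's standard library. A real crawler fetches pages over HTTP; to keep this self-contained, the "web" here is a hypothetical in-memory dict mapping URLs to HTML, and the crawl is a simple breadth-first traversal of the link graph.

```python
# Illustrative sketch only: an in-memory "web" stands in for real HTTP fetches.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(pages, start):
    """Breadth-first crawl: visit a page, queue every link not yet seen."""
    seen, queue, order = {start}, [start], []
    while queue:
        url = queue.pop(0)
        order.append(url)
        parser = LinkExtractor()
        parser.feed(pages.get(url, ""))
        for link in parser.links:
            if link in pages and link not in seen:
                seen.add(link)
                queue.append(link)
    return order

# A made-up four-page site:
pages = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/blog/post-1">Post</a>',
    "/blog/post-1": "",
}
print(crawl(pages, "/"))  # ['/', '/about', '/blog', '/blog/post-1']
```

Every page ends up visited exactly once because the `seen` set prevents re-queuing, which is the same reason real crawlers keep a record of URLs they have already fetched.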

SaanviRao

The Google spider is the crawling program that visits every page of a website that is linked from the main domain. Unless a web page is reachable through links from the site's main URL, it won't be indexed.


TomClarke

Google Spider is the crawler Google uses to keep its index updated. All search engines run crawlers, which show up in server logs as connections to websites.

Spider-friendly sites are ones with relevant, quality links. Googlebot only follows links; don't expect the bot to fill in login details. If a page cannot be reached through a link, the bot will not see it, let alone crawl it. There is no fixed schedule for the spider to crawl your website, and it does not happen in real time. Understand that the full details of the Google algorithm are private, known only to Google's team.
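Related to what a crawler will and won't visit: well-behaved crawlers also respect a site's robots.txt rules before fetching a page. Python's standard library ships `urllib.robotparser` for evaluating those rules; the rules and URLs below are invented for illustration.

```python
# Sketch of robots.txt evaluation with the stdlib parser.
# The rules and example.com URLs are hypothetical.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("Googlebot", "https://example.com/public/page"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
```

In practice a crawler would load the real file with `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()`, then check `can_fetch` before requesting each URL.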