What's the largest page size that Google's spider will crawl?

Author Topic: What's the largest page size that Google's spider will crawl?  (Read 4424 times)

Offline ORLOVATopic starter

  • Trade Count: (0)
  • Full Member
  • ***
  • Thank You 4
  • Posts: 127
  • Karma: 0
Hi guys,

Which answer would you choose?
a) No set limit exists - Google may crawl very large pages if it believes them to be worthwhile
b) 1000KB
c) 2GB
d) 100KB


Offline checkifsccode

  • Trade Count: (0)
  • Semi-Newbie
  • *
  • Thank You 1
  • Posts: 14
  • Karma: 0
    • Check IFSC Code
Google's bots will crawl pages of any size, but if a page is too big, crawling it takes a long time and the bot may leave your site for the next one.
My advice is to keep your pages as small as possible, so that Googlebot can crawl your site easily, index all your pages, and show them in the search results.

I would choose a), though Google usually indexes only the first few hundred KB of a page.
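If you want to act on the "keep it small" advice, a quick way is to measure the raw HTML size of a page. This is a minimal sketch, not an official tool; the 100 KB budget below is just option d) from this thread used as an example threshold, not a documented Google cutoff.

```python
# Minimal sketch: measure a page's raw HTML size in KB and compare it
# against a size budget. The 100 KB default is only the answer option
# from this thread, used here as an illustrative threshold.

def page_size_kb(html: str) -> float:
    """Return the size of the raw HTML in kilobytes (UTF-8 bytes)."""
    return len(html.encode("utf-8")) / 1024


def is_lean(html: str, budget_kb: float = 100.0) -> bool:
    """True if the page fits within the given size budget."""
    return page_size_kb(html) <= budget_kb


html = "<html><body>" + "<p>hello</p>" * 50 + "</body></html>"
print(page_size_kb(html), is_lean(html))
```

In practice you would run this over the HTML you serve (e.g. a saved copy of the rendered page) rather than a hand-built string.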

Offline Medventa

  • Trade Count: (0)
  • Semi-Newbie
  • *
  • Thank You 1
  • Posts: 12
  • Karma: 0
I believe that the "correct" answer is definitely a) No set limit exists. If Google deems your website to be worth the time, it will crawl it, no matter the size.
Just make sure that your page is good in the eyes of the Google bots - content and high-quality backlinks are two good factors to consider.

Offline fayeseom

  • Trade Count: (0)
  • Jr. Member
  • **
  • Thank You 7
  • Posts: 83
  • Karma: 1
  • Gender: Female
    • Faye
Google crawls websites of any size. However, very large sites tend not to be crawled all at once. It's kind of like eating a large meal: you have some now and save some for later. Google indexes parts of large sites during its routine spidering process, then returns later to index the rest.

Offline lishmaliny

  • Trade Count: (0)
  • Jr. Member
  • **
  • Thank You 1
  • Posts: 85
  • Karma: 0
According to Google's documentation, all files larger than 30 MB will be ignored entirely. Google will index up to 2.5 MB of an HTML file; non-HTML files are converted to HTML.


Offline eSage IT

  • Trade Count: (0)
  • Semi-Newbie
  • *
  • Thank You 0
  • Posts: 35
  • Karma: 0
  • Gender: Male
    • eCommerce Marketing Agency
According to Google's documentation, files larger than 30 MB will not be crawled, and the crawlers will index up to 2.5 MB of an HTML file. So I would suggest keeping file sizes small to make crawling more effective.
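To see what a cap like that means in practice, here is a toy sketch of the truncation behaviour described above: if a crawler keeps only the first 2.5 MB of an HTML file, everything past that byte offset is invisible to search. The 2.5 MB figure is the one quoted in this thread, not verified against Google's current documentation.

```python
# Sketch of size-limited indexing: keep only the first LIMIT_BYTES of
# a document, the way a crawler with a hard cap would. The 2.5 MB
# limit is the value quoted in this thread, used here for illustration.

LIMIT_BYTES = int(2.5 * 1024 * 1024)  # 2.5 MB, per the claim above


def indexable_portion(html_bytes: bytes, limit: int = LIMIT_BYTES) -> bytes:
    """Return only the prefix a size-limited crawler would keep."""
    return html_bytes[:limit]


doc = b"<html>" + b"x" * (3 * 1024 * 1024)  # a page just over 3 MB
kept = indexable_portion(doc)
print(len(kept))  # content after the cap is simply dropped
```

The practical takeaway matches the advice in the post: put your important content early in the HTML, well inside any such cap.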

Offline inoxindia

  • Trade Count: (0)
  • Semi-Newbie
  • *
  • Thank You 0
  • Posts: 14
  • Karma: 0
A web crawler (also known as a search engine spider, searchbot, or robot) is a program used by a search engine to find what is new on the Internet. This process is called crawling.

A web crawler begins by crawling the pages of a website. It then indexes the words and content found on that site, and afterwards visits the links available on it.
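The crawl-then-index loop described above can be sketched in a few lines. This toy version runs against an in-memory "web" (a dict mapping URL to HTML) so it needs no network; a real crawler would fetch over HTTP, respect robots.txt, and parse HTML properly instead of using regexes. All URLs and page contents here are made up for illustration.

```python
# Toy breadth-first crawler over an in-memory "web": for each page,
# index its words, then follow its links. Illustrates control flow
# only; a real crawler fetches over HTTP and respects robots.txt.
import re
from collections import deque

# Hypothetical two-page site used as the fake web.
FAKE_WEB = {
    "/": '<a href="/about">about</a> welcome home',
    "/about": '<a href="/">home</a> about us',
}


def crawl(start: str, web: dict) -> dict:
    """Crawl from `start`, returning {url: [words found on that page]}."""
    index, queue, seen = {}, deque([start]), {start}
    while queue:
        url = queue.popleft()
        html = web.get(url, "")
        # "Index" the page: strip tags, then record its words.
        text = re.sub(r"<[^>]+>", " ", html)
        index[url] = re.findall(r"[a-z]+", text)
        # Follow links found on the page.
        for link in re.findall(r'href="([^"]+)"', html):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index


print(crawl("/", FAKE_WEB))
```

The `seen` set is what keeps the crawler from looping forever on pages that link back to each other, which is exactly the situation in this two-page example.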

 
