My site in Yahoo SE

Started by seo-guy, 06-11-2011, 05:44:44

Previous topic - Next topic

seo-guy (Topic starter)

I notice that only the home page of my site is indexed by Yahoo, but not the other pages. Why is that?


takeshiro

#1
There can be several reasons why the other pages of your site are not indexed by Yahoo or other search engines. Here are some possibilities:

Crawl Errors: Search engines might not be able to crawl your other pages due to errors. Check the crawl error reports in Bing Webmaster Tools (Bing powers Yahoo Search) to see whether your site has any crawl issues.

Robots.txt: This file might be blocking search engines from accessing certain parts of your site. Make sure that your Robots.txt file is not preventing Yahoo's crawlers from accessing and indexing your other pages.
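As a quick check, look for overly broad Disallow rules. A hypothetical robots.txt like the following (directory names made up for illustration) would keep all crawlers out of everything under /blog/, so those pages could never be indexed:

```
User-agent: *
Disallow: /private/
Disallow: /blog/
```

Removing the /blog/ line (or narrowing it to a specific subdirectory) would let crawlers reach those pages again.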

Noindex Tag: The pages might contain a "noindex" meta tag that tells search engines not to index that particular page.

Poorly Designed Navigation: If your website's navigation isn't clear, or pages are buried too deep and not linked from other pages, search engines may have a hard time finding them.

Lack of Valuable Content: Search engines prioritize indexing pages that contain valuable content. If your other pages do not have meaningful, valuable content, they might not be picked up.

New or Recently Updated Website: If your website is new or you've recently added new pages, it might just be a matter of time before Yahoo discovers and indexes these pages.

Sitemap Issues: If your XML sitemap is outdated or inaccurate, search engines might have trouble finding all your pages. Make sure your sitemap is current and submitted for Yahoo via Bing Webmaster Tools.
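For reference, a minimal sitemap looks like the sketch below (example.com is a placeholder domain). Every page you want indexed should have its own url entry, with lastmod kept up to date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
    <lastmod>2011-05-20</lastmod>
  </url>
</urlset>
```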

Duplicate Content: If the content on your other pages is very similar to content found elsewhere, either on your own site or on the rest of the web, it may be considered duplicate content. Search engines will often only index one version of duplicate content to avoid redundancy in their search results.

Slow Page Load Speeds: If your web pages take a long time to load, crawlers might abandon the attempt to index them. Improving your site's load speed can potentially help with this issue.

Penalties: If your site has engaged in spammy or black-hat SEO practices, it may have received penalties that cause certain pages, or in some cases the entire site, to be deranked or deindexed.

Canonical URLs: You might have told search engines that a certain URL is canonical or the 'master copy,' and thus other versions of the page aren't being indexed.
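You can check for this in a page's source. A canonical tag points search engines at the "master copy"; if a page's canonical tag points at a different URL (example.com is a placeholder here), only that other URL will tend to be indexed:

```html
<link rel="canonical" href="http://www.example.com/master-copy-page">
```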

Changes not yet crawled: If you've recently made changes to the page or site, it may not yet have been crawled and indexed by search engines. Search engines do not update their indexes in real time and it may take some time for new pages or content to be picked up.

URL Parameters: Some search engines struggle with indexing URLs that have multiple parameters or are overly complex.

Server Issues: If your server is frequently down or slow when the spiders come to crawl your site, these pages might not be indexed.

Poor Internal Linking: Search engine spiders find new pages by following links from other pages. If a page is not well linked to from other parts of your site, it may not be discovered.

Mobile Unfriendliness: Search engines increasingly prioritize mobile-friendly pages. If your pages aren't optimized for mobile, they may be downgraded or ignored by search engines.
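A basic first step, if your pages aren't mobile-friendly, is adding the viewport meta tag to each page's head, which tells mobile browsers to scale the page to the device width:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```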

Content is Seen as Automated: If your website content is spun or appears automated, search engines might not index your website to prevent low quality content from appearing in search results.

Temporary Blockage: If a site was down or inaccessible when search engines tried to crawl it, it could miss the opportunity to be indexed. Regular hosting maintenance or server issues could result in such temporary blockages.

Geo-Blocking: If your website server actively blocks certain countries or IP addresses, search engines might not be able to crawl or index your site.

Orphan Pages: These are pages without any internal links pointing towards them, making them undiscoverable for search engine bots. To resolve this issue, make sure these pages are linked from at least one other page on your website.
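If your site is small, you can sketch an orphan-page check yourself. The snippet below (URLs and HTML are made-up examples) collects every internally linked URL from each page's HTML, then reports any page that nothing else links to:

```python
# Sketch: find "orphan" pages that no other page links to.
# Assumes you already have each page's HTML in memory, keyed by its
# site-relative URL; in practice you would fetch the pages first.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag in a document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_orphans(pages):
    """pages: dict mapping URL path -> HTML source. Returns orphan paths."""
    linked = set()
    for url, html in pages.items():
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            if href != url:          # a page linking to itself doesn't count
                linked.add(href)
    return sorted(set(pages) - linked)

pages = {
    "/": '<a href="/about">About</a>',
    "/about": '<a href="/">Home</a>',
    "/old-promo": "<p>No links point here.</p>",
}
print(find_orphans(pages))  # → ['/old-promo']
```

Any path the script prints should get at least one internal link from another page, as described above. (Note the home page itself will show up as an "orphan" unless some page links back to it.)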

Cloaked Content: Some websites show different content to users than to search engines. This technique, called cloaking, is against search engines' webmaster guidelines (Google's and Bing's alike) and could result in the website not being indexed.