16 Problems with Search Engines Indexing Your Site

Monday 24 March 2008

Your website looks properly optimized, yet nothing significant happens. Here are some of the reasons why your site may run into obstacles while being indexed by the various search engines.


Indexing time

If your site is not indexed yet (because it is very young), you must allow for the time search engines take to index it. The delay before an engine picks up your site is usually stated on each engine's submission pages, but that figure is not always accurate or up to date. On average, indexing can take up to eight weeks, depending on the engine. Some engines, such as AltaVista or Inktomi, offer paid programs if you want to be indexed faster.

Already indexed

Search engines will not notify you that you have been indexed; it is up to you to find out. How to check whether a particular page or domain name has been taken into account varies from one engine to another. Do not conclude that you have not been indexed just because you searched for a keyword and did not see yourself in the first few pages of results. You may very well be indexed, with your site simply appearing far down the results.
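
On Google, for instance, the site: operator lists the pages indexed for a domain (the domain below is a placeholder); other engines offer similar operators or URL-lookup forms, so check each engine's help pages.

```
site:www.example.com
```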


Links from the home page

Some engines are known not to index pages that cannot be reached from the home page. A rumor to this effect circulated about HotBot. Think of your internal links as a set of paths from one page to the others. If there is no path from your home page to the page you want indexed, a search engine may decide that the page is not worth keeping. The sketch below shows one way to find such orphan pages.
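
As a rough self-check, you can crawl your own internal links from the home page and compare what is reachable against the pages you expect to rank. A minimal sketch in Python, assuming the start URL is a placeholder for your own site:

```python
# Minimal sketch: walk the internal links reachable from the home page.
# Pages you expect to rank that never show up in the result are orphans.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href" and v]

def reachable(start):
    host, seen, queue = urlparse(start).netloc, set(), [start]
    while queue:
        url = queue.pop()
        if url in seen or urlparse(url).netloc != host:
            continue  # already visited, or an external link
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue  # an unreachable page cannot pass links on
        parser = LinkParser()
        parser.feed(html)
        queue += [urljoin(url, link) for link in parser.links]
    return seen

print(reachable("http://www.example.com/"))  # placeholder start URL
```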


Outbound links

Some search engines, such as Google and HotBot, are known to refuse to index sites that do not link out to other sites. Alternatively, they may index your home page but refuse the other pages until you add links to other sites. Or they may index you for a while and then drop you from their databases if you have not set up external links after a certain period of time.


Frames

Content inside HTML frames can be a problem: a search engine may index the content of the main frame but not the other frames that make up the full page, such as the menu. Visitors will then be able to find your information but unable to reach your menu. It is preferable to build sites without frames; if you must keep them, the sketch below shows the classic fallback.
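
A minimal frameset sketch with a <noframes> fallback that crawlers and frame-less browsers can read (the file names are placeholders):

```html
<frameset cols="200,*">
  <frame src="menu.html" name="menu">
  <frame src="content.html" name="main">
  <noframes>
    <body>
      <!-- crawlable fallback: repeat the menu links as plain HTML -->
      <p><a href="menu.html">Menu</a> | <a href="content.html">Content</a></p>
    </body>
  </noframes>
</frameset>
```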


Obstacles to robots

Search engines cannot index sites that require registration or a password, and they cannot fill out forms. The same applies to content sitting in a searchable database. The solution is to create static pages that the engines can find and index without having to interact with your site. Depending on your database system, there are utilities and/or companies that can help you solve this problem; a minimal sketch of the idea follows.
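
One way to expose database content, sketched in Python under the assumption of a hypothetical SQLite table articles(id, title, body); the database file and page layout are equally hypothetical:

```python
# Minimal sketch: write one crawlable static HTML file per database row
# so the content no longer hides behind a search form.
import sqlite3

TEMPLATE = (
    "<html><head><title>{title}</title></head>"
    "<body><h1>{title}</h1><p>{body}</p></body></html>"
)

conn = sqlite3.connect("site.db")  # placeholder database file
for rowid, title, body in conn.execute("SELECT id, title, body FROM articles"):
    with open("article-{0}.html".format(rowid), "w", encoding="utf-8") as f:
        f.write(TEMPLATE.format(title=title, body=body))
conn.close()
```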


Free sites

Because of all the junk submissions coming from free hosts such as Geocities, many engines have chosen not to index sites from those domains, or to limit the number of pages they accept.


Guilty by association

If your website shares its IP address with other sites on the same server, you may find that the IP is banned because of what someone else has done. Ask your hosting service whether your domain name has its own unique IP assigned to it; if not, ask them to assign one so that you are not penalized for someone else's behavior. The snippet below shows how to see which IP your domain currently resolves to.
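
A one-line Python check (the domain name is a placeholder):

```python
# Print the IP your domain resolves to. If neighbouring domains on the
# same host resolve to the same address, you are sharing an IP with them.
import socket

print(socket.gethostbyname("www.example.com"))
```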


Dynamic pages

Dynamic pages whose URLs contain special characters such as "?" or "&" are ignored by a number of search engines, and pages generated on the fly from a database very often contain such characters. In that case it is important either to generate static versions of each page you want indexed or to use URL rewriting. Fanciful scripts and markup can also be an obstacle to indexing. When it comes to search engines, keep things simple; a rewriting sketch follows.
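
On Apache, URL rewriting is commonly done with mod_rewrite; a minimal .htaccess sketch, where the script name and URL scheme are hypothetical:

```
# .htaccess sketch (Apache mod_rewrite); script name and URL scheme
# are placeholders for your own setup.
RewriteEngine On
# Serve the crawler-friendly /articles/123 from articles.php?id=123
RewriteRule ^articles/([0-9]+)$ articles.php?id=$1 [L]
```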


Pages that are too heavy

If your site loads too slowly, or your pages are so complicated that they take too long to load, there comes a point where the robots stop indexing them. To prevent this, limit your pages to a maximum of 50K. A good rule of thumb: the weight of the page plus the cumulative weight of its images should stay under 70K. Beyond that, visitors on slow connections will leave before the page has finished loading. The sketch below automates the check.
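
A minimal Python sketch of the 50K/70K check (the URL is a placeholder, and only <img> tags are counted, so the figure is approximate):

```python
# Fetch a page, add up the weight of its <img> images, and flag the
# totals against the 50K page / 70K page-plus-images guideline.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class ImgParser(HTMLParser):
    """Collects the src of every <img> tag on a page."""
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.srcs += [v for k, v in attrs if k == "src" and v]

url = "http://www.example.com/"  # placeholder URL
html = urllib.request.urlopen(url, timeout=10).read()
parser = ImgParser()
parser.feed(html.decode("utf-8", "replace"))

total = len(html)
for src in parser.srcs:
    try:
        total += len(urllib.request.urlopen(urljoin(url, src), timeout=10).read())
    except Exception:
        pass  # a missing image adds no weight

print("page: %dK, page + images: %dK" % (len(html) // 1024, total // 1024))
if len(html) > 50 * 1024 or total > 70 * 1024:
    print("over the 50K / 70K guideline")
```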


Unreliable hosting

It is extremely important to have reliable hosting. If your site does not respond when a robot tries to reach it, it will not be indexed. Worse, if you are already indexed and a robot tries to connect while your site is down, your site may be de-indexed (especially if this happens repeatedly).
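
A minimal Python sketch of a periodic availability check you could run from another machine (the URL and interval are placeholders):

```python
# If a simple check like this fails regularly, crawlers are hitting
# the same downtime.
import time
import urllib.request

URL = "http://www.example.com/"  # placeholder URL
while True:
    try:
        code = urllib.request.urlopen(URL, timeout=10).getcode()
        print(time.ctime(), "up, status", code)
    except Exception as err:
        print(time.ctime(), "DOWN:", err)
    time.sleep(300)  # check every five minutes
```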


Sp@m

If you use questionable techniques that could be considered sp@m, such as excessive keyword repetition or text in the same color as the background, a search engine may ignore or reject your submission.


Redirects

Redirects and meta refresh tags can sometimes cause problems for the search engines trying to index your site. If the engines "think" you are trying to cheat with cloaking or IP-based redirection, they may not index you at all.
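
If you genuinely need to move a page, a permanent server-side (301) redirect is generally better tolerated by engines than a meta refresh; a minimal Apache .htaccess sketch with placeholder paths:

```
# .htaccess sketch (Apache mod_alias); both paths are placeholders.
# A permanent (301) redirect tells engines the move is legitimate.
Redirect 301 /old-page.html http://www.example.com/new-page.html
```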


Directory submissions

When you submit your site to directories such as Yahoo, the Open Directory, LookSmart and others, a human being reviews it and decides whether your site is of sufficient quality to be listed. These directories can in turn help you get indexed by other search engines, so give your directory submissions all the attention they deserve.


Page limits

Search engines do not index every page of your site. Depending on the engine, that may mean a dozen pages or three or four hundred. Google is one of the engines that crawls websites the deepest, and the depth depends on link popularity: sites with strong link popularity are considered "worthy" of deeper exploration.


Mistakes

Sometimes the engines simply lose submitted sites to bugs or technical errors. Do not forget that mistakes can happen, since these engines run databases containing hundreds of millions of pages.


Source: article by Jennifer Horowitz at www.ecombuffet.com
