Why doesn’t site submission move my site to the top?

Getting your site listed with search engines means only that it will be included in their indices. Improving your position on search engines' result pages is a different task: it involves optimizing your pages for the targeted keywords, improving your site's link popularity (getting inbound links to your site), ensuring quality (usability; absence of broken links and other errors that prevent search engine robots from indexing; search engine- and visitor-friendly design), and promotion through social media and advertising.

How often should I resubmit my pages to search engines?

The major search engines have their own search robots that regularly crawl the Web. So, if your pages are already in the index, resubmitting won’t help you improve site rankings in any way.

The only good reason to resubmit your site to search engines and directories is that you have made significant changes to its content. Even then, don't resubmit all of the pages: resubmit only the updated ones, or simply submit a renewed sitemap.
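If you maintain an XML sitemap, "renewing" it is usually just a matter of refreshing the <lastmod> dates on the pages you changed before resubmitting the file. A minimal sketch of such a sitemap (the URL and date here are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only the updated page gets a fresh lastmod date -->
  <url>
    <loc>http://www.example.com/updated-page.html</loc>
    <lastmod>2010-06-15</lastmod>
  </url>
</urlset>
```

Search engines treat <lastmod> as a hint, not a command, but it helps their crawlers prioritize the pages that actually changed.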

Why do links to my site suddenly disappear from search engine indices?

The numbers reported by "link:pageURL" queries are only approximate, so we recommend that you not worry about fluctuations showing significant decreases or increases unless they are accompanied by sudden traffic drops.

If a significant decrease in the number of links is accompanied by a corresponding drop in referral traffic from the search engines, you may be experiencing a real loss of link juice. In that case, check whether the important links that send you traffic and boost your search rankings still exist, and make sure your site has not been penalized or hacked. Use your Web analytics tools and Google's Webmaster Tools to identify potential problems.

How can I remove my site from search engines?

If you don’t want your content to appear in the search engines’ indices, you can prevent search engines from crawling it by using the robots.txt protocol, which gives robots instructions on which pages they are allowed to crawl. This method can help you keep new content out of the index and, when the crawlers revisit your pages, lead to removal of content that is already indexed.
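As a sketch, a robots.txt file placed at the root of your site might look like this (the directory paths are hypothetical):

```
# Applies to all crawlers
User-agent: *
# Do not crawl anything under these directories
Disallow: /private/
Disallow: /drafts/
```

A blank `Disallow:` line would instead permit crawling of the whole site, so be careful with the syntax: a misplaced rule can block content you want indexed.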

However, even if the search engines don’t crawl or index the content of pages blocked by robots.txt, they may index the URLs by discovering references to the excluded URLs in other sources on the Web.
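For this reason, a common complement to robots.txt is a robots meta tag inside the page itself, which tells search engines not to index the page even if they discover its URL elsewhere. Note that the page must remain crawlable (i.e., not blocked by robots.txt) for the robots to see the tag:

```html
<!-- Place in the <head> of the page you want kept out of the index -->
<meta name="robots" content="noindex, follow" />
```

The "follow" value lets robots still follow the page's outbound links while excluding the page itself from results.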

The most reliable way to remove your site URLs from search engines’ indices is to use webmaster tools:

Google URL removal tool
Yahoo! Site Explorer
Bing Content removal request
A removal request usually takes effect across the entire index within 48 hours.

Can companies guarantee high rankings?

Most good SEOs understand the principles of how search engines work and can implement changes to your site that will improve its chances of being ranked highly, but NO ONE CAN GUARANTEE high rankings on search engines.

Some SEO companies provide a guarantee on their services: they may recommend changes to your site and suggest actions that increase the likelihood of ranking higher, but they should not guarantee steadily high rankings, because they do not control the search engines. If a search engine changes its algorithm overnight, rankings may disappear suddenly.

Beware of SEO firms that guarantee high rankings: if such an SEO company breaks its promises, it may balk at giving a refund, push other services on you instead, or simply become unreachable.

For more information on “white hat” search engine optimization, please contact Atomic 55.