
Search Engine Saturation July 9, 2008

Posted by seonlinks in seo learn.


Search Engine Saturation simply refers to the number of pages a given search engine has in its index for your website domain. Not all search engines report this information, but enough of them do to create some meaningful benchmarks for your search engine marketing campaigns.

Here are some ideas for useful data you can gather using the Search Engine Saturation tool:

  1. Total number of pages in each search engine’s index for your website. How many pages are there in total for your website? How many domains exist for your websites? Once you run your search engine saturation report, you can compare the numbers as a percentage of your site’s availability to the search engines (see the sketch after this list).

  2. Which pages of your site are showing up in the search engines? Are there sections of your site that aren’t showing up? Once you know what the search engines see, you will know which pages need action to get them included.

  3. How are your competitors performing in the search engines? If they have high search engine saturation numbers, you can look at their sites and analyze what they are doing right.
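
To make the comparison in point 1 concrete, here is a minimal Python sketch of the percentage calculation. The page totals and indexed counts are invented figures standing in for numbers you would take from your own site and from a saturation report:

    # Total pages you actually publish, and per-engine indexed counts
    # read from a saturation report (all figures here are assumptions).
    total_pages_on_site = 250

    indexed_counts = {
        "Google": 230,
        "Yahoo!": 180,
        "Microsoft": 145,
    }

    for engine, indexed in indexed_counts.items():
        saturation = indexed / total_pages_on_site * 100
        print(f"{engine}: {indexed} of {total_pages_on_site} pages indexed "
              f"({saturation:.1f}% saturation)")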

How do search engines index sites? July 7, 2008

Posted by seonlinks in seo learn.

The first step in the indexing process is discovery. A search engine has to know the pages exist. Search engines generally learn about pages from following links, and this process works great. If you have new pages, ensure relevant sites link to them, and provide links to them from within your site. For instance, if you have a blog for your business, you could provide a link from your main site to the latest blog post. You can also let search engines know about the pages of your site by submitting a Sitemap file. Google, Yahoo!, and Microsoft all support the Sitemaps protocol, and if you have a blog, it couldn’t be easier! Simply submit your blog’s RSS feed. Each time you update your blog and your RSS feed is updated, the search engines can extract the URL of the latest post. This ensures search engines know about the updates right away.
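
To make the Sitemap suggestion concrete, here is a minimal Python sketch that writes a Sitemap file in the XML format defined by the Sitemaps protocol. The URLs and the output filename are placeholders, not taken from this post:

    from datetime import date

    # Placeholder URLs; replace with the pages of your own site.
    urls = [
        "http://www.example.com/",
        "http://www.example.com/blog/latest-post/",
    ]

    # Build a minimal Sitemap document using the Sitemaps protocol namespace.
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        lines.append("  <url>")
        lines.append(f"    <loc>{url}</loc>")
        lines.append(f"    <lastmod>{date.today().isoformat()}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")

    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write("\n".join(lines))

If you run a blog, submitting the RSS feed as described above is usually simpler than generating a file like this by hand.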

Once a search engine knows about the pages, it has to be able to access those pages. You can use the crawl errors reports in webmaster tools to see if the search engine is having any trouble crawling your site. These reports show you exactly which pages couldn’t be crawled, when the crawl was attempted, and what the error was.
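
The webmaster tools reports are the place to look for the crawler’s own errors, but a quick complementary check is to fetch your pages yourself and look at the HTTP status codes, since a URL that returns an error to you will usually return one to a crawler as well. A minimal sketch with placeholder URLs:

    import urllib.error
    import urllib.request

    # Placeholder URLs; substitute the pages you expect to be crawlable.
    urls = [
        "http://www.example.com/",
        "http://www.example.com/missing-page/",
    ]

    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                print(f"{url}: HTTP {response.status}")
        except urllib.error.HTTPError as e:
            # A 404 or 500 here is the kind of error a crawler would also see.
            print(f"{url}: HTTP error {e.code}")
        except urllib.error.URLError as e:
            # DNS failures, timeouts, refused connections, and so on.
            print(f"{url}: unreachable ({e.reason})")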

Once a search engine accesses the pages, it extracts the content. You want to make sure that what your page is about is represented by text. What does the page look like with JavaScript, Flash, and images turned off in the browser? Use ALT text and descriptive filenames for images. For instance, if your company name is in a graphic, the ALT text should be the company name rather than “logo”. Put text in HTML rather than in Flash or images. This not only helps search engines like Google index your content, but also makes your site more accessible to visitors with mobile browsers, screen readers, or older browsers.
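
As a rough way to spot images that would contribute nothing to a text-only crawler, here is a minimal Python sketch using the standard library’s html.parser to flag img tags whose ALT text is missing or empty. The sample markup is invented for illustration:

    from html.parser import HTMLParser

    class AltTextChecker(HTMLParser):
        """Collect img tags whose alt attribute is missing or empty."""

        def __init__(self):
            super().__init__()
            self.problems = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attrs = dict(attrs)
                alt = (attrs.get("alt") or "").strip()
                if not alt:
                    self.problems.append(attrs.get("src", "(no src)"))

    # Invented sample markup: the company name lives only in a graphic.
    sample_html = """
    <p>Welcome to our site.</p>
    <img src="/images/company-logo.gif">
    <img src="/images/widget-photo.jpg" alt="Blue widget, model 42">
    """

    checker = AltTextChecker()
    checker.feed(sample_html)
    for src in checker.problems:
        print(f"Missing ALT text: {src}")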