Usability considerations

It’s not always possible to please both your site users and the crawlers that determine your page ranking, but it is possible to work around most of those conflicts. The needs of users come first: once you get visitors to your site, you want them to come back. On the Internet, it’s extremely easy for users to surf away from your site and never look back, and return visits can make or break your site.

But the catch is that to build returning visitors, you first have to attract new ones, and that is the purpose of SEO. It means you need search engines to take notice of your site.

When it seems that users’ preferences are contrary to crawlers’ preferences, there is a solution. It’s a site map. And there are two types of which you should be aware. A basic site map is an overview of the navigational structure of your web site. It’s usually text-based, and it’s nothing more than an overview that includes links to all of the pages in your web site. Crawlers love site maps. You should, too.

A site map allows you to outline the navigational structure of your web site, down to the second or third level of depth, using text-based links that should include anchors and keywords. Work.com, for example, uses just such a site map.
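
Here is a minimal sketch of what such a text-based site map might look like, written as a short Python script that emits the HTML for the page. The sections, page titles, and URLs are hypothetical placeholders; the point is the structure: a plain list of links whose anchor text carries each page’s keywords.

    # A minimal site map generator. The sections, page titles, and
    # URLs below are hypothetical placeholders -- substitute your own
    # navigational structure, down to the second or third level.
    SITE_MAP = {
        "Products": [
            ("Project Management Software", "/products/project-management.html"),
            ("Time Tracking Tools", "/products/time-tracking.html"),
        ],
        "Support": [
            ("Frequently Asked Questions", "/support/faq.html"),
            ("Contact Customer Service", "/support/contact.html"),
        ],
    }

    def build_site_map(sections):
        """Emit an HTML list of text links, one per page on the site."""
        lines = ["<h1>Site Map</h1>"]
        for section, pages in sections.items():
            lines.append("<h2>%s</h2>" % section)
            lines.append("<ul>")
            for anchor_text, url in pages:
                # The anchor text is what the crawler reads when it
                # follows the link, so it should carry the page's keywords.
                lines.append('  <li><a href="%s">%s</a></li>' % (url, anchor_text))
            lines.append("</ul>")
        return "\n".join(lines)

    print(build_site_map(SITE_MAP))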

When a site map exists on your web page, a search engine crawler can locate the map and then crawl all of the pages that are linked from it. All of those pages are then included in the search engine index and will appear on search engine results pages. Where they appear on those SERPs is determined by how well the SEO is done for each individual page.

A second type of site map, the XML site map, differs from the traditional site map in both form and function. An XML site map is a file that lists all of the URLs for a web site. The file is usually not seen by site visitors, only by the crawlers that index your site. XML site maps are covered in more depth later.
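
As a rough illustration, here is a short Python sketch that writes such a file according to the public Sitemaps protocol (sitemaps.org). The URLs, dates, and change frequencies are hypothetical placeholders; under the protocol, only the loc entry is required for each URL.

    import xml.etree.ElementTree as ET

    # Hypothetical pages; only the URL (loc) is required per the protocol.
    PAGES = [
        ("https://www.example.com/", "2008-01-15", "weekly", "1.0"),
        ("https://www.example.com/products.html", "2008-01-10", "monthly", "0.8"),
    ]

    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod, changefreq, priority in PAGES:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod        # optional
        ET.SubElement(url, "changefreq").text = changefreq  # optional
        ET.SubElement(url, "priority").text = priority      # optional

    # Visitors never see this file; crawlers fetch it, usually from the
    # site root as /sitemap.xml.
    ET.ElementTree(urlset).write("sitemap.xml",
                                 encoding="utf-8", xml_declaration=True)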

Creating great content

Web-site content is another element of an SEO-friendly site that you should spend plenty of time planning and creating. Fortunately, there are some ways to create web-site content that will make search crawlers love you.

Great content starts with the right keywords and phrases. Select no more than three keywords or phrases to include in the content on any one of your web pages. But why only three? Wouldn’t more keywords and phrases ensure that search engines take notice of your site?

When you use too many keywords in your content, you face two problems. The first is dilution: the more different keywords you use on a page, the less weight each one carries. Choose two or three for each page of your site and stick with those.

The other problem you face is being delisted or ignored because a search engine sees your SEO efforts as keyword stuffing. It’s a serious problem, and search engine crawlers will exclude your site or pages from indexes if there are too many keywords on those pages.

Once you have the two or three keywords or phrases that you plan to focus on, you need to actually use those keywords in the content of your page. Many people think the more frequently you use the words, the higher your search engine ranking will be. Again, that’s not necessarily true. Just as using too many different keywords can cause a crawler to exclude you from a search engine index, overusing the same word will also cause crawlers to consider your attempts as keyword stuffing. Again, you run the risk of having your site excluded from search indexes.

The term used to describe how often a keyword appears on a page, relative to the page’s total word count, is keyword density. For most search engines, the ideal keyword density is relatively low. Google is very strict: it favors a keyword density of about 5 to 7 percent; much lower or much higher and your ranking is seriously affected or completely lost.

Yahoo!, MSN, and other search engines allow keyword densities of about 5 percent. Going over that mark could cause your site to be excluded from search results.
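
As a back-of-the-envelope check, you can compute keyword density yourself. The Python sketch below assumes the simple definition used here (words belonging to the keyword phrase as a percentage of all words on the page), and its sample text is deliberately stuffed to show what an excessive density looks like.

    import re

    def keyword_density(page_text, phrase):
        """Percentage of the page's words taken up by the phrase."""
        words = re.findall(r"[a-z0-9']+", page_text.lower())
        phrase_words = phrase.lower().split()
        n = len(phrase_words)
        hits = sum(1 for i in range(len(words) - n + 1)
                   if words[i:i + n] == phrase_words)
        # Each hit accounts for n of the page's words.
        return 100.0 * hits * n / max(len(words), 1)

    text = ("Our project management software makes project management "
            "simple for teams that need project management tools.")
    print("%.1f%%" % keyword_density(text, "project management"))
    # Prints 40.0% -- far beyond the 5 to 7 percent range, which is
    # exactly the kind of page a crawler reads as keyword stuffing.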

Keyword density is an important factor in your web-site design and is covered in more depth later. But there are other content concerns, too. Did you know that the freshness and focus of your content are also important to how high your web site ranks? One reason many companies began adding blogs to their web sites is that blogs are updated frequently and are highly focused on a specific topic. That gives search engines new, relevant content to crawl, and crawlers love that.

Consider implementing a content strategy that includes regularly adding more focused content or expanding your content offerings. It doesn’t have to be a blog; news links on the front page of the site, regularly changing articles, or some other type of changing content will help gain the attention of a search engine crawler. Don’t just set these elements up and leave them, however. You also have to carry through with regular updates and keep the links included in the content active. Broken links are another crawler pet peeve, and with dynamic content, links will occasionally break. Be sure you’re checking this element of your content on a regular basis, and set up some kind of user-feedback loop so broken links can be reported to your webmaster.
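
As a starting point for that kind of regular check, here is a minimal broken-link checker sketch that uses only Python’s standard library. The start page is a hypothetical placeholder, and a production checker would also need to crawl beyond a single page, respect robots.txt, and rate-limit its requests.

    import urllib.error
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        """Collect the href attribute of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def check_links(page_url):
        html = urllib.request.urlopen(page_url, timeout=10).read()
        parser = LinkCollector()
        parser.feed(html.decode("utf-8", errors="replace"))
        for href in parser.links:
            link = urljoin(page_url, href)  # resolve relative links
            try:
                urllib.request.urlopen(link, timeout=10)
            except (urllib.error.URLError, ValueError) as exc:
                # A dead link -- this is what should reach the webmaster.
                print("BROKEN: %s (%s)" % (link, exc))

    check_links("https://www.example.com/")  # hypothetical start page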

Finally, when you’re creating your web-site content, consider interactive forums. If you’re adding articles to your site, give users a forum in which they can respond to the article, or a comments section. This leads to more frequent updates of your content, which search crawlers love. The result? An interactive relationship with your web-site users will keep them coming back, and give an extra boost to your search engine ranking.
