Search engine optimization (SEO), also called organic or natural optimization, is the practice of optimizing a Web site to achieve high rankings on the search engine results pages (SERPs) for certain selected key phrases.

This is achieved by making changes to the hypertext markup language (HTML) code, content, and structure of a Web site, making it more accessible for search engines and, by extension, easier for users to find. These are known as on-page factors. SEO also involves off-page factors, which generally involve building links to the Web site. Activities to increase the number of links to a Web site, including social media and Web public relations (WebPR), are considered off-page SEO.

SEO is an extremely effective way of generating new business for a site. It is a continuous process and a way of thinking about how search engines see your Web site and how users use search engines to find it. It’s search psychology.

SEO is a fairly technical practice, but it can easily be broken down into five main areas:

  • A search engine–friendly Web site structure
  • A well-researched list of key phrases
  • Content optimized to target those key phrases
  • Link popularity
  • Emerging trends


6.1 Search engine–friendly Web site structure

Search engines encounter two kinds of obstacles:

  • Technical challenges that prevent the search engine spider from accessing content
  • A competitive marketing environment where everyone wants to rank highly

To ensure that search engines can access your content, you must remove technical barriers. Those who wish to achieve the best results must follow best Web development practices.
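
Search engine spiders deal most comfortably with plain HTML, so crawlable text links and clear indexing instructions are a sensible starting point. The snippet below is a minimal, hypothetical sketch (the URLs, page names, and key phrases are illustrative only):

<!-- Plain text links in the navigation that a spider can follow -->
<ul>
  <li><a href="/car-hire/">Car hire</a></li>
  <li><a href="/car-hire/cape-town/">Car hire in Cape Town</a></li>
</ul>

<!-- The robots meta tag tells spiders whether the page may be indexed and its links followed -->
<meta name="robots" content="index, follow">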

6.2 Well-researched key phrases

Key phrases are the very foundation of search. When a user enters a query on a search engine, she uses the words that she thinks are relevant to her search. The search engine then returns those pages it believes are most relevant to the words the searcher used.

Search engines have built a sophisticated understanding of semantics and the way we use language. So, if a user searches for “car rental,” the search engine will look for pages that are relevant to “car rental” as well as possibly “car hire,” “vehicle hire,” and so forth. Search engines have also built up knowledge around common misspellings and synonyms and common related searches so as to try to return the best results for a user.

Because of this, it is crucial that Web sites contain content with keywords that are likely to be used by their target audience. Web sites need to appear when their potential customers are searching for them.

As Web site owners, or marketers for a Web site, we need to build a list of the terms our potential customers are likely to use to find the things we are offering. A big part of keyword research is understanding search psychology. When we build our key phrase or keyword list, we are tapping into the mental process of searchers and putting together the right mix of keywords to target.

6.3 Optimizing content for key phrases

Once keywords and phrases are selected, we need to ensure that the site contains content to target those key phrases. We must ensure that the content is properly structured and that it sends signals of relevance. Content is the most important part of your Web site. We must create relevant, targeted content aimed at our selected key phrases. Content already has several roles to play on your site:

  • It provides information to visitors.
  • It must engage them.
  • It must convince them to do what you want.

Now it must also send signals of relevance to search engines. You need to use the keywords on the content page in a way that search engines will pick up and users will understand.

Each Web page should be optimized for two to three key phrases: the primary key phrase, the secondary key phrase, and the tertiary key phrase. A page can be optimized for up to five key phrases, but it is better to have more niche, focused pages than fewer unfocused pages. Here are some guidelines (a combined example follows the list):

  • Title tag. Use the key phrase in the title and as close to the beginning as possible.
  • H1 header tag. Use the key phrase in the header tag and as much as possible in the other H tags.
  • Body content. Use the key phrase at least three times, or more if there is a lot of content and it makes sense. Aim for about 350 words of content, but don’t overdo it! That could look like spam to the search engines.
  • Bold. Use bold tags (<b> or <strong>) around the keyword at least once.
  • Alt tag for an image. Use the key phrase at least once to describe an image on the page.
  • URL (uniform resource locator). Use a URL rewrite so that the key phrase appears in the URL of the page.
  • Meta description. Use it at least once in the meta description of the page. It should entice users to click through to your site from the search engine results page (SERP).
  • Meta tags. Use the keywords in the meta tags to provide context to the search engines.
  • Linked anchor text to another page. Don’t use the key phrase as anchor text when linking to another page on your site. The anchor text describes the page being linked to, and so could dilute the relevance of the page you are linking from.
  • Domain name. If possible, use the key phrase in your domain name, although favor usability or memorable domain names.
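
As an illustration only, here is a minimal sketch of how several of these guidelines might come together on a page targeting the hypothetical key phrase “car hire” (the URL, file names, and copy are made up for the example, and the page might live at a rewritten URL such as http://www.example.com/car-hire/cape-town/):

<html>
<head>
  <!-- Key phrase near the beginning of the title tag -->
  <title>Car Hire in Cape Town | Example Rentals</title>
  <!-- Meta description written to entice the click; meta keywords for context -->
  <meta name="description" content="Affordable car hire in Cape Town. Compare rates and book online in minutes.">
  <meta name="keywords" content="car hire, car rental, Cape Town car hire">
</head>
<body>
  <!-- Key phrase in the H1 header tag -->
  <h1>Car Hire in Cape Town</h1>
  <!-- Key phrase in the body copy, bolded at least once -->
  <p>Looking for <strong>car hire</strong> in Cape Town? Compare car hire rates from leading local companies.</p>
  <!-- Key phrase in the image alt text -->
  <img src="cape-town-car-hire.jpg" alt="Car hire branch in Cape Town">
</body>
</html>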

6.3.1 Optimizing Images

Images should also be optimized with the relevant keywords. Search engines cannot see images, so they rely on the way that an image is described to determine what the image is about. Screen readers also read out image descriptions, which can help visually impaired readers to make sense of a Web site. Lastly, images are sometimes shown on the SERPs, and of course one can also search images using most of the major search engines.

Just as an image can help emphasize the content on a page, it can also help search engines in ranking pages, provided the image is labeled correctly.

Here are some ways to optimize images with key phrases for search engine optimization (SEO); a combined example follows the list:

  • Use descriptive filenames.
  • Use specific alt tags and title attributes for the images.
  • Meta information can be supplied in the image file. Make sure this information is relevant.
  • Use descriptive captions, and keep relevant copy close to the relevant image.
  • Make sure the header tags and images are relevant to each other.
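
Putting these points together, a hypothetical image on a page about car hire might be marked up as follows (the file name, alt text, and caption are illustrative):

<!-- Descriptive file name, specific alt text, and a title attribute -->
<img src="cape-town-car-hire-fleet.jpg"
     alt="Rental fleet available for car hire in Cape Town"
     title="Car hire fleet in Cape Town">
<!-- A descriptive caption kept close to the image -->
<p>Part of the fleet available for car hire in Cape Town.</p>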

SEO is both a science and an art. Focusing on writing quality content while sticking to a few guidelines when it comes to tags and URLs is the best way to ensure results. Remember, you want search engines to rank you highly for your content, but you also want to ensure that the content is a pleasure to read.

Regularly adding fresh content that carries your brand values to your site will also encourage the search engines to crawl your site more frequently.

Use your Web site and its pages to establish and reinforce themes. Information can always be arranged in some kind of hierarchical structure. Just as a single page can have a heading and then get broken down into subheadings, a large Web site can have main themes that get broken down into subthemes. Search engines will see these themes and recognize that your Web site contains rich content.
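
As a simple sketch, the heading structure of one such page might reinforce its theme like this (the headings are hypothetical):

<h1>Car Hire</h1>
<h2>Car Hire in Cape Town</h2>
<h2>Car Hire in Johannesburg</h2>
<h3>Airport Pickup in Johannesburg</h3>

The same hierarchy applies across the site as a whole: a main theme page can link down to subtheme pages that each repeat the pattern.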

6.4 Link popularity

Links are a vital part of how the Internet works. The purpose of a link is to allow a user to go from one Web page to another. Search engines, doing their best to mimic the behavior of humans, also follow links.

Besides allowing search engine spiders to find Web sites, links are also a way of validating relevance and indicating importance. When one page links to another, it is as if that page is voting or vouching for the destination page. Generally, the more votes a Web site receives, the more trusted it becomes, the more important it is deemed, and the better it will rank on search engines.

Links help send signals of trust. Signals of trust can only come from a third-party source.

Links help validate relevance. Text links, by their very nature, contain text. The text that makes up the link can help validate relevance.

6.4.1 What does a link look like?

Here is the hypertext markup language (HTML) code for a link:

<a href="http://www.targeturl.com/targetpage.htm">Anchor Text</a>

"http://www.targeturl.com/targetpage.htm" is the page that the link leads to. "Anchor Text" is the text that forms the link.

The link sends a signal that the target URL (uniform resource locator) is important for the subject of the anchor text.

A link can carry more information than this, such as instructions telling the search engine not to follow the link, or instructions to the browser as to whether the link should open in a new window or not.

<a href="http://www.targeturl.com/targetpage.htm" rel="nofollow">Anchor Text</a>

The instruction rel=“nofollow” can be included in links when you don’t want to vouch for the target URL. Search engines do not count nofollow links for ranking purposes. It was initially introduced by Google to try to combat comment spam.
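
An instruction to the browser to open the link in a new window uses the standard target attribute, and the two can be combined (the URL and anchor text are again placeholders):

<a href="http://www.targeturl.com/targetpage.htm" target="_blank" rel="nofollow">Anchor Text</a>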

6.4.2 Not all links are created equal

Of course, not all links are equal. Some sites are more trusted than others. So if they are more trusted, then links from those sites are worth more. Likewise, some sites are more relevant than others to specific terms. The more relevant a site, the more value is transferred by the link. Well-known and established news sites, government sites (.gov), and university domains (.edu) are examples of sites from which links can carry more weight.

Search algorithms also consider relationships between linked sites. By analyzing factors such as the linking patterns between sites, the engines try to determine whether links are natural or whether they are manipulative, artificial links created solely for ranking purposes. Manipulated links are worth very little compared to natural links and may lead to a drop in search engine rankings.

The search engine algorithm will also determine the relevancy of the referring Web site to the site being linked to. The more relevant the sites are to each other, the better.

6.5 Emerging Trends

SEO (search engine optimization) is a constantly evolving activity. As the search engine algorithms become more sophisticated, they assess Web sites in more complex ways to arrive at suitable search results. There are significant changes on the horizon in how search engines will index and present their results. These changes are aligned with the goal of sorting through the exponentially increasing amount of data available on the Web and giving users better, more targeted, and more useful search results.

There are four strong emerging trends for SEO:

  • Localization
  • Personalized search
  • Usage data
  • Real-time search

These four trends are making optimizing Web sites for search engines even more complex.

6.5.1 Localization and Personalization

The first two trends revolve around how the search engines are trying to accommodate the geographic and personal preferences of users so as to present them with the best contextual results. By localizing, the search engines present information in the language and geographic context of the user.

In personalizing search, the search engines are trying to align results with what they have determined would be more appropriate for that specific user. Personalized search targets users’ preferences in two ways: explicitly and implicitly.

Explicit personalized search is based on an interpretation, by the search engines, of data and information the users provide directly to search engines, such as location, age, language, and gender.

Implicit personalized search is based on data and information search engines gather by analyzing users’ behavior. Search engines will track the pages users frequently visit or how they use certain sites, such as Gmail or bookmarking sites. Based on this, the search engines will predict what a user is probably interested in and tailor the results they present to that user accordingly.

Surveys suggest that users are generally in favor of personalization and are willing to trade personal information to benefit from better-quality results from the search engines. Large search engines, like Google, even offer end users the opportunity to give feedback on results through mechanisms such as Google SearchWiki. This kind of functionality allows users to tell search engines which results they like or don’t like and would like to see again (or not).

To optimize a site properly, factors like personalization and localization need to be taken into account, and the site needs to be honed to do the following (a localization sketch follows the list):

  • Adapt to how the search engines will measure and index the sites
  • Adapt to how users will expect to be presented with contextualized information
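
On the localization side, for example, a site that serves separate language or country versions can declare this in its markup. The sketch below assumes such variants exist and uses illustrative domain names and language codes; the lang attribute is standard HTML, while the hreflang annotation is a convention documented by Google for pointing to equivalent pages in other languages or regions:

<html lang="en-GB">
<head>
  <!-- Point search engines at the equivalent pages for other regions -->
  <link rel="alternate" hreflang="en-US" href="http://www.example.com/us/">
  <link rel="alternate" hreflang="de-DE" href="http://www.example.de/">
</head>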

6.5.2 Usage data

Search engines want their results to be highly relevant to Web users so that those users keep returning to the search engine for future searches. And the best way to establish relevance to users is to look at how they actually use Web sites.

Usage data are the most effective way of judging the true relevancy and value of a Web site. For example, if users arrive on a Web site and go back to the results immediately, it probably wasn’t relevant to their query in the first place. However, if a user repeatedly visits a Web site and spends a long time on it, the site is probably extremely relevant. When it comes to search engines, relevant, valuable sites get promoted, and irrelevant sites get demoted.

6.5.2.1 How do search engines access these data?

Search engines use cookies to maintain a history of a user’s search activity. This will include keywords used and Web sites visited from the search engine. Search engines gather data on the click-through rate of results and on bounce rates.

Most search engines also provide other services, all of which can be used to gather data relevant to search. For Google, some examples of these services include the following:

  • Google AdWords
  • Google AdSense
  • Google Analytics
  • Google Web Site Optimizer
  • Google Checkout
  • Google Toolbar

Usage data no doubt play a part in search engine rankings, and that contribution is set to grow. So what does this mean for SEO? When it comes to a Web site, SEO must do the following:

  • Be valuable enough to attract both visitors and links naturally
  • Retain visitors and make sure they return to the Web site
  • Convert visitors

6.5.2.2 What Not to Do

Black-hat SEO refers to practices that attempt to game the search engines. Should a search engine uncover a Web site that is using unethical practices to achieve search engine rankings, it is likely to remove that Web site from its index.

Google publishes guidelines for Webmasters, available through Google’s Webmaster Central (http://www.google.com/webmasters). As well as outlining best practice principles, Google has supplied the following list of precautions:

  • Avoid hidden text or hidden links (an illustration of hidden text follows this list).
  • Don’t use cloaking or sneaky redirects.
  • Don’t send automated queries to Google.
  • Don’t load pages with irrelevant keywords.
  • Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
  • Don’t create pages with malicious behavior, such as phishing or installing viruses, trojans, or other malware.
  • Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.
  • If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.
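
To illustrate the first precaution, “hidden text” typically means keyword-stuffed copy that visitors never see, for example text hidden with CSS or colored to match the background. A hypothetical snippet of exactly what not to do:

<!-- Keyword-stuffed text hidden from visitors: this is the kind of markup to avoid -->
<div style="display:none">car hire cheap car hire best car hire cape town car hire</div>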

The bottom line: design Web sites for users first and foremost, and don’t try to trick the search engines.

6.5.3 Real-time search

Google has introduced a feature called real-time search. It is designed to further improve the search user experience and follows earlier features on Google search engine results pages (SERPs), which now regularly include images, news items, videos, and shopping listings. With real-time search, Google displays a dynamic element in its SERPs where you can see the latest mentions or URLs published on the Web related to your search term.

This is ideal for social media and microblogging purposes, and Google has partnered with the likes of Twitter, MySpace, FriendFeed, Jaiku, Identi.ca, and other online businesses to offer this initiative. It opens up a number of opportunities and increases the importance of a strong social media presence to augment your search engine marketing efforts.

6.5.4 Tools of the trade

There are a number of tools available to assist with SEO. Some are made available by search engines, and some are developed by agencies and individuals who specialize in SEO. Most are available for free.

Google Webmaster Tools

URL: http://www.google.com/webmasters

Google provides guidelines to Webmasters and tools to help ensure your Web site is being indexed.

Quirk SearchStatus

URL: http://www.quirk.biz/searchstatus

Quirk SearchStatus is a Firefox extension that allows you to easily view key SEO information related to the page you are visiting. As well as linking to Alexa and Compete rankings and a Whois lookup, Quirk SearchStatus will highlight keywords on a page and allow you to easily access link reports from each of the major search engines.

Tools from SEOBook.com

URL: http://tools.seobook.com

SEOBook.com provides a number of tools that assist any SEO practitioner. For example, Rank Checker is a Firefox extension that allows you to save a number of keywords and to perform regular searches on them, giving you the ranking of your chosen URL for each keyword in the selected search engines. They also have tools to help with keyword discovery.

Tools from SEOMoz

URL: http://www.seomoz.org/tools

SEOMoz provides a wealth of articles and forums, as well as excellent SEO tools and guides. Some are free, but you will need to become a “PRO” member to access them all.

Keyword discovery tools

There are a number of tools available, some free and some that require a fee, to assist with keyword discovery. These include Trellian’s Keyword Discovery (http://www.keyworddiscovery.com) and Wordtracker (http://www.wordtracker.com).

Online forums

Webmaster World (http://www.webmasterworld.com) is frequented by SEOs and Webmasters aiming to stay current with the latest trends and search engine updates.


Google’s free search engine optimization guide

URL: http://www.google.com/webmasters/docs/search-engine-optimization-starter...

Google provides a free starter guide, useful for anyone new to SEO.

Google Insights for Search

URL: http://www.google.com/insights/search

This provides valuable information about search terms you may want to target for SEO purposes. It also provides regional interest (i.e., by geography) for search terms, which is increasingly important as search engines move toward localization and personalization in their search focus.

Pros and Cons

Optimizing a Web site for search engines should entail optimizing the Web site for users. Done properly, it should result in a better user experience, while ensuring that search engines index and rank the Web site well.

However, it can be tempting to focus on the technicalities of SEO while forgetting that both robots and humans need to read the same Web site. One should not be sacrificed for the other.

Search engines update their algorithms regularly. Each update is an attempt to improve search results but can result in loss of rankings for some Web sites, depending on the update. A contingency plan, such as a prepared PPC (pay-per-click) campaign, needs to be in place to cope with a sudden drop in rankings.

As with any eMarketing practice, SEO should not be the only focus of eMarketing efforts. It works best when part of a holistic eMarketing strategy.