Friday, January 9, 2015

EASY WAY to Create a Search Engine Friendly Sitemap

Sitemaps are often ignored by webmasters. Their value for both visitor-targeted and spider-targeted optimization is underestimated.

What is a sitemap? In the most general terms, it's a page (or pages) that contains a list of, and links to, all the other documents on your site. Theoretically, it's designed to give your visitors a quick way to find what they are looking for without browsing the entire site. A sitemap also aims to eliminate the need to link to every page of your site from your home page.

In the last few years, sitemaps have gained importance as an SEO factor: they can be used to direct search engine spiders to all of your content-rich pages. This is especially true for large sites, where a number of clicks are needed to reach specific pages through the numerous sections and subsections. If a site has thousands of pages, its webmaster should really consider dividing it into sections to make navigation easy; that same depth, however, means a search engine crawler has to do a lot of work to find all of the pages. With a sitemap, spiders feel much more "relaxed".

Mainly, a sitemap is important because of the following reasons:

  • It helps ensure that all of your content-rich pages are exposed to the search engine spider. With lots of pages and a deep link structure, the crawlers would need to work hard to find all of your pages. When you give them one single page which maps to all the necessary content, you make their job easier and ensure that nothing gets missed.
  • It gives you a way to channel Google's PageRank (covered in one of the future lessons) to the pages that need it. PageRank is a very important part of Google's algorithm, but sending it to the right pages can become a headache. So, instead of filling up your home page with internal links, you can use the sitemap to do the job.
  • A sitemap can also be used for more advanced PageRank distribution. Say your site has an "About Us" page which is solely designed for your visitors and not targeted at the spiders. If you link to it from every page on your site, it will soak up PageRank that could have gone to your important pages and so affect their rankings. Instead, you can link to it from your sitemap alone and have only your important pages linking to each other.
Creating a sitemap gives you a double advantage: it offers your visitors more convenience and a better browsing experience, and it smartly channels the search engines' power. Be sure to include a sitemap as part of your overall SE Marketing strategy.

Here we list our collection of tips on how to build an effective sitemap.

A search engine-friendly and visitor-friendly sitemap

Your sitemap should be linked to from your homepage and from no other page, because: a) you want the search engine spiders to find the link right on your homepage and follow it from there, and b) as far as PageRank distribution is concerned, linking to your sitemap from only your home page will spread PageRank quickly to pages all over your site.

If you have a large website of 50 pages or more, limit the number of pages listed on your sitemap to a maximum of 30; otherwise it can be mistaken for a link farm by the search engines. Limiting the number of entries to 30 also makes the map much easier for real human visitors to read. This may mean splitting your sitemap over several pages – don't be afraid of that; just make sure each of your sitemap pages links to the next. Otherwise both visitors and search engine spiders will hit a dead end, lose interest and go away.

The title of each sitemap link should be keyword-rich and should link directly back to the original page. Always link from your sitemap to your pages using anchor text that will help those pages with their rankings (i.e. use, as the link text, the keywords the page you're linking to is optimized for). Include around 10-20 words of textual content from the original page underneath each sitemap link. This creates more content for search engine spiders, and human visitors can see exactly what each page is about before clicking. Besides, the descriptions help bring the keyword density of the map down to an acceptable level, should that level be exceeded.
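To make the format concrete, here is a minimal Python sketch that renders such entries; the pages, anchor texts and descriptions below are hypothetical stand-ins for your own content:

# A minimal sketch of generating keyword-rich sitemap entries.
# The pages list is hypothetical - substitute your own URLs,
# keyword-based anchor text and short excerpts from each page.

pages = [
    ("http://www.yoursite.com/office-desks.html",
     "High-End Office Desks",
     "Browse our selection of high-end office desks, from classic "
     "hardwood executive models to modern adjustable workstations."),
    ("http://www.yoursite.com/computer-desks.html",
     "Glass Computer Desks",
     "Glass computer desks combine a compact footprint with a clean, "
     "modern look that suits small home offices."),
]

def sitemap_entries(pages, max_desc_words=20):
    # One HTML list item per page: a keyword-rich link followed by
    # a 10-20 word excerpt from the original page.
    for url, anchor, description in pages:
        excerpt = " ".join(description.split()[:max_desc_words])
        yield '<li><a href="%s">%s</a><br>%s</li>' % (url, anchor, excerpt)

print("<ul>")
for entry in sitemap_entries(pages):
    print(entry)
print("</ul>")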

Ensure that the look and feel of your sitemap page is consistent with the rest of your site. Use the same basic HTML template you used for every other page of your site.

As a solution to the problem of crawling big websites, Google has suggested its Sitemaps program. Google claims that its Sitemap technology was created in order to list sites much faster (http://www.google.com/webmasters/sitemaps/docs/en/about.html). The idea is that you inform Google about your site, the number of pages, and the frequency and regularity of updates. It also gives you the ability to see your site from Google's point of view, i.e. learn about errors (https://www.google.com/support/webmasters/bin/answer.py?answer=35120&topic=8474). It is free and gives you the opportunity to have new or changed pages indexed on-the-fly if they conform to Google's standards.


And what does this mean in practice? You list the URLs of your pages in an XML document and indicate how Google should index them. The crawler reads this information, and if the pages conform to Google's standards they are indexed very quickly.

Here is a simple example of a sitemap file:
 
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
<url>
<loc>http://www.yoursite.com/</loc>
<priority>1.0</priority>
<lastmod>2005-07-03T16:18:09+00:00</lastmod>
<changefreq>daily</changefreq>
</url>
</urlset>
 
There are several important elements here. <priority> tells the Google spider the relative priority of a page for indexing: 1.0 is the highest priority, 0.0 the lowest, and 0.5 is the recommended default if you don't want to specify a value. If some pages are more important than others, a higher priority will increase their importance in Google. <lastmod> means "last modified" and prevents spiders from recrawling pages that have not changed since the last crawl. <changefreq> means "change frequency"; this parameter can be very important if you update your pages frequently. The available values are: always, hourly, daily, weekly, monthly, yearly and never.

To keep your site constantly updated in Google's massive index, you should have a generator that will spider your site, list the URLs and export this information to Google. Don't forget that Google also offers the option of submitting a simple text file of URLs.
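As an illustration, here is a minimal Python sketch of such a generator's output stage, writing the elements shown in the example above; the URL list is a hypothetical stand-in for what a real generator would collect by spidering the site:

# A minimal sketch of a sitemap.xml writer, assuming you already have
# the list of URLs (a real generator would spider the site to collect
# them). The URLs and values below are hypothetical placeholders.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

urls = [
    ("http://www.yoursite.com/", "daily", "1.0"),
    ("http://www.yoursite.com/sitemap.html", "weekly", "0.5"),
]

urlset = ET.Element("urlset",
                    xmlns="http://www.google.com/schemas/sitemap/0.84")
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")

for loc, changefreq, priority in urls:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "priority").text = priority
    ET.SubElement(url, "lastmod").text = now
    ET.SubElement(url, "changefreq").text = changefreq

ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="UTF-8",
                             xml_declaration=True)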

There are many generators and different ways to build an XML sitemap file but the most common are as follows:

  1. Google's Python Generator.
  2. A PHP Generator.
  3. Free Online Generator.
Every item listed above has its merits and flaws. Titus Hoskins, in his article "Three Ways To Index Your Site With Google Sitemaps" (http://www.sitepronews.com/archives/2006/july/3.html), describes the peculiarities of each method:

Google's Python Generator

Difficulty here is a relative term: if you know your server like the back of your hand and installing scripts doesn't scare the bejesus out of you, you're probably smiling at the word "difficult". Google supplies a link to a generator which you can download and set up on your server. It will cough up your sitemap XML file and automatically feed it to Google.

In order for this Generator to work, Python version 2.2 must be installed on your Web server - many servers don't have this. If you know what you're doing, this will probably be a good choice.

You don't need a Google Account to use Sitemaps, but it's encouraged because you can track your sitemap's progress and view diagnostic information. If you already have another Google Account - Gmail, Google Alerts, etc. - just use that one to sign in and follow the directions from there.

To submit your Sitemap using an HTTP request, issue your request to the following URL:

www.google.com/webmasters/sitemaps/ping?sitemap=sitemap_url
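For instance, a minimal Python sketch of that request (the sitemap URL is a placeholder, and it must be URL-encoded before being appended):

# A sketch of notifying Google of a sitemap via the HTTP "ping"
# interface described above. The sitemap URL is a placeholder.
import urllib.parse
import urllib.request

sitemap_url = "http://www.yoursite.com/sitemap.xml"
ping = ("http://www.google.com/webmasters/sitemaps/ping?sitemap="
        + urllib.parse.quote(sitemap_url, safe=""))

with urllib.request.urlopen(ping) as response:
    # A 200 status only confirms receipt, not that the sitemap is valid.
    print(response.status)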

A PHP Generator

This is a PHP generator that you can place on your server; it will spider your site and produce your XML sitemap file. Download phpSitemapNG, upload it to your server, run the generator to get your XML sitemap file, and send it to Google.

Again, this is only hard to do if you don't know your way around PHP files or scripts.

Free Online Generator

These generators are popping up everywhere, and Google now keeps a list of these 'third party suppliers' on its site. Find them here: http://code.google.com/sm_thirdparty.html.

One of the easiest to use is http://www.xml-sitemaps.com: you can index up to 500 pages with this online generator very quickly, and it will give you the sitemap XML file Google needs to index your site. It will go into your site, spider it and index all your pages into an XML sitemap. You can download this file, compressed or non-compressed, and make minor changes such as setting the priority, changing the frequency, etc.

Then upload this file as sitemap.xml to the root directory of your server, i.e. where you have your homepage. Finally, notify Google Sitemaps of your XML file and you're in business.

Easy steps to Optimizing Site Structure

Let's start our introduction to site structure optimization by quoting SAM, the author of "Search Engine Visibility" and a renowned search engine expert with great experience in building websites for search engines.

SAM says, "Web site architecture is something I feel has been poorly addressed by search engine marketers. Reason? Many search engine marketers ONLY specialize in search engine advertising, or they ONLY specialize in search engine optimization. They do not create user-friendly websites for a living. They do not perform usability tests on page layout, site designs, and navigation schemes".

We agree with SAM on this point. When properly composed, site architecture can significantly assist your business in getting high rankings, whereas a poorly structured website can nullify all your on-page SEO efforts.

Website structure does not belong to the category of on-page factors, because it does not deal with the HTML code itself (except where it concerns the text of your links and navigation menus). Rather, it is about organizing files and directories and binding your pages together with solid link relationships.

When a search engine spider crawls your site, it follows links from one page to another. However, most spiders are instructed not to go deeper than a certain number of links or directories away from the home page of your site. In this step, we'll try to figure out the best way to lay out all your pages for indexing. Remember that the more pages you get indexed, the more chances there are for search queries to find you in the results.

The larger the website, the more important it is to have a good site structure. The aim of site structure optimization is to achieve maximum exposure of your pages to the spiders (of course, this concerns only those pages that you intend to expose).

In the lessons of this step, we cover the following topics: how directories are set up on your server, how your site navigation is organized, what the URL structure of your site is and which types of Web pages you use, and naturally, the cross-linking system between your pages. 

EASY STEPS FOR Local SEO for Your Site

The Internet opens up worldwide possibilities to even the smallest enterprises and organizations. But if your business or website is set up only to deal with local or national clients, local search engine optimization will help you target your customers and deliver the highest performance. Since Google's country-specific searches produce different rankings from Google.com, it is essential to know how to benefit from local search traffic.
Search engines like Google index sites in the most appropriate index they have; thus a UK-based website will primarily be indexed and ranked in the Google.co.uk search results. But country-targeted sites seeking top placement in search results should not only target their audience but also be prepared to compete with local sites. They should be flexible in their approach and in their country-specific SEO techniques. You have to be aware that the closer a website is to the user, the more relevant it will be to that user.

Below you will find several simple tips that will help you make sure the search engines are aware of your site's true geographic location.

Country-Specific Local SEO Tips

1. Arrange for local domains and local hosting

The major search engines (Google, Yahoo!, Live Search) check Top Level Domain (TLD) names to determine where a website is located. If your site has a Country Code Top Level Domain Name - that is, a domain name that ends in a country code like ".ca" for Canada, ".uk" for the United Kingdom, or ".fr" for France - then your site will be included in the country-specific search results. In most cases, the local TLD will outrank a .com name when it comes to local search results.
Registration criteria differ from country to country. It is fairly easy to register a .co.uk or a .co.nz, whereas registering a .com.au name, for example, involves establishing a business entity in Australia. Therefore, if your core goal is to get your content to make top matches in Google.co.uk or Google.co.nz, you had better host that content on country-specific TLDs.
The second method search engines use to determine the geographic location of a website is the IP address of the site. If your site is hosted on a server that is physically located in the target country, then that site will be included in country-specific searches, even if you have a generic TLD domain name like ".com", ".net" or ".info".
You can also register and redirect domains from other countries. It's possible to register a UK website with a hosting provider from anywhere in the world; US registrars, for example, provide hosting all over the world. But if you are particularly interested in local SEO, host the main content on your .co.uk domain. Be sure to check the physical location of local hosting servers using, for example, SEOmoz's IP Location Tool.
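As a quick first check, you can resolve a server's IP address yourself and then feed it into an IP-location tool such as the one mentioned above. A minimal Python sketch, with a placeholder domain:

# A minimal sketch: resolve a host's IP address, which you can then
# feed into an IP-location tool to verify the hosting country.
# The domain below is a placeholder.
import socket

host = "www.yoursite.co.uk"
try:
    ip_address = socket.gethostbyname(host)
    print("%s resolves to %s" % (host, ip_address))
except socket.gaierror as err:
    print("Could not resolve %s: %s" % (host, err))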

2. Include contact information

Specify your address, location, and contact details in the footer of each page. This will help tell your clients and the search engines that you are a local business. It doesn't really help if you have offices in multiple locations, because mentioning many countries/cities in the footer won't do much for your local profile in each of those places.

3. Take care of language issues

Take the time to check your spelling and use the kind of wording your clients will understand and are familiar with. Different English-speaking countries use different spellings and different words in their specific use of the English language, so you should use those most appropriate for the country you are targeting. For example, if you are optimizing a US-based travel portal, you will use "vacation"; however, searchers from the UK, Australia and New Zealand use "holiday" instead. Punctuation and the spelling of certain words also differ between English-speaking countries, so it helps to become conversant with these differences and to avoid usages that may be perceived as spelling errors.
In targeting different countries you encounter the problem of mirror content when you simply put the same content on websites located on different domains. You might think Web pages translated into different languages avoid the mirror content problem. Not really. You have to remember that marketing demands differ among English-speaking readers in America, Canada, New Zealand and other countries.
In performing local optimization, you need to adhere to local approaches. For example, US sales language is more direct than the tone adopted in Australia, the UK and New Zealand. If possible, employ a local copywriting service.

4. Get local inbound links

Search engine spiders consider the inbound links that form your link profile. Inbound links from local TLDs will help you to get into the top results of local search engines. Try to gain links from local business partners, chambers of commerce, government agencies, suppliers, etc.
Local directories are also very useful, particularly local Yellow Pages such as Yellow Pages Australia, Yellow Pages New Zealand, and Yell (UK). Listings in the regional sections of the DMOZ, Yahoo, and BestOfTheWeb global directories will also help to improve local link popularity.

5. Make use of Google Webmaster Tools

Google Webmaster Tools helps you control the country association of your content. A country specification helps determine how your site appears in country-specific search results and also improves your results for geographic queries. You can set the specification at the domain, sub-domain or directory level.
If you don't want to go to the effort of setting up structures in other countries, the alternative is to set up a Google Webmaster Tools account for the site you have and specify which country you are targeting.
This is one of the simplest but most effective methods. Here is the four-step process for setting a geographic target in Google Webmaster Tools:
  1. Click on Tools;
  2. Set Geographic Target;
  3. Associate a geographic location with this site; and finally
  4. Select the Country or Region you want to target.
Remember, this feature applies to sites without a country-code top-level domain; sites on a country-code TLD are already targeting their country accordingly.

6. Submit to Google Local Business Center


Google Local Business Center offers you a free listing on Google Maps. When potential customers search Google Maps for local information, they will easily find information about your business: your contact details, hours of operation, even coupons to print out and bring to your shop. You can edit your listing and have your Google Maps results updated within a few weeks. Google Local Business Center may be of help even if you don't have a website of your own.

Easy Step For On-Page Optimization | SEO On-Page Optimization

Keyword marketing

Choosing relevant and effective keywords for your website and for each Web page is a fundamental step of the whole search engine marketing strategy. It is very important here to perform advanced keyword analysis and focus on terms with a high number of daily searches and low competition. There are special indices that can help determine the best keywords for your business and website.

Tuning the pages

This starts with populating the contents of the website with your best keywords and continues through HTML elements, optimizing navigation and menus. This topic provides an overview of all the tasks necessary to implement on your website in order to obtain high search engine rankings.

Optimizing site structure

This area covers topics such as optimization of site architecture, choosing a proper domain and file names and creating a search engine friendly site map. A site map is a very valuable and effective means of guiding both spiders and visitors to your relevant content.

Website submission

The next step explains how to effectively submit your site to search engines and directories, as well as the vital principles of a pay-for-performance strategy. The final step here is verifying the success of the submission.

Auditing and improving the website

It's a common situation to improve Web page content and navigation for optimization purposes but forget about the visibility and usability problems these changes may cause. Auditing the website will help to identify these problems, clean up the mess and eliminate any weak points.

Working around specific optimization issues

This part of the course is devoted to optimization advice for Flash sites, graphics-heavy sites, JavaScript and other technologies that are problematic for search engine spiders. Here you'll get a proper picture of search engine demands, so you can make such Web pages search engine compatible.


Picking out Keywords

When someone uses a search engine, they type words into the search box to find what they are looking for. These words are referred to as keywords or key phrases.

Keyword selection is what any optimization mission starts with. The rule of thumb is to begin keyword research even before you start compiling the content for your pages. However, this is not always possible, and the most common situation is that you've got a website needing optimization and the content is already in place.


As we cannot make a site appear on top when people search for just any word (and it isn't worth trying to), we want our site to appear among the first search results for the terms that most relevantly reflect our business.

In this Step, you will learn:


  • Basic guidelines for keyword selection;
  • The many ways to obtain keyword suggestions;
  • Parameters and ways to estimate the potential of every keyword / key phrase.

The algorithm to perform a keyword job is as follows:

  1. Figure out the strategic keywords you want to start off with.
  2. Conduct preliminary keyword research. Get a list of keyword suggestions for each page.
  3. If necessary, select or change the domain name and the names of the pages of your site according to the obtained results.
  4. With the data obtained at step 2, perform your advanced keyword research and complete the list of keywords you will use for optimization on each of your pages.
As a result, every page of your site must have 1-3 exact keywords / key phrases associated with it.

You may want to print out this scheme and make it your checklist while working with keywords. 
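As a minimal illustration of the end result of steps 1-4, here is a hypothetical page-to-keywords plan sketched in Python; the page names and phrases are placeholders:

# A toy sketch of the end result of the keyword job above: every
# page mapped to 1-3 exact key phrases. The pages and phrases are
# hypothetical placeholders.
keyword_plan = {
    "index.html": ["office desks", "computer desks"],
    "high-end-office-desks.html": ["high-end office desks"],
    "glass-computer-desks.html": ["glass computer desks"],
}

for page, phrases in keyword_plan.items():
    assert 1 <= len(phrases) <= 3, page  # the 1-3 rule from the text
    print(page, "->", ", ".join(phrases))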


Defining your Niche and Audience

To start our keyword research, we will first figure out which segment of Web surfers we want to find our site and, subsequently, which terms will most likely bring these visitors to us.

Unfortunately, today it's not enough to figure out the keywords that make up the essence of your business. If the industry is competitive (such as, for instance, Web hosting and Web design), there are lots of sites competing to be on top for these terms. Luckily, Web surfers have become more and more conscious of the need to make their searches specific – people who are serious about their search look for "Los Angeles Web hosting" or "Linux Web hosting" instead of just "Web hosting". This gives optimizers a bit more variety to work with.

Therefore, generic keywords are usually not the best approach. As a rule, it's better to focus on niche keywords related to your product or service.

There are highly specific keywords that people don't search for often, but when they do, that traffic is very valuable, since these visitors are targeted and will most probably convert into customers. Typically, these are things like actual product names (e.g. Samsung SyncMaster 757 monitor). If someone searches for the specific name of a product, there is a high probability they are looking to make a purchase, so those keywords can bring serious profit. If you are running an affiliate site or an online store, this can be very important.

For instance, say your site sells office and computer desks. You want it to be on top when people enter "office desks" or "computer desks" into search boxes. These are your strategic keywords for the whole site. Each page of your site can have (and should have) its own strategic keywords, depending on its content. For instance, one page deals with "high-end office desks", another with "glass computer desks", etc. It makes a lot of sense to optimize each page separately for different strategic terms.

 This means targeting each page to a specific search term that relates to the overall theme of your site. In our example of computer desks, we'll target each page to a suitable market that is looking for a certain kind of computer desk. Thus, we'll be competing with fewer websites on the same keywords, and our pages will be optimized for terms people actually use when searching. This will also bring much better results than optimizing your whole site for "computer desks".


The next thing to point out is local targeting. For example, if you're optimizing a Web development site whose owner is located in Sydney, Australia, keywords such as "Web development Sydney" or "Web development services Australia" will lead search engines to refer most people from this location to your site, because lots of people tend to search for services or products locally.

 

Easy step To Increase Page Rank | How Search Engines Rank Pages

Every smart Search Engine Optimizer starts his or her career by learning to look at Web pages with the eye of a search engine spider. Once the optimizer is able to do that, the path to full mastery is halfway complete.

The first thing to remember is that search engines rank "pages", not "sites". What this means is that you will not achieve a high ranking for your site by attempting to optimize your main page for ten different keyword phrases. However, different pages of your site WILL appear high in the list for different key phrases if you optimize each page for just one of them. If you can't use your keyword in the domain name, no problem – use it in the URL of some page within your site, e.g. in the file name of the page. That page will rise in relevance for the given keyword. All search engines show you URLs of specific PAGES when you search – not just root domain names like www.yoursite.com.

Second, understand that the search engines do not see the graphics and JavaScript dynamics your page uses to captivate visitors. You can use a graphic image of written text that says you sell beautiful Christmas gifts, but it tells the search engine nothing about Christmas gifts – unless you use an ALT attribute in which you write about them.

Here's an example to illustrate.

What the visitor sees:


Beautiful Christmas Gifts!!!


What the search engine will read in this place is only the markup that produces that output – for instance, something like:

<img src="christmas-gifts.gif">

The first example is what visitors see; the second is the source code that produces the output. Even assuming the search engine spider is intelligent enough to read the code (in fact, not all spiders do), is there anything in it that can tell the spider the page is about Christmas gifts? Hardly.

As a rule, search engine spiders have a limit on how much page content they load. For instance, the Googlebot will not read more than 100 KB of your page, even though it is instructed to check whether there are keywords at the end of the page. So if you place keywords somewhere beyond this limit, they are invisible to spiders. Therefore, you may want to acquire the good habit of not overloading the HEAD section of your page with scripts and styles. It's better to link to them in external files; otherwise they just push your important textual content further down.
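As a rough self-check, you can measure how heavy a page and its HEAD section are against that budget. A Python sketch, using the 100 KB figure cited above; the URL is a placeholder and the way of locating the HEAD is deliberately naive:

# A rough sketch: fetch a page and compare its size against the
# 100 KB crawler budget mentioned above, and report how much of it
# is taken up by the HEAD section. The URL is a placeholder, and
# splitting on </head> is deliberately naive.
import urllib.request

LIMIT = 100 * 1024  # the 100 KB figure cited in the text

url = "http://www.yoursite.com/"
with urllib.request.urlopen(url) as response:
    html = response.read()

head, separator, _body = html.lower().partition(b"</head>")
print("Total page size: %d bytes (budget %d)" % (len(html), LIMIT))
if separator:
    print("HEAD section: %d bytes" % (len(head) + len(separator)))
if len(html) > LIMIT:
    print("Warning: content beyond %d bytes may never be crawled." % LIMIT)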


There are many more relevancy indicators a spider considers when visiting your page, such as the proximity of important words to the beginning of the page. Here, too, the spider does not necessarily see the same things a human visitor would see. Consider, for instance, a left-hand menu pane on your Web page. People visiting your site will generally not pay attention to it first, focusing instead on the main section. The spider, however, will read your menu before passing to the main content – simply because it is closer to the beginning of the code.


Remember: during the first visit, the spider does not yet know which words your page relates to! Keep this simple truth in mind. By reading your HTML code, the spider (which is just a computer program) must be able to guess the exact words that make up the theme of your site.


Then, the spider will compress your page and create the index associated with it. To keep things simple, you can think of this index as an enumeration of all words found on your page, with several important parameters associated with each word: their proximity, frequency, etc.


Certainly, no one really knows what the real indices look like, but the principles are as outlined here. The words that come out high in the list according to the main criteria will be considered your keywords by the spider. In reality, the parameters are quite numerous and include off-page factors as well, because the spider is able to detect the words every other page out there uses when linking to your page, and thus calculate your relevance to those terms too.
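To make the idea concrete, here is a toy Python sketch of such an enumeration; it records only each word's frequency and the position of its first occurrence (a crude stand-in for proximity to the start of the page), and is of course nothing like a production index:

# A toy sketch of the "index" idea described above: for each word,
# record how often it occurs and how close its first occurrence is
# to the beginning of the page. Real engine indices are far more
# sophisticated; this only illustrates the concept.
import re

def toy_index(page_text):
    words = re.findall(r"[a-z0-9]+", page_text.lower())
    index = {}
    for position, word in enumerate(words):
        entry = index.setdefault(word, {"frequency": 0,
                                        "first_position": position})
        entry["frequency"] += 1
    return index

text = "Beautiful Christmas gifts. Order Christmas gifts online today."
for word, stats in sorted(toy_index(text).items(),
                          key=lambda item: -item[1]["frequency"]):
    print(word, stats)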


When a Web surfer queries the search engine, it pulls out all pages in its database that contain the user's query. And here the ranking begins: each page has a number of "on-page" indicators associated with it, as well as certain page-independent indicators (like PageRank). A combination of these indicators determines how well the page ranks.


It's important to keep this in mind: after you have made your page attractive to visitors, ask yourself whether you have also made it readable for the search engine spiders. In the lessons that follow, we will provide you with detailed insight into the optimization procedure; however, try to keep in mind the basics you've learned here, no matter how advanced you become.

Classification of Search Engines | About Search Engine

The term "search engine" (SE) is often misused to describe both directories and pure search engines. In fact, they are not the same; the difference lies in how result listings are generated.
There are four major search engine types you should know about. They are:
  • crawler-based (traditional, common) search engines;
  • directories (mostly human-edited catalogs);
  • hybrid engines (META engines and those using other engines' results);
  • pay-per-performance and paid inclusion engines.
Crawler-based SEs use special software – known as spiders or Web crawlers – to automatically and regularly visit websites and to create and supplement their giant Web page repositories.

This software is referred to as a "bot", "robot", "spider", or "crawler"; all these terms denote the same concept. These programs run on the search engines' side. They browse pages that already exist in their repositories and find your site by following links from those pages. Alternatively, after you have submitted pages to a search engine, those pages are queued for scanning by a spider, which finds your page by looking through the list of pages pending review in this queue.


After a spider has found a page to scan, it retrieves the page via HTTP (like any ordinary Web surfer who types a URL into a browser's address field and presses "enter"). Just like any human visitor, the crawling software leaves a record on your server about its visit. Therefore, it's possible to tell from your server log when a search engine has dropped in on your online estate.
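For example, here is a small Python sketch that scans an access log for spider visits; the log path and the assumption that the user agent appears in each line (as in Apache's combined format) are details of your own server setup:

# A small sketch that scans a Web server access log for search
# engine spiders by user-agent substring. The log path and the
# Apache "combined" log format (user agent quoted at the end of
# each line) are assumptions about your setup.
BOT_SIGNATURES = ("Googlebot", "Slurp", "msnbot")

with open("/var/log/apache2/access.log") as log:
    for line in log:
        if any(bot in line for bot in BOT_SIGNATURES):
            print(line.rstrip())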


Your Web server returns the HTML source code of your page to the spider. The spider then reads it (this process is referred to as "crawling" or "spidering") and this is where the difference begins between a human visitor and crawling software.

While a human visitor can appreciate the quality graphics and impressive Flash animation you've loaded onto your page, a spider won't. A human visitor does not normally read the META tags; a spider does. Only seasoned users might be curious enough to read the code of a page when seeking additional information about it. A human visitor will first notice the largest and most attractive text on the page. A spider, on the other hand, will give more value to text that is closest to the beginning and end of the page, and to text wrapped in links.

Perhaps you've spent a fortune creating a killer website designed to immediately captivate your visitors and gain their admiration. You've even embedded lots of quality Flash animation and JavaScript tricks. Yet, a search engine spider is a robot which only sees that there are some images on the page and some code embedded into the "<script>" tag that it is instructed to skip. These design elements are additional obstacles on its way to your content. What's the result? The spider ranks your page low, no one finds it on the search engine, and no one is able to appreciate the design.


SEO (search engine optimization) is the solution for making your page more search-engine friendly. Optimization is mostly oriented towards crawler-based engines, which are the most popular on the Internet. We're not telling you to avoid design innovations; instead, we will teach you how to properly combine them with your optimization needs.

Let's return to the way a spider works. After it reads your pages, it compresses them in a way that is convenient for storage in a giant repository of Web pages called the search engine index. The data are stored in the index in a way that makes it possible to quickly determine whether a page is relevant to a particular query and to pull it out for inclusion in the results shown in response to that query. The process of placing your page in the index is referred to as "indexing". After your page has been indexed, it will appear on search engine results pages for the words and phrases most common on the indexed page. Its position in the list, however, may vary.

Later, when someone searches the engine for particular terms, your page will be pulled out of the index and included in the search results. The search engine now applies a sophisticated technique to determine how relevant your page is to these terms. It considers many on-page and off-page factors and the page is given a certain position, or rank, within other results found for the surfer's query. This process is called "ranking".

Google (www.google.com) is a perfect example of a crawler-based SE.
Human-edited directories are different. The pages stored in their repositories are added solely through manual submission, and directories use certain mechanisms (particularly CAPTCHA images) to prevent pages from being submitted automatically. After completing the submission procedure, your URL will be queued for review by an editor, who is, luckily, a human.

When directory editors visit and read your site, the only decision they make is whether to accept or reject the page. Most directories do not have their own ranking mechanism; they use various obvious factors to sort URLs, such as alphabetical order or Google PageRank™ (explained later in this course). It is very important to submit a relevant and precise description to the directory editor, as well as to take the other parts of this manual submission seriously.


Spider-based engines often use directories as a source of new pages to crawl. As a result, it's self-evident in SEO that you should treat directory submission and directory listings as seriously and responsibly as possible.


While a crawler-based engine visits your site regularly after first indexing it, and detects any change you make to your pages, it's not the same with directories. In a directory, result listings are influenced by humans: either you enter a short description of your website, or the editors do. When someone searches, only these descriptions are scanned for matches, so website changes do not affect the result listings at all.

As directories are usually created by experienced editors, they generally produce better (at least better filtered) results. The best-known and most important directories are Yahoo (www.yahoo.com) and DMOZ (www.dmoz.org).

Hybrid engines. Some engines also have an integrated directory linked to them, containing websites that have already been reviewed or evaluated. When a search query is sent to a hybrid engine, the sites already evaluated are usually not scanned for matches; the user has to select them explicitly. Whether a site is added to an engine's directory generally depends on a mixture of luck and content quality. Sometimes you may "apply" for a review of your website, but there's no guarantee that it will be done.

Yahoo (www.yahoo.com) and Google (www.google.com), although mentioned here as examples of a directory and crawler respectively, are in fact hybrid engines, as are nowadays most major search machines. As a rule, a hybrid search engine will favor one type of listing over another. For example, Yahoo is more likely to present human-powered listings, while Google prefers its crawled listings.

Meta Search Engines. Another approach to searching the vast Internet is the use of a multi-engine search, or meta-search engine, which queries a number of search engines at the same time and lays the results out in a formatted result page. A common or natural language request is passed to multiple search engines, each directed to find the information the searcher requested. The responses thus obtained are gathered into a single result list. This search type allows the user to cover a great deal of material in a very efficient way, retaining some tolerance for imprecise search questions or keywords.


Examples of meta-search engines are MetaCrawler (http://www.metacrawler.com) and DogPile (http://www.dogpile.com). MetaCrawler refers your search to seven of the most popular search engines (including AltaVista and Lycos), then compiles and ranks the results for you.

Pay-for-performance and paid inclusion engines. As is clear from the title, with these engines you have no option other than to pay a recurring or one-time fee to keep your site listed, re-spidered, or top-ranked for keywords of your choice. There are very few search engines that focus solely on paid listings; however, most major search engines offer a paid listing option as part of their indexing and ranking system.


Unlike paid inclusion, where you just pay to be included in the search results, in an advertising program listings are guaranteed to appear in response to particular search terms, and the higher your bid, the higher your position will be for those terms. Paid placement listings can be purchased from a portal or a search network. Search networks are often set up in an auction environment where keywords and phrases are associated with a cost-per-click (CPC) fee. Such a scheme is referred to as Pay-Per-Click (PPC). Yahoo and Google are the largest paid listing providers, and Live Search (formerly MSN) also sells paid placement listings.

Easy Steps for Search Engine Marketing (SEM)

Search Marketing has become a buzzword that is now heard all over the place many times a day. Here we provide an exact definition of what it refers to, and how it relates to both Web Search and Web Marketing.

Search Marketing is also known as Search Engine Marketing (SEM), and as such we will refer to it as SEM throughout this course. The definitions that follow are the basics; if you are an expert or advanced Search Marketer, you can skip these terms; otherwise we recommend that you read and understand them.

Search Marketing is the part of business marketing efforts aimed at increasing traffic (the number of visitors) to a website from the search engines. Additionally, it addresses conversion (the percentage of visitors who become buyers). The first is achieved by increasing search engine visibility, i.e. the position of your site in search engine results for certain keywords that people type in the search box to obtain those results.

For instance, if someone wants to find and buy a digital camera, they will go to a search engine such as Google and type "digital camera" in the search box. Google will list, in this case, 138 million results (these are the real figures obtained while creating this course). If you sell digital cameras or offer any related services, your site may be listed among these 138,000,000 results. Here, everything depends on how deep in the list you are. If you are on the first or second page of the search results, it's likely that such visibility will bring many visitors and customers from Google. If you are the 300th result, it's unlikely that anyone at all will come to you from Google.


SE visibility, together with the power and size of your banner / ad network and your affiliations and partnerships, makes up a broader concept - Web visibility (aka online visibility).

Generally, there are two main methods of carrying out SEM: a) Search Engine Optimization (SEO), and b) pay-per-click and paid inclusion listing models. They are briefly described in the following lessons.

Although paid inclusion and pay-per-click advertising seem like the fastest routes to search engine marketing, website owners often prefer to adopt the more time-consuming search engine optimization method to obtain better marketing of their website on the search engines.


Organic rankings are results that you get for free. That is, you create Web copy and publish it, then after a certain period of time a search engine robot finds it (either by itself or as a result of your submission). Finally, the robot reads your content and puts your site into its index. Now your site will be found by this search engine when people query for some words contained within your pages. Obtained this way, your positions in the result list are called your "organic search engine rankings".


Paid listings are different: pay a search engine and it guarantees the inclusion of your site in the index. Moreover, many search engines offer advanced pay-for-performance programs, such as showing your site / ad in the search results for keywords of your choice. These are the so-called "sponsored" results. Most commonly, you will pay a specified rate for each visitor who clicks on these ads and comes to your site from the search engine.

Mastering both methods and their proper combination can provide maximum search engine visibility. Because things keep changing, search engine marketers need to devote a good deal of time staying on top of the SEO industry and its trends.
The aim of SEM is not only to find a proper balance between organic and paid listings, but also to achieve maximum conversion of visitors into loyal customers. Nowadays SEM relies on the premise that it's not the traffic itself that matters, but how targeted and convertible it is. The way your traffic converts matters a lot – even more than your site's rank on a search engine. You can rank worse than a competitor, and yet the percentage of your visitors who turn into buyers can be high enough to outperform that competitor several times over.
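A toy calculation with assumed numbers makes the point:

# A worked toy example of the point above, with assumed numbers:
# a lower-ranked site with better conversion can win on buyers.
site_a = {"visitors": 1000, "conversion_rate": 0.005}  # ranks higher
site_b = {"visitors": 400, "conversion_rate": 0.03}    # ranks lower

for name, site in (("A", site_a), ("B", site_b)):
    buyers = site["visitors"] * site["conversion_rate"]
    print("Site %s: %.0f buyers" % (name, buyers))
# Site A: 5 buyers; Site B: 12 buyers - B wins despite ranking lower.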
The following are the main goals of Search Engine Marketing:
  1. Improve Web visibility and get as much traffic as possible.
  2. Improve traffic quality: get high rankings for exactly those keywords that bring visitors with the best conversion rate.
  3. Decrease expenditures by switching off advertising for underperforming keywords.

Methods used by Search Marketing

The main methods used for achieving the goals of Search marketing are Search Engine Optimization (for organic listings), Bid Management (for paid listings) and Web Analytics (for both types of listings).

Search Engine Optimization (further referred to in this course as SEO) is about changing the HTML code of your pages and the structure of your site in such a way that when an SE robot reads the site, it can understand that the pages have valuable content related to your keywords, and then rank them high. SEO also covers ways to increase your link popularity - the number of links from other high-ranked pages to your site. This is important because most search engines consider your link popularity a vital ranking factor.


Bid Management is about controlling bids, i.e. the amount of money you spend maintaining your visibility in the sponsored listings. Usually you try to detect the best-converting keywords and keyword groups in order to increase bids on them, and decrease or remove bids on keywords that don't break even. Attention should also be paid to balancing your paid and organic listings, so as to spend less on paid advertising campaigns when you get enough traffic from natural results, and to invest in paid advertising when an algorithm change or strong competitors force you out of the top positions in the organic listings.


Web Analytics (further referred to as WA) is about getting, analyzing and using information about your visitors: their details, their behavior on your site, the ways they found your site, the efficiency of referrers and advertising, conversion rates and, together with all that, eCommerce information.

Introduction to Internet Marketing - Online Marketing - Digital Marketing

Starting from the early 1990s, Internet marketing has developed amazingly, from simple text-based websites offering product information into highly evolved, complete online businesses promoting and selling their services on the Internet.
Nowadays, the Internet marketing industry has become a complicated, many-branched science involving a great deal of theoretical knowledge in combination with applied techniques. As a science, it ranges from browser-side and server-side programming and coding on one end to marketing and economics on the other.

Internet marketing means the use of the Internet to advertise and sell goods and services. It includes Banner and Text Advertising, Email Marketing, Interactive Advertising, Affiliate Marketing and Search Engine Marketing (including Search Engine Optimization and Pay-Per-Click Advertising).

The first stage of our Internet Marketing course starts with Search Engine Marketing (SEM) as a specific area of an online marketer's business. Mainly, its purpose is to increase targeted traffic from search engines via organic search engine rankings, paid listings and advertising. Here you'll be shown the main principles of Search Engine Optimization (SEO), link building, and paid advertising campaigns.

Actually, every successful search engine optimizer should be aware of the top search engines' demands and consider them while creating websites and improving on-page and off-page factors for Web pages. There are numerous important factors influencing the search engine ranking of a Web page. The SEO division of the course provides profound, consecutive lessons depicting each step of your optimization work. Search Engine Advertising is the last topic of the SEM stage.

There are certain methods beyond SEM that can help improve your site's online visibility. These include, for instance, creating and spreading a banner / ad network and / or paid link partnerships, as well as email marketing and building affiliate relationships with other websites.

Email marketing is an independent branch which has to be dealt with separately and does not have much in common with SEM. Email marketing is the subject of our next stage, where we will provide insight into the main direct mail campaign steps and guidelines.
Banner networks relate to SEM as long as they touch upon your link popularity (which is a component of SEM).

In the following stage you'll study the Affiliate Marketing division of Internet Marketing. It is a popular method of promoting Web businesses: with few marketing dollars, marketers can establish a presence and earn a profit by recruiting affiliates. Such partner networks can grow along with your company's business projects and add their profit to your marketing budget.
The most vital stage of the whole course is Web Analytics.

Its role can hardly be overestimated, as Web Analytics is an essential means of continually improving Web business performance, advertising campaigns, organic search engine results, ranking positions and more. Generally, Web Analytics deals with the traffic already generated at the previous stages; its primary goal is to improve traffic quality and enhance conversion. Although it is possible (and advisable) to understand every theoretical aspect of Internet marketing, in practice you may do much better by specializing in a specific area or technique and simply starting your Internet Marketing business. Our last stage provides a proper, clear scheme for how to estimate your potential, find a niche, manage projects and promote your services as an online marketer, SEO consultant, etc.