
What is the difference between Google, Yahoo, and MSN? How do I rank well for all three?

When it comes to SEO, each of the top three search engines is indeed different and uses a different algorithm to rank your pages. Many site owners will tell you that their site ranks well on one engine and very poorly (or not at all) on another. Without access to trade secrets from the big three it can be difficult to ascertain what works for which engine, but with some simple observation and experience it is possible to gather a few generalities, listed below, and come up with an optimum approach to ranking well across search engines.

First of all, Google, Yahoo, and MSN take up about 95% of the search engine pie, so attempting to rank well in the myriad other engines might not be worth your time. Ask.com may be worth considering, but that’s another story; I’ll touch on Ask briefly in the summary that follows.

Google

Google is tops. They’ve been at it the longest. They take about 81% of the pie as of February 2009. There’s a reason for this dominance: they are simply the best.

Likes:

Keywords in the url.

Keywords in the title.

Keywords in page headers or h1 tags.

Backlinks, lots of backlinks. Prefers quality over quantity. This is otherwise known as “link popularity” which Google really, really likes. And they only count the ones they like.

Relevant anchor text. External and internal. This is related to link popularity.

Secondary keywords. Enough of them to justify your primary keyword density. This is something I mentioned in another post about singulars vs plurals and latent semantic indexing (LSI) <http://www.sunsigndesigns.com/cgi-bin/ebb/blog2/index.php?action=viewcomments&pid=9> .

Fresh content. Sites which add fresh content on a regular basis will be rewarded. This is why Google likes blogs, especially popular ones. Busy blogs and backlinks. This is also known as “buzz”.

Google seems to prefer informational pages to commercial sites. This is why all those directories tend to dominate the SERPs for certain types of keywords to the ire of most business site owners.

Dislikes:

Similar pages.

Duplicate content.

Over-optimization (spammy text).

Excessive low quality links and link schemes.

Yahoo!

Yahoo! wants to be Google and so has adopted many of Google’s practices. This could be why it is served the second largest piece of the search engine pie, about 12% (a share which has dropped recently). They tend to like and dislike the same things as Google, but there are a few differences worth noting between the top two search engines (aside from the disproportionate market share) which make Yahoo! somewhat easier to dominate than Google:

In direct contrast to Google, Yahoo! tends to prefer commercial pages to informational pages. Unfortunately, this usually means their own commercial pages. At any rate, Yahoo! does use latent semantic indexing in its algorithm, but not as extensively as Google does. Yahoo! search results are therefore much more literal than Google’s, and “exact matching” matters more than “concept matching,” which makes them somewhat more susceptible to spammy text and keyword tricks. You could say that this makes them easier to “fool,” but I would remind you that for Yahoo! this trait leads to poorer search results, less relevancy, and a generally inferior search engine.

Yahoo! places more importance on the sheer number of backlinks than Google does. It is common for Google to report 5 backlinks to a page, while Yahoo! reports 50 or more for the same page. In terms of how link popularity affects page rank, Google’s emphasis on quality and Yahoo!’s emphasis on quantity may roughly balance each other out across the two engines (disregarding market share for the moment).

Yahoo! gives more credence to meta keywords and description tags than Google does. When site owners wish to improve their standing in Yahoo!, this is where they usually start. Good descriptions, keyword tags, on-page content, and relevant, descriptive titles will usually pay dividends with Yahoo!
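As a quick, hypothetical illustration (the business name, location, and keywords are made up), this is the kind of head section that covers those basics:

    <head>
      <!-- Relevant, descriptive title -->
      <title>Handmade Widgets | Acme Widget Company, Tampa FL</title>
      <!-- Meta description: a readable summary, not a keyword list -->
      <meta name="description" content="Acme Widget Company builds handmade widgets and custom widget accessories in Tampa, Florida." />
      <!-- Meta keywords: short, relevant, no unnecessary repetition -->
      <meta name="keywords" content="handmade widgets, custom widget accessories, Tampa" />
    </head>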

MSN

MSN is small and weak because it doesn’t get enough pie (about 3%). It doesn’t get much pie because it sucks. MSN is the youngest of the big three engines and is still trying to figure out what people want. More accurately, they are trying to bypass what people want by using their brand name to get their foot in the door, and then hoping people will be lazy enough to accept them as the “default search engine” when buying other Microsoft products (like Windows and Internet Explorer). This worked for their OS, browser, and office products, but it won’t work for their search engine. People are more savvy these days. The good news is that because they are so small and insignificant there is less competition, and new sites can rank well, for what it’s worth. In a nutshell, the MSN algorithm differs significantly from the Google and Yahoo! algorithms in that:

MSN only crawls the beginning of pages (?!). The MSNbot is the laziest of all searchbots, seemingly designed to crawl only home pages. For this reason, it is much harder to grab their attention with backlinks, but because of the resulting small size of their index, it is much easier to rank well with traditional SEO basics: titles, urls, headers, etc.

Because MSN for whatever reason chooses not to crawl too deeply, they put less emphasis on link popularity and more on page content. Again, this is why new sites can rank well, but it also makes them more susceptible to spam and is why they rely more on exact matching than on concept matching. That is, they are more literal than semantic, and their SERPs are less authoritative and relevant (spammy).

Although MSN places less emphasis on link popularity, they are attracted by new links, like a trout is attracted to a shiny, new lure. But this is fleeting, and again MSN is not very good at crawling and link analysis.

Microsoft being the powerhouse that it is, it would be premature to count them out just yet. They are constantly improving and reworking their algorithm. This makes them less reliable at the moment, but I see them gaining market share as they play catch-up and as Yahoo! experiences steady declines in its search results. Yahoo! boasts so many internal products in its attempt to be “all things to all people” and to please “all of the people all of the time” (which is evidenced by how cluttered their home page is compared to Google’s). We have Yahoo! this, Yahoo! that, and Yahoo! the other thing. Yahoo! seems to be competing with itself, and the MSN search engine seems to suffer from the same narcissism (too many internal products competing with each other). If MSN ever decides to expand its index and at the same time separate its search products from its other products, I think you will see them picking up Yahoo!’s lost market share.

Ask

The Ask search engine commands a whopping 1% of the total search engine market, which is why I only included it here as an afterthought. But it may be worth considering if you can find a natural fit within the Ask community. Ask is a topical search site, meaning that it places a premium on sites that are linked to topical communities and categorized “hubs”. Basically, you need the trust and confidence (links and citations) of the Ask community of sites in order to rank well. Ask is more susceptible to spam (keyword stuffing) and less proficient at semantic indexing even within those topical communities. So, even though they tend to concentrate on groups of sites and categories of content, those communities and hubs are usually less authoritative and relevant, and more filled with sponsored ads, ecommerce, and spam. Because Ask is more vertical and narrow (smaller and more specialized) than other engines, it is wise to approach this engine more from a networking or marketing perspective. Like MSN, Ask is not at the moment attempting to gain market share from Google, but is instead attempting to find another angle, create a new market, even change the way people search. It remains to be seen whether these smaller engines will evolve into something new or eventually be devoured by Google or some other more traditional engine.

Optimizing for Multiple Search Engines

So, how do you approach all three? Is there a way to target each engine specifically, in order to give each engine what it wants? There is, but most people would argue that you should pay more attention to what your customers need rather than what the search engines want. In that case, good, solid SEO basics and established trust and authority built over time will get the most traction on the web, as opposed to any so-called SEO “techniques”. Be a bull, not a bear, when optimizing and promoting your site.

The main thing that separates Google from the rest is its low tolerance for spam. The converse is also true: what MSN, Ask, and Yahoo! have in common is an overabundance of commercial content, spam, and irrelevant search results. But most site owners still submit to all three of the lesser engines simply because together they make up roughly 15% of the market, which is nothing to sneeze at.

You could create three or four folders or directories (separate mini-sites, essentially), each optimized for a specific engine, and then use a robots meta tag or robots.txt file to direct the individual searchbots to the appropriate folder. In other words, Googlebot will crawl and index <http://www.sunsigndesigns.com/cgi-bin/ebb/blog2/index.php?action=viewcomments&pid=5> one group of pages, Yahoo’s searchbot another, and MSNbot still another. This is perfectly reasonable and would probably work, but it’s not worth all that toil and trouble, especially for very large sites.
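If you did go that route, the robots.txt side of it would be simple enough. Here is a minimal sketch; the folder names are hypothetical, and the user-agent tokens (Googlebot, Slurp for Yahoo!, msnbot for MSN) are the names the bots identify themselves with:

    # Hypothetical robots.txt: each searchbot is kept out of the folders meant for the other engines
    User-agent: Googlebot
    Disallow: /yahoo/
    Disallow: /msn/

    User-agent: Slurp
    Disallow: /google/
    Disallow: /msn/

    User-agent: msnbot
    Disallow: /google/
    Disallow: /yahoo/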

It is therefore highly recommended that you stick to the basics: keyword and content analyses, titles, urls, header tags, linking structure, and link popularity (backlinks). If you cater to your customers rather than to the search engines, chances are your site will grow and become popular, relevant, and authoritative “on its own.” Did I say popular? Regardless of each engine’s individual quirks, providing good, indexable content, obtaining quality backlinks and generating “buzz” (activity) are still the surest ways to dominance on the web. Get out there. Do something. Participate. Blog. Grow. Expand. Don’t rely solely on titles and tags. Stay fresh, current, interesting, and useful, not stagnant and stale. The search engines will notice. Most of all, cater to your customers and give them what they want and need, something useful. Do that, and the rest will come naturally, including popularity, traffic, and sales. Buzz.


Singulars vs Plurals

Should you include both singulars and plurals in your keyword selection process? The quick answer is yes. It would be ideal to say that your pages are optimized for both the singular and the plural forms of certain keywords, assuming your visitors are using both, but it can get confusing. Which should you use in any given circumstance? Where on the page should you put them? What about the keyword meta tag? Most search engines do treat singulars and plurals differently, so these are good questions to ask.

The important thing is to understand which one your customers are using or will use and to remember that, by mathematical necessity, only one can have prominence. Only one can go first in the title, for instance. So, it’s not as simple as killing two birds with one stone or covering both bases, because it is actually impossible to optimize for two terms equally. Now ask yourself if this is really what you want. Is a near fifty percent split on the same page for two popular keywords really desirable?

Attempting to cast a wider net is never desirable with a new site. If ten people a month search for “widgets” and ten more people a month search for “widget” and you optimize a page for both, do you then have a shot at all twenty? Perhaps, if there is very little competition for those words, but it is more likely that you have simply diluted your chances for both. You have cut your chances in half for each, and what’s worse you can’t take those halves and add them together because half of a losing position is still a losing position. Search engine results are listed in order of relevance. Google has only one first page. Exact matching is somewhat important. So, better to use one stone for each bird.

The ideal solution then is to use two pages, optimizing one page for the singular and one page for the plural, taking care not to duplicate content of course. If this is not possible for your particular situation, then you will have to choose one, and decide which one deserves prominence and how much prominence. Don’t forget that secondary keywords, derivatives of your root words, and related words, should be liberally sprinkled around the page in the less prominent positions: singulars, plurals, nouns, adjectives, verbs, synonyms, even antonyms.

Search engines have evolved in recent years to include semantically related keywords in their algorithms. Primary keywords in association with secondary or related keywords rank better than primary keywords alone. Because of this, it is possible to “over-optimize” a page by focusing too much on one keyword or phrase. Search engines use “latent semantic indexing” to examine clusters of related keywords in an attempt to rank pages more accurately and return the most useful results. So, include a good mix of secondary and related words in your page content, including singulars and plurals.

Keyword meta tags are different, however. Search engines have mostly abandoned this tag in favor of those related keywords that exist on the page proper. Not a bad idea. Hopefully you have enough information on the page to make the meta tag irrelevant. Most people agree that these meta tags are not very important, but that it doesn’t hurt to keep in the habit of using them. You can never know to what extent they are being used or will be used in the future. Do not repeat keywords in your meta tags unnecessarily. Relevance is more important here than number or density.

It makes sense to me that search engines would prefer to find secondary or related keywords in the content of the page rather than in the meta tag. Latent semantic indexing gives search engines the ability to group “Babe Ruth” and “baseball bat” together for the purposes of ranking. It does this by examining supporting keywords on your page and from other pages in the same category and by building a secondary index of these related keywords. It doesn’t need a meta tag embedded in the code to do this.

The importance of latent semantic indexing and the extent to which search engines use it are still somewhat of a mystery, but the fact that it exists can be demonstrated by using a tilde before any search term in your Google search bar (~keyword). You will see that searching for “~design” returns and highlights related terms like “designer,” “designed,” “designing,” even “architecture” and “construction.” There are plenty of instances where this doesn’t quite show up so well, and when it does it is not always on target: the #1 position for “~scoreboard” in Google search is held by a page that does not have the word “scoreboard” anywhere on the page or anywhere on the site. It is held by a page optimized for the word “results.” Google understands that “results” can be semantically related to “score” and “scoreboard” and so returns the highest ranking related term it can find. The page actually has nothing at all to do with sports or scoreboards (results.org is a political organization), but the point is that Google does use some form of secondary indexing of related terms, with or without the tilde. A search for “widget” returns “widgets” in the #1 position. Most search engines also now include links to “related searches” right in their results pages.

Let me repeat: use both singular and plural forms of your keywords and related keywords liberally throughout the page, and decide which one should have the prominent positions (do not try to split the prominent positions equally). You will get credit for your secondary keywords, by themselves and as support for your primary keywords. Repeat them often on your page.

Meta tags are another matter. It is not necessary to include all forms or to repeat them there, because it’s of little help and search engines can parse these terms for whatever they’re worth: “scoreboards” already includes “scoreboard” within it, so using both is redundant; you will get equal credit for each. The same holds true for phrases which include other terms or phrases within them. Example: wood, wood stove, wood stove cleaners. The first two are redundant because they are contained within the third; only the third one is necessary. “Wood stove cleaners” covers “wood”, “stove”, “wood stove”, “wood stove cleaner”, and “wood stove cleaners” all in one three-word phrase. That’s five search terms in one, which covers a lot of ground. Key phrases are slightly different from keywords, as when you decide to also include “wood stove pipes” in the example above, but you do NOT get a keyword density score (for good or bad) from your meta tag. It is doubtful that you will get much of a score at all, except where page content is missing or terribly light, and even then not much. Keyword tags are not meant for people (they are meant for search engines) and are not considered content. The practice of stuffing these tags with irrelevant (or otherwise missing) keywords is what made them untrustworthy.
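To make that concrete, here are two hypothetical keyword tags for the wood stove example; the first repeats terms that are already contained in the longer phrases, while the second covers the same ground more cleanly:

    <!-- Redundant: the shorter terms are already contained in the longer phrases -->
    <meta name="keywords" content="wood, stove, wood stove, wood stove cleaner, wood stove cleaners, wood stove pipes" />

    <!-- Leaner: two phrases cover all of the terms above -->
    <meta name="keywords" content="wood stove cleaners, wood stove pipes" />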

In conclusion, Google certainly does treat singulars and plurals differently, but it also considers them semantically related. This is why a search on either form returns mixed results. Google purportedly uses over 200 criteria for ranking pages, and “exact matching” is just one of them. Optimizing a page for the singular does not mean it will place higher in the SERPs than a page that was optimized for the plural when there are so many other factors for Google to consider. But it’s all about context and relevancy (and semantics). So, treat each case separately, give one form prominence over the other rather than trying to use both equally, don’t repeat words unnecessarily in your meta tags, and then cross one more thing off your list of two hundred things to consider. :))


Mobile Search Engine Optimization

Everything you read about mobile search engine optimization begins with some statistic about how mobile search usage is increasing. So, I’ll get that out of the way first. In 2008, 20 million Americans searched the internet through their mobile phones or PDAs, a 65% increase from the previous year. There are only 1.25 billion PCs in the world, but about 3 billion mobile handsets.

If you haven’t thought about making your website mobile friendly, then you haven’t been paying attention. All the major search engines and plenty of minor ones have invested a considerable portion of their energy in providing content to the mobile search engine market, and this content comes from the millions of website owners and advertisers who have done the same.

Would you benefit from making your website mobile friendly? How do you reach those 3 billion mobile devices? How exactly do you go about making sure your website is mobile friendly? Alright, calm down. The concept is simple: all you need to know is that a PC is not a wireless handset, and a wireless handset is not a PC. So, you will need one version of your website designed for PCs and another version designed specifically for the wireless handset world.

This does not mean you need two websites, of course. By using CSS to separate style from content you can present a “stripped down” version of your core content to the mobile search engines. Google and Yahoo! have services for submitting mobile content and mobile site maps. Mobile markup standards like XHTML-MP (XHTML Mobile Profile), built on the W3C’s XHTML Basic, spell out what mobile browsers can be expected to render. The search engines and the various mobile browser technologies will handle the indexing and display of your web pages to the mobile public, just as they do on your PC.

The technical side of creating a mobile friendly website involves following stricter markup rules and providing lighter pages. That means plain text and smaller images, but with the variety of mobile handsets and mobile browsers out there in those 3 billion hands, the possibilities are not so boring. Some handsets and PDAs cannot handle tables, Javascript, CSS, and multimedia, but many of them can. Accommodating the sheer range of capabilities of the different devices may seem daunting at first, but the ability of CSS to separate style from content and the use of strict, standards-based XHTML allow you to reach as much of the mobile search market as possible.
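To give a rough idea of what “stricter markup and lighter pages” means in practice, here is a minimal, hypothetical XHTML-MP page skeleton (the business name, stylesheet name, and content are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN"
      "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head>
      <title>Acme Widgets - Mobile</title>
      <!-- A stripped-down stylesheet served to handheld devices -->
      <link rel="stylesheet" type="text/css" media="handheld" href="mobile.css" />
    </head>
    <body>
      <h1>Acme Widgets</h1>
      <p>Plain text and small images keep the page light for mobile browsers.</p>
    </body>
    </html>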

There are several validators available on the web where you can test the validity of your markup and your site’s “mobile-friendliness” such as the .Mobi Validator <http://ready.mobi/index.html> and the W3C Validator <http://validator.w3.org/mobile/> . These give detailed error reports which allow your web developer to know precisely how your site would fare on the mobile web and what specifically he or she would need to do to make it standards compliant. There are simulators which allow you to view what your site looks like on different mobile handsets such as the Opera Mini demo <http://www.opera.com/mini/demo/> and the .Mobi Emulator <http://ready.mobi/index.html> . For the .Mobi Emulator you need to enter your url to test the validity of your markup before the emulator renders the page for you. Modern browsers like Firefox and Opera also allow you to turn off certain components (CSS, frames, images, flash, java, javascript, etc.) in your browser which is another way to get an idea about how your pages would render without all of these technologies. Finally, methods for mobile browser detection provide you with the tools necessary to detect which mobile configuration is making the HTTP request and deliver the appropriate content to the appropriate device.

Now the mobile search public is a different kind of searcher than the…um…stationary public. The bulk of mobile searchers tend to be on the move and focus more on things like taxis, pizza, movies, sports scores, local services, and other “give me quick information now” type things than on reading, researching, or lengthy browser sessions. This doesn’t complete the entire mobile web picture, of course, but the GPS capabilities of modern mobile devices allow companies like Google and Yahoo! to provide very accurate local search results for mobile users who are most likely looking for a phone number they can automatically dial with the fewest “clicks” and the least amount of scrolling possible. This can even be done on handsets that don’t have a full blown browser installed, without the user even asking or searching for it, basically providing a permanent “contact list” of local services on their phone which is updated in real time. In effect, it’s a “411” of websites.

The full browser capabilities of modern cell phones and PDAs, especially of higher end devices, allow mobile users to navigate your site in much the same way as they would on their PCs. It’s simply a matter of accommodating not only coding requirements, but also the physical limitations of the medium (smaller screens, less processing power, and limited keyboards). The KISS principle applies here more than anywhere else, both in content and in presentation. But the most important thing to remember is that because the attention of the average mobile searcher is focused more on local searches, the mobile SERPs tend to reflect that. Therefore if you want to place well in the mobile world, you need to focus your SEO efforts locally, if you can. There are ways to do this beyond just signing up for Google Maps, Yahoo Local, etc., but signing up for those services will help.

Of course, there are other kinds of mobile traffic besides search traffic. Referral traffic and direct traffic can also come from the mobile web. On that note, while link-building in the PC world is the best way to increase your page rank, in the mobile world “citations” are the holy grail of page rank, rather than links. A citation is when your location information appears somewhere on the web; that is, it’s “mentioned” somewhere on some page. So, make sure you put your location and contact information on all your pages and try to get them mentioned elsewhere. Paid advertising on the mobile web also works the same as it does in the PC world; Google Adwords automatically includes your PPC ads in their mobile search network. So, with mobile technology ever improving and the number of users ever growing, making your site mobile friendly is worth looking into for your particular product or service.
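On the citation point, a simple contact block repeated on every page covers both the location information the mobile engines look for and the click-to-call convenience mobile users want; the business details below are, of course, made up:

    <!-- Location and contact information, repeated on every page -->
    <address>
      Acme Widget Company<br />
      123 Main Street, Tampa, FL 33602<br />
      <!-- A tel: link lets mobile users dial with a single click -->
      <a href="tel:+18135550123">(813) 555-0123</a>
    </address>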


What’s the difference between “indexed” and “cached”?

Not much: an “indexed” page is a page that has been crawled by a search engine spider and filed away in that search engine’s index for later use (emphasis on “later”), and a “cached” page is one that might actually show up in search results.

Apparently, getting your submitted pages listed in the SERPs is a multi-step process. First, the spiders need to access your home page and crawl whatever links they find there. Whether they crawl, when they crawl, and how deeply they crawl depend on a number of factors, but let’s assume you have an easily accessible site rich with content and that the spiders have crawled every page. Congratulations! The next step is for them to decide whether or not to put the information they find into their index. Now assume they like your content well enough and decide to index all of it. Great! At this point, you can say that your pages have been indexed.

So what? Big deal. By this I mean it’s good to be indexed, but it’s not enough, because this part of the index is not offered to the general search public. That is, your pages have not been cached, but all of the textual information from your pages (urls, titles, tags, snippets) has been indexed. Of course, this textual information has been cataloged, categorized, filed (indexed!) based on the keywords you chose when you built your pages and on the relative focus and interaction of those keywords on those pages, and it has been ranked accordingly, but this part of the index is not meant for the SERPs and is not accessible to surfers. This is web page purgatory. La-la land. The Google “sandbox”. It’s a way station for lonely, unused web content. A word bank from which the search engines have yet to make a withdrawal. Okay, no more metaphors. The point is that the search engines will not offer this content to web surfers at this stage. Google may tell you, “We have just indexed your pages!” Great, but they will NOT show up in search engine results. They may eventually arrive, but not yet.

The last step in the process comes when the search engines decide that your information (which they have already indexed and so have access to) might actually be of use to someone. When that happens, they will take a “snapshot” of the page (save or download it) and store the file away in their index of cached web pages. This is a completely different index, or more accurately, a subset of the original index, but it is safe to say that these cached pages qualify for inclusion in the SERPs, and now your potential customers may be able to find you. The search engine gods have gone one step beyond indexing your content and have now indexed the actual pages, which they will present to web surfers in their full and complete glory. It’s really just a matter of semantics. Your content has been indexed the whole time, but that doesn’t guarantee inclusion in the SERPs. Only further indexing (caching) will do that.

So, a search engine could have all 1,000 of your pages “indexed” but only 50 or so “cached.” This explains the strange numbers you get when you do a site search (site:yourdomain.com) and notice that the number at the top differs, often drastically, from the actual results on the page. This is to be expected because there may be several pages on your site which the spiders have crawled and indexed, but which the search engines did not find meaningful or useful enough to cache. You may even agree with them. A “thank you for ordering” page may get indexed (spiders are not overly picky eaters), but it probably shouldn’t get cached and it probably won’t. But the search engines figure (or were told to figure) that this page may become useful someday and so keep it in the index uncached. They will likely revisit the page on subsequent crawls to see if any changes have been made and to reevaluate the situation.

Sometimes a search engine will just stubbornly refuse to cache a page in their index which you know for a fact is very useful and which you know your customers would just love


SEO Maintenance

SEO is not a one time event but a continuous process. Your initial attempt at search engine optimization for your website will almost certainly not be your last. Most site owners will tell you that there is always room for improvement. Search position and page rank are not static but are constantly changing and fluid. Trends change, industries change, markets change, search engine algorithms change, and website owners who neglect to maintain and periodically “refresh” their site after it is built and established run the risk of becoming irrelevant.

Competitors are forever attempting in several ways to outdo one another for placement in the SERPs, and an increase in page rank for one of your competitor’s pages unfortunately means a relative decrease for your own. This happens constantly as competitors come and go and work out their own SEO issues. Because of this, it is inevitable that your search position will degrade over time.

It is important to keep providing fresh content and to periodically update that content. This could be as simple as providing new offers, coupons and banner ads and tweaking your site’s titles and tags, or as much as adding whole new pages of indexable content. Search engines love finding new links to crawl and comparing the pages they find with the content they already have. Again, almost any content on the web will become stale over time without frequent updates.

Constant monitoring and maintenance of your position on the web is therefore a must, but what exactly should you be thinking about when the time comes to address these issues? Here are just a few considerations:

SEO Reports – Careful monitoring and analysis of your current position is important. Weekly reports containing search results by keyword or by search engine can tell you how visible your site is to the search engines and to your customers. They can also show you how that visibility fluctuates over time and give you valuable information you can use to stay on top of your competitors.

Traffic Reports – It will do you no good to place well in the SERPs for a keyword or phrase no one uses. It is important that the keywords you do place well for actually bring traffic to your site. Most traffic monitoring software can tell you the keywords your visitors are using to find you, and used in conjunction with SEO reporting software it is possible (after a certain amount of trial and error) to zero in on the keywords that give you the best placement and bring you the most traffic.

Competitive Analysis – If your competitors consistently outrank you for your most important keywords, then it is important for you to understand why and to remedy the situation. See what primary and secondary keywords your competitors are using and how they are implemented. Analyze their content and compare it to your own. Check other ranking factors as compared to your own, especially backlinks (a.k.a. inbound links). Use a backlink checker such as the one found here <http://www.iwebtool.com/backlink_checker> to discover who is linking to your competitors. You might even try getting a link for yourself from these same sites. However you obtain backlinks, remember that this is the single most important factor contributing to your site’s page rank and position in the SERPs.

Navigation and site structure – Make sure your navigation menu is spiderable and that your anchor text includes the primary keywords contained in the target page. Validate your code. Messy, illogical, or poorly structured code can make it harder for search engines to crawl your site. Properly naming (or renaming) files and folders can also have a huge impact on a page’s rank because this will show up in the page’s url. Most search engines give a lot of credit to a page which contains keywords in the url because it is a very good indication of what the page is really about. “Page rank passing” is another consideration regarding site structure. Web pages share their page rank status with the pages they link to. Every link from one page to another is a “vote” for that other page. Proper internal linking structure will ensure that page rank is evenly distributed throughout your site and directed at the pages you feel are most important.
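To picture that last point about navigation, here is a small, hypothetical example of a spiderable menu with keyword-rich anchor text and urls (the page names are placeholders):

    <!-- Plain HTML links the spiders can follow; anchor text matches each target page's primary keyword -->
    <ul id="nav">
      <li><a href="/wood-stove-cleaners/">Wood Stove Cleaners</a></li>
      <li><a href="/wood-stove-pipes/">Wood Stove Pipes</a></li>
      <li><a href="/contact/">Contact Us</a></li>
    </ul>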

These things all take time, and at the risk of sounding redundant, SEO is an ongoing process. Many SEO experts claim that the age of a site contributes to page rank. This is not technically true. The age of a site does not determine page rank, but it does tend to go hand in hand with page rank, because site owners who have addressed all of these issues and have built and established a well-structured, well-linked site will undoubtedly have spent many months or more doing so. These things take time. Not just the time it takes to edit a page or add a new page, but also the time it takes for those changes to get crawled, indexed, and cached. Google warns that it may take 4-6 weeks for a page edit to show up in the SERPs (whether it’s a single word on one page or several words on several pages). In my experience, you can take them at their word on that.

Optimizing Images

SEO your images. Yes, you can optimize your images for image search. Having images from your site place well in an image search can bring quality traffic to your site. While it is true that search engine spiders cannot read the content of images, such as embedded text, they can read the code used to place them there. And they do use “imageBots” which crawl the index searching for images to include in their image search results. It’s important to know that when an imageBot looks for images, it looks for the same sorts of things that the usual search engine spiders look for: keywords in prominent positions (within the image tag and elsewhere).
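For example (a hypothetical product image), a descriptive filename and alt text put keywords in exactly the positions an image crawler reads:

    <!-- Descriptive filename plus keyword-relevant alt text, instead of something like IMG_0042.jpg with no alt attribute -->
    <img src="/images/cast-iron-wood-stove.jpg" alt="Cast iron wood stove with glass door" width="300" height="200" />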

Only images from cached web pages can be included in Google’s image search, and pictures that are by themselves irrelevant to your site’s content (a scenic background, e.g.) will not bring the kind of traffic you want. But if a picture is worth a thousand words, then two pictures are worth two thousand words, and having several pictures of your products place high in an image search is worth money. Do not pass up this often overlooked opportunity to promote your product or service and bring tons of quality traffic to your site. The SEO team at Sun Sign Designs knows how to do this. Give us a call, and we’ll show you!

Keeping Track of Your Search Position

As a Sun Sign Designs customer, you can keep track of your search engine position via weekly reports designed to track your website’s performance for various keywords on various engines. This capability can be a valuable part of your overall search engine marketing strategy and your efforts to get your message (or your website) in front of your customers.

These reports are filled with valuable information which nevertheless can read somewhat cryptically, so we have put together the following explanation to help you understand what your SEO reports are telling you and what they’re not telling you.

How to Read Your SEO Reports

The first page of your report (the Summary tab) shows a summary of what is contained in the rest of the report. This is a very broad overview, but it can tell you a few things. First, let’s scroll down to the bottom of the page and look at the General Statistics information in the bottom left hand corner:

Keywords, engines, and matches. What we’re doing here in this example is searching 11 different engines for 74 keywords or phrases, and “Matches Scanned” says that we only want to return listings which appear within the first 30 results (we only want to scan the first three pages). That’s it for this box, except to mention that 74 keywords on 11 engines means we are performing 814 queries (74 x 11 = 814). This brings us to the next box immediately to the right, Search Engine Queries:

There are 814 queries for this report (74 keywords on 11 engines) and 5,624 queries since we started running these reports eight weeks ago. Now, out of those 814 queries we ran today, how many of them found your site within the first 30 results? That is, how many listings do you have in those 11 engines, for those 74 keywords, within the first three pages of results? This is the subject of the next box we will look at, back at the top of the page on the left. These are your Visibility Statistics:

See where it says “Total Listings”? That means that 609 of those 814 queries we performed produced a result within the first three pages. More specifically, this site for those keywords has 263 listings in the top 5 positions, 429 in the top 10 (the first page of results), 535 in the top 20 (two pages back), and again, 609 within the top 30 (three pages back). This box also tells us that, compared to the previous report, 325 of those listings have moved up while 132 moved down, for a net gain in position of 193. This raises your visibility score. Remember that this is on 11 different engines. For a graphical representation of this, and to see how these listings have changed over time, we go to the next box immediately to the right, the Keyword Visibility Index:

Here we can give a point score to your overall success: if a first place listing on any engine is worth 30 points, and a last place listing (30th) on any engine is worth 1 point, then having 814 first place listings would give you a Visibility Score of 24,420 and a Visibility Percentage of 100%. This example site has 609 listings in various positions on various engines for a total score of 9,590, or about 39%.
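For what it’s worth, assuming the scoring is linear between those two endpoints (my assumption; the report doesn’t spell out the in-between values), the math behind that percentage works out as:

    points per listing = 31 - position   (for positions 1 through 30; 0 otherwise)
    Visibility Score = sum of points over all 814 queries
    Visibility Percentage = Score / (30 x 814) = 9,590 / 24,420 ≈ 39%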


The next two boxes in the middle of the page represent the top five engines and the top five keywords. Which engines and which keywords produced the most listings? How many listings? That’s what these boxes can tell you:

That’s it for the broad-level overview and the Summary tab.

Visibility

The next tab is the Visibility tab, which can tell you how many pages each search engine has indexed (search engine saturation) and how many inbound links you have from other sites on each search engine (link popularity). These figures are often wrong, but you can get a general idea from the graphs. Some engines are better than others at reporting search engine saturation and link popularity, but none of them are consistent or always accurate. Here, it is better to look for trends than actual figures (the report charts these as Search Engine Saturation and Link Popularity graphs).
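If you ever want to spot-check those saturation and link popularity numbers yourself, the engines’ own query operators give a similarly rough picture (the domain is a placeholder, and the exact operator varies by engine):

    site:yourdomain.com   (roughly how many of your pages the engine has indexed)
    link:yourdomain.com   (roughly how many inbound links the engine reports)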

Engine

The next tab, the Engine tab, gives the detailed position report for each keyword by engine, or to quote the headline, “This report shows the current rank, previous rank and change in a keyword’s position categorized by search engine.” This report contains all 814 queries and the result of each query.

Keyword

The Keyword tab gives the same information, only by keyword, not by engine: “This report displays the current rank, previous rank and change in a keyword’s position categorized by keyword for each selected search engine.”

Trend

This tab shows yet another view of the same information with the added dimension of time. The Engine and Keyword tabs told you the change in position for your keywords since the last report. The Trend tab tells you how each query has performed over time per search engine for the entire mission (since the very first report ran).

Competitive

The final tab is still the same information, only this time your keyword positions are compared to your competitors’ positions for the same keywords.

Putting it All Together

This is all very helpful information, but it is important to understand what it is telling you (and what it is not telling you). Remember that the entire report, and every statistic in it, is primarily dependent upon one thing: your keyword selection. If you choose keywords for which you have no chance of placing well, it will negatively affect your visibility score.

Your choice of engines is also extremely important. If you choose to search an engine which has few or none of your pages indexed, you will get few or zero listings, which will negatively affect your visibility score.

So, a visibility score cannot tell you how your site is doing, how much traffic you’re getting, or how your sales are doing. It can only tell you where you stand for specific keywords on your choice of engines. Any further information can only come from other sources such as sales and traffic reports. These reports, used in conjunction with your SEO reports, enable you to make informed decisions about how to increase your keyword visibility, drive traffic to your site, and ultimately raise sales. You can adjust the parameters of your SEO reports to raise or lower your visibility score easily, but only a site revision can change your site’s rank or the ranking of specific pages for specific keywords. See my previous post for information about how to target specific search engines.

It should be stressed that keyword analysis and selection, search position, and page ranking are just small parts of what should be your overall search engine marketing strategy. Additional steps beyond choosing the right keywords for your pages include link building, specialty directories, pay per click, banner advertising, direct and email marketing strategies, offline (print) advertising and more. Sun Sign Designs can help you decide when and to what extent these additional steps will be necessary as your business grows.


Proper New Website Deployment Over an Existing Website

After spending months at a time designing, coding, and preparing your new website for launch, many information technology and website development firms still forget to check on the fundamentals of why a website exists. Driving traffic to your site is the key reason anyone would ever spend time and money having a website done. In most instances, we find that Search Engine Optimization was never done, or if it was done, it was not done very well. Whether you have spent time and money having Search Engine Optimization done or Google has by luck found favor with your existing website, it is important to look at this from the top down.

In our example, the site we are about to deploy over has been around for about 5 years. The owners have spent many hours promoting their products through marketing campaigns and working the Search Engine Optimization angle, and they have a significant amount of traffic. In short, through all of this painstaking work they have managed to secure and lock down their current site’s link structure. The new site was designed from a strategic marketing point of view to help enhance their products, better serve their customers, and act as a reference for growth. Of course this means that the new site’s link structure is completely different from the old site’s. At this point, simply launching the new site would be a huge mistake, and the loss of Google rank would be significant enough for them to lose revenue.

How do we work around this? As I mention on a regular basis, being strategic and looking at your site not as images and code but as a vehicle to generate more revenue is very important. In this case, not losing your search engine rank is key.