

Monday, July 30, 2007

The Myth of W3C Compliance?
By Sasch Mayer (c) 2007

The past few years have seen a huge increase in the number of search engine optimisers preaching about the vital importance of W3C Compliance as part of any effective web promotion effort. But is compliant code really the 'Magic SEO Potion' so many promoters make it out to be?


For those of you not familiar with the term, a W3C compliant web site is one which adheres to the coding standards laid down by the World Wide Web Consortium, an organisation comprising over 400 members, including all the major search engines and global corporations such as AT&T, HP and Toshiba amongst many others. Headed by Sir Timothy Berners-Lee, the inventor of the World Wide Web, the W3C has been working since its inception in 1994 to provide a set of standards designed to keep the web's continuing evolution on a single, coherent track.

Whilst the W3C has been a fact of life on the web since this time, general industry awareness of the benchmarks set down by the Consortium has taken some time to filter through to all quarters. Indeed, it is only within the past 24 to 36 months that the term W3C Compliance has emerged from general obscurity to become a major buzzword in the web design and SEO industries.

Although I have personally been a staunch supporter of the Consortium's standards for a long time, I cannot help but feel that their importance has been somewhat overplayed by a certain faction within the SEO sector, which praises code compliance as a 'cure-all' for poor search engine performance.

Is standards compliance really the universal panacea it is commonly claimed to be these days?

Let's take a quick look at some of the arguments most commonly used by SEOs and web designers:



1. Browsers such as Firefox, Opera and Lynx will not display your pages properly.

Browser compatibility is possibly the most frequently cited reason for standards compliance, with Firefox being the usual target for these claims. Speaking from personal experience, Firefox will usually display all but the most broken code with reasonable success. In fact, this browser's main issue seems to lie less with broken code than with its occasional failure to correctly interpret the onscreen position of layers (Div tags) even when they are expressed correctly, a failure which often causes text overlap.

What about Lynx? Interestingly enough, whilst it is somewhat more fragile than Firefox, the problems encountered by this text-only browser seem to stem mostly from improper content semantics (paragraphs out of sequence) rather than from poor code structure.



2. Search engines will have problems indexing your site.

Some SEOs actively claim that search engine spiders have trouble indexing non-compliant web pages. Whilst, again speaking from personal experience, there is an element of truth to these claims, it is not the sheer number of errors which causes a search engine spider to have a 'nervous breakdown', but the type of error encountered. So long as the W3C Code Validator is able to parse (*) a page's source code from top to bottom, a search engine will likely be able to index it and classify its content. On the whole, indexing problems arise from code errors which prevent a page from being parsed altogether, rather than from non-critical errors which allow the process to continue.

* To parse is to process a file in order to extract the desired information. Linguistic parsing may recognise words and phrases or even speech patterns in textual content.
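To illustrate the difference between the two error types (a hypothetical example, not taken from the validator's documentation): an image tag missing its required alt attribute is a non-critical error - the validator flags it, but parsing continues past it.

<img src="photo.jpg">

A tag which is never properly closed, on the other hand, can stop a parser dead before it ever reaches the content that follows:

<div class="content"
<p>This paragraph may never be reached.</p>

The first page will still be indexed; the second may well not be.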




3. Disabled internet users will not be able to use your site.

The inevitable, but somewhat weak, counter-argument to this point is that only an infinitesimally small percentage of internet users are visually or aurally impaired. However, it is a fact that a text-only browser such as Lynx, or a screen reader such as JAWS (no, not the shark), will view a web page's code in much the same way as a search engine spider. From this perspective, we once again return to the difference between critical and non-critical W3C compliance errors. As long as whatever tool/browser/spider is used to extract text content from a page's code is able to continue its allotted task, the user is likely to be able to view the page in a satisfactory manner.

Interestingly, one of my fellow designer/SEOs over in Japan has just run an experiment entitled "W3C Validation; Who cares?" testing the overall importance of W3C compliance to long-term web promotion efforts. Whilst the results of this, the world's most non-compliant web page, do initially indicate that compliance does not make much of a difference to a search engine's ability to index and classify a web page, I do rather suspect that further research may be needed in order to establish the long-term effects of this experiment.

At the time of writing, however, the page ranks well with Google for the following two non-specific search terms: "Does Google care about validation" and "Google care validation" - not bad for a page which is supposed to be utterly and completely un-indexable. What, then, is the answer to the W3C compliance conundrum?

In conclusion, I would say that ignoring the World Wide Web Consortium's standards at this stage may well have negative consequences in the long term, as the internet's continuing evolution is likely to place greater emphasis on good coding practices. Having said this, I would also say that the current value of W3C compliance has been overplayed by some professionals in the web design and SEO industries.

Further studies into the effects of non-compliance are certainly needed.

Thursday, July 26, 2007

The Roller Coaster of Link Popularity
By Bill Platt (c) 2007

Most webmasters are in a constant state of confusion about how to create link popularity and how to rank well in the search engine results. Three of the top four search engines - Google, Yahoo and MSN - calculate link popularity as one part of their search algorithms. So, for all intents and purposes, building link popularity is an important part of getting recognition and strong placement in the search engine result pages (SERPs).
Link popularity, in essence, is a count of how many web pages point to one of your web pages.



The Google PageRank Version of Link Popularity

PageRank (PR) is a Google tool that expands on the simplest link popularity calculation. PageRank is a value given to every web page on the Internet, with 12 possible rankings.

* The Gray Bar in the PageRank tool indicates that a web page has not been added to the Google PageRank database, or Google has banned the website. (If any page on a particular domain has its own PageRank, or if any pages are shown in the Google search results when someone searches "site:www.yourdomainurl.com", then the website in question has not been banned by Google.)

* PR0 to PR10. PR0 indicates that the web page has been added to the Google database, but it does not yet have any PageRank assigned to it, generally because there are not any PR value pages that link to it at this time.

If one is tracking PageRank from the Google toolbar, then it needs to be understood that the database that stores PageRank values is only updated about once every 3-4 months.

While Google does use links to a web page to determine the web page's PR value, it is impossible these days to utilize Google to find what links are directed to your pages. Even the Google webmaster tools interface will not show you all of the links Google is counting towards your own Link Popularity or PR value.



Playing Follow-The-Leader

In earlier years, Yahoo and MSN did not employ a link popularity calculation in their search algorithms. But when one competitor is thoroughly beating the competition, the underdog must respond if it has any desire to remain relevant.

So, after years of lagging behind the Google powerhouse, Yahoo and MSN decided it was time to work a link popularity calculation into their search algorithms.

Both Yahoo and MSN are still struggling to find a way to retake some market share from Google. Even with Yahoo's Project Panama rollout and MSN's Live Search rollout, both are still finding Google to be a difficult 800-pound gorilla to conquer.



Building Link Popularity

In essence, even if search engines did not include link popularity as a portion of their ranking procedures, one would still want to develop links to his or her websites.

Links are the roadways that keep Internet users moving from one website to another. Before the search engines became the all-powerful providers of Internet traffic, the role of Internet promotion was to establish links on the pages where a website's target audience was already going.

The goal of course is to get the person reading the page to click the link to the target website. With every visitor to a website being a potential customer, it makes good sense to get as many visitors to the website as possible, and that requires getting as many links as possible pointing to a website.



Google PageRank 101

Since Google drives the largest portion of search traffic on the Internet, I am only going to focus on their link popularity system.

All web pages on the Internet have been assigned a PageRank value by Google, according to the value of the web pages that link to them. This number is always in flux as links are made, lost or change value.

In short, the pages linking to your pages have their own Google PageRank value, according to who links to them, and the value of the pages that are linking to their web page. As the web pages linking to your web pages gain value, then your pages will also gain value in the Google PageRank algorithms.
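For reference, this passing of value can be expressed with the PageRank formula as originally published by Brin and Page in 1998. Google's live algorithm has long since grown beyond it, so treat this purely as an illustration:

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here T1 through Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, usually set to 0.85. Each linking page passes on a share of its own value, divided among all the links it carries - which is why a link from a strong page with few outbound links is worth more than a link from a weak or link-heavy page.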

As a Webmaster, it should be your goal to create as many links to your website as you can muster. Eventually, most of the web pages with real value will gain their own PageRank, and they will pass some of their PR value to your web pages.



But, I Tried That Once...

Whatever link building strategy one might recommend, there will be someone else saying, "But, I tried that once and it did not work." Some may go a bit further and say that they tried it once and received good initial results in Google's SERPs, but then those results shortly dissipated and the previous high placement in Google evaporated.

A common story I hear goes like this: "We tried a specific link building process. Shortly after doing so, our website went from result 300 in Google's results to page two or three of the search results. Then, a month later, our website dropped to around 100 in the search results as the link page slipped into Google's Supplemental results." These people often conclude that the link building process used was not effective.

They make this statement because they do not understand the inner-workings of what is happening to their link popularity and search engine placements.



Several Factors Drive the Roller Coaster

With press releases, it is easy to comprehend the how and why of the climb and fall. Press Releases are treated as news stories, and as such, they are more important in real time than they will be in a month or so. That is why press releases can generate big results quickly, and it also explains why those results quickly fade away.

With article marketing, it is common for a new article placement to help any website mentioned within the article and its accompanying resource box (about the author information) to rise in the search rankings early, then to drop away for a time, and perhaps rise in value again later.

Let me explain how this process works, and it will make more sense to you.



Google's Main Index and Supplemental Listings

In order for the referenced website to get the PageRank it needs to climb in the search results, the web pages linking to it must have their own PageRank. As a single web page gains in link popularity and PageRank, the web page will also improve in the search results.




When a new article is placed for the first time, it is always placed on a "brand new" page on the Internet. New pages on the Internet, by their very nature, do not have any external links pointing to them and therefore, they do not have any established PageRank.

In recognition of this "brand new" status, Google gives these new web pages a pass. As far as the Google algorithm is concerned, these "brand new" pages might have value, but that value cannot yet be determined based on the number of links pointing to the page.

At the end of Google's "pass window", Google checks to see if this new page has developed any of its own inbound links and PageRank value. If the new web page has not developed any value of its own after a window of 30-45 days, then the new page will be moved from Google's main index to Google's Supplemental listings. If the new page has developed PageRank, then the page will remain in Google's main index.

According to Matt Cutts, the Google Guy, "Having urls in the supplemental results doesn't mean that you have some sort of penalty at all; the main determinant of whether a url is in our main web index or in the supplemental index is PageRank." http://www.mattcutts.com/blog/infrastructure-status-january-2007

Many web pages that have slipped into the Supplemental listings will gain their own PageRank over the long term, and as such, those pages may return to Google's main index in the future. If articles are valuable resources to their readers, then many placements of those articles will be given their own inbound links and therefore PageRank, but it takes time.

As a general rule, it appears that the average web page will gain a measure of PageRank somewhere in the range of 90 to 180 days from the day the web page was created. While not all pages will receive inbound links and PageRank, enough of them do to make the whole process worthwhile.



You Cannot Win If You Do Not Play

As a Webmaster, your website will never gain link popularity if you do not take actions to increase the number of links pointing to your website. If the web page never accrues any link popularity, it will not gain PageRank, and it will not rise in the search engine rankings.

You are in the driver's seat, so if you fail to achieve link popularity and search placement, your own inaction will be at fault.

Do you remember my sample scenario above? "Shortly after (completing a link building campaign), our website went from result 300 in Google's results to page two or three of the search results. Then a month later, our website dropped to around 100 in the search results as the link page slipped into Google's Supplemental results."

These people frequently conclude that a specific link building activity produced no results, because they did not stay on page two or three of the results. Surprisingly, these people tell us that they started out at #300 and ended up at #100, and yet they claim that the process did not work in their case. How so? They climbed 200 places in the search results. How is that an ineffective link building campaign?

So, the next time you hear someone crying about the link popularity roller coaster, think back on this article, and you might be able to help him or her to clear the fog of confusion.

4 Great Reasons to use Google Analytics
By Sasch Mayer (c) 2007

Having used a large number of web site visitor trackers over the years, I first approached Google Analytics some time ago, with the somewhat jaded attitude of someone who's 'seen it all' or at least 'seen most of it'. What could possibly make this particular utility stand out in such a large crowd of competitors?
But first... What is Google Analytics?

Analytics is Google's very own visitor tracking utility, allowing webmasters to keep tabs on traffic to their site, including visitor numbers, traffic sources, visitor behaviour & trends, times spent on the site and a host of other information gathered via two pieces of JavaScript embedded in the source-code.
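For context, the tracking code in question looks broadly like the legacy snippet below, pasted into a page's HTML just before the closing </body> tag (the account ID shown is a placeholder - always copy the exact code generated by your own Analytics account):

<script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
<script type="text/javascript">
_uacct = "UA-XXXXXX-X"; // placeholder account ID
urchinTracker();
</script>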

Unlike other free visitor trackers, which insist on displaying annoying and often amateurish badges or buttons when they are being used, Google Analytics simply runs quietly in the background, gathering the necessary information without any visible signs of its presence.

Which brings me quite neatly to Analytics' first major plus-point: the price.

What webmasters are effectively getting is a fully fledged visitor tracking utility without all the irritations and limitations normally associated with free products of this type.

Ok, so it's free; but is it any good?

In a word: yes.

The sheer depth of information gathered really leaves very little to be desired. From search engine analysis to page views, bounce-rates and more, the available data is presented so as to give users an easy overview of the most essential elements, with the ability to 'drill down' to less commonly accessed or more in-depth statistics and figures.

Additionally, on the 18th of July 2007, Google Analytics' old user interface was discontinued, making way for a newer, more ergonomic look which makes reports more accessible and the interface itself more intuitive for the user.

The new Dashboard provides 'at a glance' visitor statistics for the previous month, as well as a graphical breakdown of your visitors' geographical locations in the form of a world map. A pie chart clearly shows what proportion of visitors reached the site through search engines, by referral or through direct access, whereas the 'Content Overview' provides a list of the most commonly accessed pages.



What makes Google Analytics special though?

Although Analytics boasts all the features and statistical data to be expected from a top-class keyword analysis and statistics tracker, it also features a number of additional tools which put it ahead of most of the pack where ease-of-use and depth-of-information are concerned.

1. The Map Overlay

Essentially, this feature brings up a map of the world, highlighting the countries a site's visitors stem from. Clicking on a country produces a close-up view, along with a geographical breakdown according to the region and/or city from which visitors accessed the site. This tool in itself is invaluable for all those webmasters with geo-specific sites, concentrating on a particular catchment area.

2. The Site Overlay

This is conceivably Google Analytics' single most important feature from a webmaster's or online business owner's perspective, as it provides a hands-on view of visitor behaviour. When clicked, 'Site Overlay' opens the tracked web site in a new window and, after a moment's loading time, overlays each link on the screen with a bar, containing information about clicks to the target page and goal values reached [more about goal values in a moment]. Since it allows the webmaster or site owner to navigate his or her site and see exactly how visitors flow through it, it is difficult to imagine a more effective tool than this as far as raising a site's conversion rates is concerned.

3. Goals and Funnels

Unless the site being tracked is an information site which does not rely on generating sales or enquiries, conversion rates are as important as sheer visitor numbers. The 'Goals & Funnels' feature allows users to set up specific goals for their site, such as tracking a visitor to the 'Thank you for your enquiry' page for instance. It also allows the user to set up specific monetary values for each goal, and thus track the site's financial performance and profitability during any given period of time.

The term 'Funnels' refers to the specific path a visitor takes to reach the goal's target page. Since most web sites sell a number of different product ranges or feature a number of ways to enquire, all of which lead to a single 'Thank You' page, the funnel allows for the tracking of each individual path with a minimum of fuss.
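As a sketch of how these pieces fit together (the page names and value are invented for illustration), a goal with its funnel amounts to an ordered list of URLs ending at the goal page:

Funnel step 1: /product-page.html
Funnel step 2: /order-form.html
Goal page: /thank-you.html (goal value: 25.00)

Analytics then reports how many visitors entered the funnel, how far along it they travelled, and at which step they abandoned the process.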

4. Graphical Representations

A great many visitor trackers out there will present the collected information in a certain way, be it a list, graph, pie chart, flow-chart or whatever. Whilst all these methods of presentation are of course valid, users nevertheless differ, and a pie chart is not necessarily ideal for someone who prefers to work with graphs, or vice versa. Google Analytics, however, allows users to choose between views on many of its reports. Although this may seem like a relatively minor point, it nevertheless makes things easier, as it allows the user to work with the view he or she is most comfortable with.



In Conclusion:

Google Analytics provides webmasters and site owners with a highly effective means of tracking visitors and analysing statistical data, easily the equal of most subscription based services in the industry.

Although some concerns have been voiced amongst more paranoid internet users that Google puts everyone's collective data to its own evil demographic uses, there really are precious few reasons not to recommend this fantastic tool as one of the best means to boost any web promotion and marketing campaign.

Fluff vs. Quality Content
By Devin Hansen (c) 2007

There are basically three types of content you can use for your site. Fluff, leased, or custom. All three have their pros and cons, but which would work best for ranking well with Google and the other major engines? Let's first explore the definitions of this varied content:

Fluff: Written cheaply by non-native-speaking writers, and used to fill up a web site with inexpensive content.

Leased: Identical articles that are well-researched and written, but sold to numerous web sites.

Custom: Well-researched, authoritative content that is tailored specifically to meet the needs of you and your business.



Fluff content is fine for businesses just starting out. It helps you to at least get a place in the race to the top of the search engines, but for long-lasting results, fluff just won't cut it. The wording is often choppy and incoherent, and doesn't achieve your primary goal, which is customer conversion. Also, if the content of your site is sloppy, it will not instill confidence in a potential customer.

Leased content works well because it is professionally written, topical, and easy to find.

Search engine algorithms favor content with strategically placed keywords and phrases, but those words and phrases must also be embedded in text that is lean and carefully crafted for consistent results.

The drawback of leased content is that it can be found in a wide variety of other websites and cannot meet the unique needs of your business or specifically target the audience that you want to attract.



Custom Content

Custom content is content that has been professionally crafted to feature the keywords and phrases that you and an SEO expert have chosen to rank well with search engines and attract your target audience. The strengths of custom content are:



Specificity

You can consult a copywriting firm to construct your content exactly the way you want it: to convey the unique products and services your business offers and to organically build the rank of your site, which leads to lasting results.



Readability

Custom content will engage readers and invite them to read further, enticing them to linger at your site and explore the other content.



Credibility

Custom content immediately lends legitimacy and lasting brand recognition to your site because discerning readers can see that you have taken the extra steps to tailor your message specifically to them.



Lasting Results

Web statistics consistently suggest that the best way to earn placement on that key first page of search results and retain your ranking is customized content.

As search engine bots become more and more sophisticated, keyword stuffing and other gimmicks get sniffed out and dismissed because they do not offer the reader any rewards for investing their time.

So, how do I hire a quality content writer?

Sure, anyone can write and practice keyword stuffing. You see it on hundreds of sites every day: fluff content that was written cheaply, and reads cheaply. Even the most basic conventions of writing are abandoned, simply to reach a high word count. Because of this, readers are having a hard time finding good, quality content. They want information, not gobbledygook.



Ask for Samples

The first rule in hiring a good content writer is reviewing their work. Ask for writing samples, as well as references. They should know the basic conventions of writing, and excel in creating informative, easy-to-read content that people will understand.



Work Ethic

Like any professional, a good content writer should also have a good work ethic - meaning they respond quickly to emails, meet deadlines, and keep in constant communication with the customer. People who are conscientious and prompt in their correspondence are likely to be quick and efficient in their work. This reduces the chances of procrastination as well. A good content writer will use the entire time to work on an assignment and produce good, thorough copy, while a sloppy or lazy writer will wait until the last moment and squeak in an unpolished product right before the deadline.



You Get What You Pay For

As the old saying goes, "You get what you pay for." There are plenty of desperate writers out there who will work for peanuts, but hiring a more proficient writer at a higher rate is an investment, and it will bring much better returns. If you need quick, cheap content, then there are plenty of people willing to produce it. But again, if it is written cheaply, it will read cheaply.



Knowing the Audience

A good content writer should also have a feel for their audience. Any good writer can complete an assignment, but someone that is in tune with their audience can connect with readers much better by tailoring their copy specifically to them. A sympathetic writer should be able to imagine a piece of writing from the audience's perspective and detect what that reader wants or needs from it. This comes from in-depth interviews with the client, and really learning what message they want to convey to their readers.



Trustworthy

Lastly, a good content writer should be trustworthy. While representing a company or employer, a writer must be privy to certain information in order to write effectively. Make sure the writer you are hiring has a good business ethic and won't turn his back on you or exploit your ideas once he is gone. Although it is possible to work with someone and still withhold sensitive business tactics or information, it is much easier to work with someone that can be trusted in an open correspondence. And even if you do trust the writer, it is always smart to get a signed contract.

Maximizing the Triangle of Relevancy With Google
By Sydney Nelson (c) 2007

The "Triangle of Relevancy" is used to describe the relationship between the text in a landing page, a sponsored advertisement and the keyword or phrase that's entered into a search engine. Google places a premium on relevancy as it endeavors to ensure visitors have a positive experience by getting search results relevant to their search terms. I will outline specific steps an advertiser can take to maximize their landing pages and sponsored advertisement's effectiveness in their search engine marketing endeavors.



Relevancy with Landing Pages

The product, if you will, of any search engine is the resulting landing pages. The page's relevance to the search terms determines whether the page will show up in a search and at what position. Google's algorithm scores each page and/or sponsored ad's relationship to the keywords or phrase and uses this information to assist in determining the order in which the landing pages and AdWords ads are placed. The algorithm also monitors the amount of time a visitor spends on a page and includes this in the score.

Search engine optimization (SEO) techniques such as placing keywords in the page's title and throughout the body of the page can sometimes affect the position of a page in the search results. But of greater importance to Google's algorithm is whether the keywords are actually present on the landing page, and whether they have simply been scattered at random to increase the keyword's density on the page.

A common scenario is for Web developers to design a number of landing pages for the same product specific to certain keywords. Using this method you can end up with 10 or more landing pages for each of your products. This can be expensive, time consuming and difficult to maintain as regular updates are required on each page.

This can be much more efficiently accomplished by using a product entitled Search Chameleon. This product uses scripting on a landing page and a related sponsored ad to adjust the text in the landing page in REAL TIME according to the keywords entered in the search bar. The scripting can be used in the page's title or anywhere in the body. This not only saves development time but makes page updates much simpler since you're only working with one page.

It also ensures your page will be relevant to the search regardless of the search term entered. This can be a compelling factor in a visitor's decision to spend more time on a landing page. An advertiser is then able to maximize the relevancy of their landing pages by automating previously manual processes.
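Search Chameleon's own code is proprietary, but the underlying idea - reading the search phrase out of the referring URL and writing it into the page - can be sketched in a few lines of JavaScript. This is a simplified illustration, not the product's actual code, and the "headline" element ID is invented:

<script type="text/javascript">
// Look for a q= parameter in the referring URL (Google places the
// search phrase there when a visitor clicks through from a search).
var match = /[?&]q=([^&]*)/.exec(document.referrer);
if (match) {
  // Decode plus signs and URL escapes back into a readable phrase.
  var phrase = decodeURIComponent(match[1].replace(/\+/g, " "));
  // Write the phrase into the page; this script must appear
  // after the element it updates.
  var el = document.getElementById("headline");
  if (el) el.innerHTML = phrase;
}
</script>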



Relevancy with AdWords and Sponsored Advertisement

The "Triangle of Relevancy" would not be complete without the search terms being included in the title and/or body of your sponsored ad. Google and most search engines will highlight the search terms in the sponsored ad anywhere it shows up. This allows your ad to stand out and draws attention to the visitor that your ad is relevant to their search.

So, instead of loading your AdWords campaigns with numerous non-relevant keywords, your best bet is to use a single keyword or phrase that's relevant to your ad, allowing it to show up in both the title and the body of the ad. This means you should write several ads specific to a keyword or phrase for your AdWords campaigns. This not only makes your ad more relevant but it pre-qualifies your prospect, as the ad contains the specific key terms they're searching for.

Another way to really boost your sponsored ad's visibility is to have the keyword or phrase in the destination URL at the bottom of the ad. If you're using an affiliate link, you may not get as good a click through rate as with a non-affiliate domain, because people will respond more favorably to your ad if they think you're the product owner.

The best way to show you're a professional is to use your own domain name as a redirect to your affiliate site. You can use the keyword or phrase in a successful ad as the domain name and your keyword will be highlighted in the title, the body of the ad AND in the destination URL!

The second best way to show you're a professional is to use a keyword as a sub-domain for a domain you already own, i.e., http://keyword.MyDomain.com. Notice the keyword is in front catching the eye of the prospect first. An alternative would be to add the keyword as a landing page name, i.e., http://MyDomain.com/keyword.htm. Using these two methods works best when you have a generic domain name like http://123.com which will work with any product and does not conflict with the keywords.



In Summary

The "Triangle of Relevancy" is the most important aspect of a successful search engine marketing strategy. Google is very careful to ensure their visitors have a positive experience with their search engine so they reward the more relevant advertisers with a higher position in the search results and their AdWords ad placing. Both the landing pages and the AdWords ads should focus on specific keywords or phrases for maximum relevancy.

As previously highlighted, Search Chameleon will allow you to customize a single landing page, which will update the page title and body text with the specific keywords or phrase a visitor enters into the search engine. Search Chameleon is a proprietary application included in a suite of B2B productivity software called PromoBlackBox. Also included are Google AdWords training CDs developed by a top Internet marketing company. There are a number of additional proprietary applications and software included that will assist advertisers in maximizing on the triangle of relevancy.

The search engine marketing landscape is continually evolving as new technology is introduced. Search engines are continually updating their processes as developers learn how to counteract them. One thing that probably won't change is the triangle of relevancy with the search term, the sponsored ad and the landing page. People will always want specific answers to specific questions.

Wednesday, July 18, 2007

5 Tips to Effective SEO Keyword Research Analysis
By Valerie Di Carlo (c) 2007

Keyword research and analysis can be a daunting task when done correctly, and expert keyword research is the foundation of a successful SEO campaign. Many new website owners think the keyword research analysis process is easy. They think free tools, such as the Overture Search Term Suggestion Tool, are the profit pill that will bring them instant results.
Unfortunately, the free tools will only give you a rough guide and a quick indication whether a hunch is worth further research. These free keyword research tools are limited to basic information. When performed correctly, expert keyword research exposes so much more - all the gems that are tucked away deep.

Real keyword research requires research AND analysis. There are so many aspects to the process that cannot be left to chance. Attempting to do the keyword research on your own is like going to a veterinarian to fix your car. My advice to all clients I provide SEO consulting services for is to simply leave this task to the experts, who have the correct keyword research tools and expertise.



Following are 5 tips for effective keyword research analysis:


1. Latent Semantic Indexing (LSI) - Use multi-word phrases

Latent Semantic Indexing (LSI) is a vital element in Search Engine Optimization (SEO) for better keyword rankings in search results. LSI is based on the relationships between terms - their "clustering" or positioning, the variations of terms, and the iterations of your keyword phrases.

Knowing LSI - how it can be most useful and beneficial for your SEO, and the importance it has in the algorithm updates of search engines like Google, MSN and Yahoo - will benefit your keyword research for best-practice SEO.

LSI is NOT new. Those doing keyword research over the years have always known to use synonyms and "long tail" keyword terms, which is a simpler explanation of LSI. More often than not, these long-tail, less generic terms bring more traffic to your site than the main keyword phrases. The real bottom line is that Latent Semantic Indexing is currently a MUST in keyword research and SEO.


2. Page Specific Keyword Research - Target your niche keyword phrases for each site page

Probably the most common mistake in keyword research is using a plethora of keywords and pasting the same meta keyword tag on every web site page. This is SO not effective! Your keyword research needs to be page specific, focusing on only 2 to 5 keywords per page. It's more work, but combined with best practice SEO, it gives each site page a chance to rank higher on its own.
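As a hypothetical illustration (the pages and phrases are invented), a site's page about wedding photography prices would carry only its own niche phrases:

<title>Wedding Photography Prices - Packages and Albums</title>
<meta name="keywords" content="wedding photography prices, wedding photo packages">

while the same site's page about engagement shoots would carry a completely different title and keyword tag of its own.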


3. Country Specific Keyword Research and Search Engine Reference

Keep in mind that keyword search terms can be country specific. Even though a country is English speaking, there are different keyword terms you must research - and you should then reference that country's search engine when doing your initial keyword research. For instance, the UK and Australia may have different expressions, terminology and spellings (e.g. colour, personalised). Referencing the terms in the corresponding search engine is an important element of keyword research that is often forgotten. So, for example, be sure to check the search terms on google.co.uk or au.yahoo.com. And, of course, if you have 3 to 4 really comprehensive research tools in your arsenal, you will be able to search for historical, global and country specific search terms easily and effectively.


4. Keyword Analysis - Cross referencing in the search engines

Once the majority of the keyword research has been done for a site page, it's time to plug those terms into the search engines to determine:

- If it is really the desired niche keyword for that page

- To assess the competitiveness of your keywords, and alongside it the strength of the competition

- Are the other sites listed for your keywords truly your competitors?

- Are the sites listed for your keyword even related to your industry, products or services?


These critical analyses of keyword phrases are often forgotten. Since keyword research and analysis form the foundation of a successful SEO campaign, you certainly don't want to build your on-page optimization on the wrong niche keywords!


5. Ongoing Keyword Research - Repeat your keyword research on a consistent basis

While you may think that you have completed your keyword research analysis and laid a solid foundation for your SEO, you need to keep monitoring your keywords and tweak as necessary. Keywords can change from month to month as search terms shift, genres change or, if your niche sits within social networking portals, trends move on - to name just a few possibilities. Maintaining ongoing keyword research is essential for best practice SEO.

Most Successful Strategy to Streamline Your Keyword Research Efforts:

Yes, many website owners will opt to do the keyword research and analysis themselves with only a marginal effect on an SEO campaign. It's not the most successful strategy to use for the most effective results.

To be certain of your keyword data, accurate keyword analysis should be performed - and cross referenced - across multiple expert keyword tools.

Effective keyword research lays the ground work for effective SEO results and can help you kick-start the ranking process - perhaps even giving you a step up on your competitors.

The most successful strategy to streamline your keyword research efforts is to hire an expert. Focus your business efforts on your strengths and expertise and allow the SEO experts to effectively perform the keyword research analysis correctly.


About The Author
Keyword Research Analysis expert Valerie DiCarlo helps companies large and small - worldwide - enjoy a long term improvement in website visibility, increased brand awareness, a continuous flow of new sales leads and higher revenues. To discover how you can improve rankings across multiple keyword phrases and search engines, go to: http://www.seo-web-consulting.com

Monday, July 16, 2007

10 Steps To Top 10 Rankings In Google
By Titus Hoskins (c) 2007

Most webmasters go totally "gaga" for top 10 rankings in Google. And for good reason: Google is the most dominant search engine on the net and will deliver the largest amount of traffic.
More importantly, those same webmasters will also inform you that getting top 10 rankings in Google often means your site will prove profitable, mainly because obtaining targeted traffic is usually the first obstacle in creating a viable online business. In other words, if you get top ten listings in Google for good searchable keywords, it is almost impossible not to earn money.

How To Proceed?

First, you must know the rudimentary basics of how keywords work. Keywords and keyword phrases are the exact words someone types into a search engine to find what they're looking for online. If you have a site on "dog training" then your goal is to get a top 10 ranking for the keywords "dog training".

Now if no one searches for "dog training", it would be a useless keyword; you would get no traffic no matter how perfectly your site is optimized for it.

How Do You Know If A Keyword Is Good?

To find out, you have to do some keyword research on your particular keywords. Many professional online marketers use keyword research software like Brad Callen's Keyword Elite. However, you can also use the keyword suggestion tools supplied by Google Adwords or Overture. Try here: http://www.digitalpoint.com/tools/suggestion/

Now if you check "dog training", you will find it receives around 4,469 searches each day. That's a lot of traffic but you must realize that it may be too good, or rather too competitive for your purposes, especially if you have a new site.

Biggest Mistake When Choosing Keywords

The most common mistake most novice webmasters make is targeting keywords which are too competitive. You simply will not be able to compete or place for extremely competitive keywords. Well established sites and businesses with very deep pockets have the resources to completely dominate those keywords.

While concentrating your efforts on highly competitive keywords is not entirely futile, nor a waste of time, you will have better success if you target low to medium competitive keywords.

Long Tail Keyword Marketing

Besides, online marketers have discovered that longer keyword phrases are usually the most lucrative. These phrases deliver traffic which is better targeted and more likely to convert into a sale. "Dog hunting training", which gets around 100 searches a day, will be more targeted than the general term "dog training", and if you have a site devoted to training hunting dogs, this keyword phrase may convert better for you.

Always keep this "Long Tail" keyword strategy in the back of your mind as you implement the following steps to achieve your own Top 10 Rankings in Google.

1. Make A Master Keyword List

Your first step is to make a master list of the keywords you wish to target. Obviously these should be closely related to the theme of your site. Check the keyword competition by seeing how many sites are listed in Google for that keyword. Webmasters should also check the Google PageRank of the sites that hold the top 10 positions. If all those sites are PR6 and above it may be hard to get ranked high for your keywords.

2. Choose Related Keywords

Once you have your master list of keywords, find long tail related keywords to target. Again, check out the competition and daily searches made for each chosen keyword.

3. Use Quality Content For Your Keywords

Creating quality content should always be your main goal. Write for actual visitors who will see and read your content. First and foremost you must have good useful content that your visitors will use themselves and recommend to their friends or colleagues. Tie this quality content in with your chosen keywords. Use one keyword phrase per page.

4. Keyword In Domain Name, Title and URLs

Having your keyword in your domain name will score big points with search engines. Plus, each page of content should contain your keywords in the title and meta tags for that page. Most experts also suggest you have your keyword in the URL and use hyphens to separate your keywords, although the author has gotten good results using an underscore and .htm in URLs.

Example: www.yoursite.com/your_keyword.htm

5. Do On Page Optimization

Keyword ratio is a much-discussed topic among SEO experts, and many suggest you should have your keyword in the H1 or headline title of your page. Sprinkle your keyword and variations of it throughout your page. Don't overdo it, but make sure the robots/spiders will clearly discover what your page is about. Many webmasters make sure they include their main keyword in the first and last 25 words on their pages.
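As a hypothetical fragment (the wording is invented), a page targeting "dog training" might open like this:

<h1>Dog Training Basics</h1>
<p>Dog training does not have to be difficult. This guide covers the dog training methods most owners actually need.</p>

with the phrase appearing once more, naturally, among the page's closing words.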

6. Use Traffic Modules

One technique that works extremely well in Google is clustering a closely related topic or subject into a distinct separate section on your site. For example, if you have a marketing site, you could create a whole section on article marketing where you would have 50 to 100 keyworded pages all relating to your subject. Writing articles, formatting articles, submitting articles, article software... place a keyword linked menu on each page to connect all your pages together.

Keep in mind, your main objective is to supply quality information to your visitors. One reason Google may favor this type of structure is because they want quality content returned in their SERPs.

7. Try Article Marketing

Article marketing is writing short informative articles on keyword topics related to your sites. You then submit these helpful keyworded articles to ezine directories on the web. When your articles are picked up by related sites, you receive quality One-Way links. The higher the quality of your article, the more links you will receive.

Another ranking tactic to use: If you're just starting out your site will probably have a low PR rank and you will find it hard to rank for even modest keywords. That's why it's useful to take advantage of the higher PageRank of the major ezine directories. Your keyworded articles on these high PR sites will get picked up by Google and displayed in the top 10 rankings. Now the displayed URL will be the article directory site but the links in the resource box will be pointing back to your site. Over time this article marketing technique will raise your own site's rankings for those keywords. Simple but effective.

8. Anchor Text And One Way Links

Off page optimization is important in obtaining high rankings in Google, and getting quality One-Way links matters most. Anchor Text simply refers to "the underlined clicked on words" in your links. Most webmasters include their keywords in their anchor text, as this tells the search engines exactly what the links are about.
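Example (a hypothetical link, reusing the URL pattern from step 4): <a href="http://www.yoursite.com/dog_training.htm">dog training guide</a> - here "dog training guide" is the anchor text, and it is that phrase, not the raw URL, which the search engines associate with the target page.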

9. Tags, Blogging And Web 2.0

Take advantage of Web 2.0 by using blogs, RSS feeds and the social bookmarking sites like Reddit and Digg. Try AddThis.com for a simple social bookmarking system. At the very least your site should have a blog and RSS feed attached to it as this is an effective way of boosting your keyword rankings.

Tags have become very important for getting higher rankings. Keep in mind, in free blogging software such as WordPress, categories will automatically be seen as tags. Blogger, which is owned by Google, now has a form where you put your keywords (tags) for each post you make.

10. PPC vs Organic Search

Of course, one of the fastest ways to get your links displayed on Google is to pay for them by using Google AdWords. Your ad and links will sit side by side with the organic link results. In Pay Per Click advertising you bid or pay so much per click for your keywords, and you only pay when someone clicks your links. But smart marketers also know that, since you're getting millions of impressions advertising your products, name recognition and branding can be a major side-benefit of PPC advertising.

However, most webmasters would say that organic listings (SERPs) will return better traffic than paid links or advertising. In most cases this may be true, because Google's organic rankings are becoming more respected and more trusted by users. They simply carry more weight with surfers.

This makes it even more beneficial to obtain top 10 rankings for your keywords in Google. Depending on the competitiveness of your chosen keywords, reaching the first page listing, or even the favored number one spot, is well within any webmaster's reach. Just go for it. The rewards are well worth your efforts.


About The Author
The author is a full-time online marketer who credits his high rankings in Google as the major source of his online income. For the latest web marketing tools try: MarketingToolGuide.com . For the latest Internet Marketing Strategies go to: BizwareMagic.com . Titus Hoskins. This article may be freely distributed if this resource box stays attached.

3 Top Tools and Services for an Effective Internet Marketing Solution
By Sydney Nelson (c) 2007

Everyone knows the Internet is growing exponentially and continually evolving. The major players in the continual evolution of the Internet are the major search engines and consumer generated media such as blogs. Because of the continual changes, a well rounded Internet marketing strategy must include a variety of proven marketing options to ensure success. I will outline 3 of the most effective Internet marketing solutions available and how they relate to your marketing strategy.
Search Engine Optimization

Search engine optimization (SEO) comes first because the free listings afforded by the search engines are partly determined by on-page optimization techniques. With thousands of new Websites being created daily, the competition for a listing on the first search result page is increasingly saturated for the popular keywords. Statistics show better than 90% of people only look at the first search result page, and a similar percentage only click on the top search listing.

In order to compete effectively, a Web page must be designed around a keyword or more effectively, a keyword phrase. The keywords must be included in the page's title, keyword META tag, page's description, the first heading (using the H1 HTML tag), throughout the body of the page and within the last 25 words on the page. These are just a few of the basic SEO on-page options that, although will not guarantee a first page listing, are definitely required as a starting point in an effective Internet marketing strategy.

In most cases, there is too much competition for generic keywords such as "golf clubs". A better strategy would be to use "golf clubs Chicago" or "handcrafted golf clubs". The more specific you make your keyword phrase, the better. A number of free tools are available that will show the popularity of keywords and how often they are used in the search engines on a monthly basis. These will allow you to customize your Web page knowing the keyword phrase's popularity. If you get a first page listing for a keyword with less than 100 searches per month, for instance, it won't help much, because your traffic will be very limited.
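As a hypothetical illustration of the on-page elements listed above (all wording invented), the head of a page built around "handcrafted golf clubs" might look like this:

<title>Handcrafted Golf Clubs - Custom Built For Your Swing</title>
<meta name="keywords" content="handcrafted golf clubs, custom golf clubs">
<meta name="description" content="Handcrafted golf clubs built to order by master club makers.">

with <h1>Handcrafted Golf Clubs</h1> as the first heading in the body and the phrase repeated among the last 25 words on the page.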


Link Exchanges

Once the on-page optimizations are complete, the off-page optimization options need to be addressed. Link exchanges are perhaps the best off-page SEO technique available. If you download and install the Google toolbar on your browser, you will see the Google PageRank of each site you visit. Your PageRank is determined by the number of other sites linking to your site, and the QUALITY of each link is a major consideration. Pages are ranked from 0 to 10. The higher the PageRank of the site linking to yours, the better. Sites with a PR7 and above are considered authority sites, and a back link from one will send Google's spider to your page on a regular basis.

There are a number of ways to get a back link to your site. You can email a request to the Webmaster for a link exchange and if they agree you would return the favor by placing a link to their site on your page. You can also purchase the link. A number of sites offer this service for a monthly fee. So if you want to quickly get your site listed by Google and the major search engines, then paying for a PR8 back link for a month or two might be well worth the investment in the long run, as new sites are placed in a "sand box" by Google for several months until they prove their longevity.

One back link from a PR8 site is worth 1,000 back links from PR2 or lower sites. As a matter of fact, numerous links from low ranking sites can actually be detrimental to your site, because each back link is like a vote for your site. If you have too many low ranking sites voting for yours, Google will be reluctant to reward your site with a higher PR. You also have to research any site that offers to link to yours, because if it has been blacklisted by Google, your site will receive the same fate, and you will probably never recover from it.

A PR4 or PR5 is relatively easy to attain. So your best bet is to limit your back links to PR5 and above for the best results. And by all means avoid link farms. Before Google upgraded its algorithm, just the number of back links was considered. But now the actual PageRank of your back link is considered, AND the PageRank of the sites linking to THAT link. Links from link farms are now looked upon as basically spam links. You would be better off purchasing a listing in a major link directory like Yahoo or DMOZ. It's definitely worth the investment, as these are authority sites and a major vote for your site!


Blogs

As previously indicated, consumer generated media is a major factor in the evolution of the Internet. Case in point: the enormous effect blogs had on the last Presidential campaign. A very large percentage of searches will include blogs in the first page of listings. Over 50% of purchases, online and offline, are preceded by an online search for more information. And a large percentage of that information is offered by blogs. People are very interested in the opinions of others on their topic of interest.

The major blogs are updated on a daily basis, which matters to Google's algorithm, as it uses the frequency of updates (daily, hourly, even by the minute) as a determining factor in the search result ranking. A powerful marketing method used by the major blogs is to submit articles on a particular topic to the major article hubs such as EzineArticles.com. The article hub benefits by having continually updated information (which Google likes) where it can place its money-making AdSense ads. The blogger benefits by having a back link to their blog at the bottom of the article.

As numerous other bloggers and Webmasters access the article hubs for fresh information, the article can be syndicated virally to a number of sites on the Internet. This creates a number of back links to your blog or your site, generating an enormous amount of pre-qualified free traffic.


In Summary

These are just a few of the most effective online Internet marketing options available. Other options such as auto-responder email, Ezine advertising and pay per click advertising should be included in a well rounded Internet marketing strategy. No matter what options you use, you definitely need to start with your on-page SEO techniques. Your listing in the search results will be enhanced by a properly optimized page.

Further optimization must include the off-page SEO techniques, such as generating back links with the major site directories, link exchanges and articles submitted to the major article hubs. Using a blog as a reference in your article can be a good thing, as people will be more inclined to visit a blog since it's not a sales page. Once they have received enough information and you have gained their confidence in your opinions, you can end up with a lifetime customer.

Monday, July 2, 2007

For SEO Beginners: Twelve Definitions You Need To Know
By Mike Tekula (c) 2007

SEO is a trade that exists solely on the internet, and even then it is composed almost entirely of the hot air of so-called "expert opinion" - plenty of which is blowing around these days, as search maintains its position as one of the most important marketplaces in the modern business world. Many DIY webmasters will end up searching for blog entries, articles, informational web sites, etc. to help get them up to speed. The problem is that in most cases certain key terms are flung around like household names, while the people doing the flinging are way out of touch with the average web browser. What some of us don't realize is that not everyone knows even the basics of SEO.
This list of twelve SEO-related definitions in alphabetical order (with notes) serves as a great companion for your initial SEO reading. Read alone, it will get you up to speed on some key terminology that you'll need to know to intelligently engage the ever-changing world of SEO.


- Algorithms.
A search algorithm is, in short, the incredibly complex mathematical formula that a search engine uses to "rank" web sites for keywords. Based on a huge number of variables and calculations, algorithms are among the most closely-guarded secrets on the internet. Why? Imagine if they were leaked - suddenly the less-than-honest would have a very specific guideline to follow in climbing to the top of search results in a less-than-organic way, ruining the quality of Google's search results and their entire competitive advantage with it.


- Bot or Bots. See also "crawlers."


- Crawlers.
Googlebot, for example, is a search engine crawler. Googlebot periodically traverses the web in record time, indexing content, links - everything contained in page source code - and storing it in Google's search index. Then, when a user visits Google and enters a search phrase, what the user actually searches is that index, filtered by the algorithm. Please note: there is some delay in this process, since the results you're getting come from the index and not the live web.


- Directories.
When webmasters realized just how much power inbound links have in determining search rankings, they quickly set out to do two things: 1) get inbound links and 2) set up web sites where other webmasters could obtain inbound links (meaning big traffic and ad revenue for the site). Hence the directory farms you'll find today. Link building has been a priority on the list of any SEO-savvy webmaster for years, and as a result "quick fix" directories that allow streamlined listing submissions get a ton of traffic. However, Google and the other major search engines are on to this tactic, and the word among SEO "experts" is that the benefits of listing your site at directories are diminished, if not gone.


- Frames.
Frames are a way of laying out a website with multiple documents in one browser window. Essentially, there is one main document containing the frameset tag; this document specifies the dimensions and placement of the frames, and also the documents that will "populate" those frames. From an SEO standpoint, the use of frames for your layout is not recommended. Because framed layouts handle links differently - a link in one frame typically loads a document into another frame - they can cause serious problems for crawlers. Additionally, there are almost no uses for frames that can't either be 1) duplicated with other methods or 2) thrown away without much fuss. If your site was built with frames and you don't want to rebuild, it might be tough luck if you're interested in optimizing for search. Consider it a learning experience - build yourself a CSS-based layout.
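For illustration, here is a bare-bones frameset of the kind described above - a minimal sketch, with hypothetical file names:

    <!-- Main document: defines the layout but contains little indexable content -->
    <frameset cols="20%,80%">
      <frame src="menu.html" name="menu">
      <frame src="content.html" name="main">
      <noframes>
        <!-- Often the only content a crawler can make sense of -->
        <body><p>This site uses frames.</p></body>
      </noframes>
    </frameset>

A link inside menu.html would point across frames with <a href="page.html" target="main"> - exactly the cross-frame linking that trips up crawlers.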


- Gateway Pages.
Also "doorway pages." Although there isn't a real consensus about what these pages are, their function is always cited as their definition. In other words, these pages are created to "rank well in search engines" by playing to the algorithms. Often viewed as "spammy," "gray hat" or even "black hat." However, any page written with search in mind, and geared towards search, can be construed to be a "gateway page." The difference between a page well-optimized for search and a "gateway page?" No clear lines there, but quality of content is probably the determining factor.


- HTML.
Okay, most of you probably know this one, but some of you may not. HTML stands for HyperText Markup Language, and it is the core building block that has made the web the greatest modern tool for business, social, informational, political and countless other causes. Search engines look almost exclusively at a web page's HTML code to determine its relevance. Therefore, it's a good idea to pay attention to HTML and familiarize yourself with proper tagging techniques if you're hoping to get a good handle on SEO.
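As a quick refresher, a properly tagged page looks something like this minimal sketch (the content is placeholder text):

    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
      "http://www.w3.org/TR/html4/strict.dtd">
    <html>
      <head>
        <!-- The title tag is one of the strongest on-page signals -->
        <title>Descriptive Page Title</title>
      </head>
      <body>
        <!-- One main heading that reflects the page topic -->
        <h1>Main Topic of the Page</h1>
        <p>Body copy goes here, written for readers first.</p>
      </body>
    </html>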


- Link Popularity.
Inbound links are probably the most important optimization point for web pages. Number, quality, trust - these are all factors that affect the value of an inbound link. Going back to the HTML root of search, link popularity (in terms of quantity) measures how many pages point to your site using anchor text (the clickable link text).
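To see why anchor text matters, compare these two links to the same hypothetical page - the first passes a clear relevance signal, the second almost none:

    <!-- Descriptive anchor text: tells engines what the target page is about -->
    <a href="http://www.example.com/blue-widgets.html">blue widgets</a>

    <!-- Generic anchor text: carries almost no relevance signal -->
    <a href="http://www.example.com/blue-widgets.html">click here</a>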


- Link Building.
In short, the process of gaining links at other web sites pointing in to pages on your own.


- Link Baiting.
The process of generating high-quality content on your pages that users will appreciate and link to voluntarily.


- Meta Tags.
Meta tags are found at the top of a page's source code, inside the head section. They are used to specify information that might not be found in the page content, and they allow webmasters to put up certain "flags" that search engine crawlers react to. Many Meta tags are available, and several can help with SEO for a variety of purposes. However, Meta tags are no longer used the way they originally were - as a place to stuff keywords to drive your site up in the rankings. Some webmasters out there are still doing this, but they are decidedly behind the times and unaware of the impending, or already cast-down, penalties.
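As a rough sketch, here are the Meta tags most relevant to SEO, placed inside the head section (all values are placeholders):

    <head>
      <title>Example Page</title>
      <!-- Often used for the snippet shown in search results -->
      <meta name="description" content="A short, honest summary of this page.">
      <!-- Largely discounted by the major engines, but still widely used -->
      <meta name="keywords" content="example, sample, demo">
      <!-- A "flag" crawlers react to: index this page, but don't follow its links -->
      <meta name="robots" content="index, nofollow">
    </head>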


- Robots. See also "crawlers."


- Search Engines.
If you don't know what a search engine is, congratulations on finally making it out from under that rock. Search engines are essentially programs that scan an existing index of the web based on a query of search terms, or keywords, that a user enters. However, the word more commonly refers to the company as a whole: Google, for example, controls a search engine, while Googlebot is the crawler that gathers content for its index - but most users and webmasters think of a search engine as the whole package.


- Search Engine Marketing.
Most often this refers to pay-per-click marketing, in which an advertiser bids on chosen keywords and writes several ads to be displayed should their bid achieve placement. These ads appear in the "sponsored" section of search engine results pages (SERPs). However, in some circles this term is used to refer to any action taken to gain rankings, both paid and organic.


- Search Engine Optimization.
This one is open to interpretation and is often used to encapsulate a huge number of different tactics. On-site optimization, off-site optimization (link building, etc.) and many other techniques all feasibly fall under the SEO blanket. However, there is an obvious difference between optimizing a page's code to be clean and search friendly and writing link bait that will be popular and get linked to.


- Search Engine Results Pages (SERPs).
The pages resulting from a search engine query run by a user. Webmasters review these pages to determine where their pages are ranking for certain search terms.


- Spamming.
Basically, any unnatural effort to bring a page higher in search results. What constitutes spam is open to some interpretation, but the only interpretation you need to worry about is that of the major search engines. If Google, for example, considers a technique "spammy," you'd be wise to cease at once.


- Spiders. See also "crawlers."


- Submission.
For SEO this has traditionally meant submitting a web site to the search engines so they'll know about it and crawl it. SEO firms offered submission services as a big selling point to bring in clients. However, for a long time now, submitting your site to the engines hasn't done jack. They're all much smarter now - just focus on gaining quality inbound links and your site will be indexed in no time. If you still want to give the crawlers a head start, the sitemap sketch below shows the modern alternative.
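The Sitemaps XML format - the sitemaps.org protocol the major engines agreed to support in late 2006 - has effectively replaced manual submission. A minimal sketch, with a placeholder URL and dates:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- One url entry per page you want crawled -->
        <loc>http://www.example.com/</loc>
        <lastmod>2007-06-15</lastmod>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>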


This is just a sample of the core vocabulary associated with SEO. Is this all you need to know? Absolutely not. But in my experience these are the words and phrases that newcomers have the most trouble with. If these definitions help one person have a better understanding of SEO, then I will be satisfied.

Search Engine Optimization for Universal Search - Back to Square One?

Search Engine Optimization for Universal Search - Back to Square One?
By Scott Buresh (c) 2007

Organic search engine optimization, until recently, had been a fairly straightforward endeavor. The goal was to optimize the content on a website so that it would show up in the organic results on one or more of the major search engines - results composed of nothing but other websites. However, in the middle of May 2007, Google began rolling out its new "Universal Search," something it had been working on behind the scenes for several years. This new search option may have long-term repercussions for every search engine optimization company in the industry if it is preferred by the public at large and becomes the standard going forward.

What Is Universal Search?

Someone using Google's Universal Search will find that a query brings back results that encompass not only web pages, but also videos, blogs, images, news articles, and other media available online. While Google already had in place options for searching each of these areas individually, many searchers did not notice those options or did not know how to use them, a phenomenon that became known as "invisible tabs."

With Universal Search, there's no need to select a separate menu item - the search will return results that encompass many different types of media. For example, a search for "breakdancing" might bring up not only web pages about breakdancing, but also blog posts about it, videos showing technique, and news articles about it. It would not, however, give you the reason why you were wearing parachute pants and trapped in the eighties.

However, Universal Search hasn't been rolled out fully yet. Currently, certain terms will give Universal results, while other searches will remain the same as before. This is a classic Google move - roll something out gradually, see how it plays in the public eye, and then decide what to do from there. Basically, Universal Search as it exists right now is very likely to change, depending on user feedback.

And if the limited queries that now return Universal Search results do not garner positive responses, it's likely that Google will revert to its previous, webpage-dominated results. Google obviously doesn't want to lose market share, and it already learned a valuable lesson not long ago when it released a new algorithm that was poorly received and subsequently dialed back.

What are the Benefits of Universal Search?

Universal Search brings several benefits to searchers. A searcher no longer has to specify the media he or she is looking for - one keyphrase search will cover everything. And the results from a search will be more comprehensive in many instances, giving a well-rounded picture to the searcher that may include better information than would previously have been found in a search of just one type of media.

What are the Drawbacks of Universal Search?

The problem with Universal Search is that it can muddy the results, introducing irrelevant items that a searcher cannot use. A search for "Paris Hilton" (ever heard of her?) will bring up news, videos, and other information about the heiress. But it will also bring up a map of the city of Paris showing the locations of Hilton hotels - something most searchers who typed that exact phrase probably did not have in mind. Plus, 28% of Internet users are still using slow dialup connections (1), according to RVA Market Research. Many of these people are likely not interested in videos or other bandwidth-heavy results, and such users may turn away from Universal Search entirely - there are, after all, other search engines. No, really - there are.

In addition, there is no way to turn off Universal Search; as it exists right now, it is part of the standard "Web" search, eliminating the ability to search web pages alone and introducing a new wrinkle into search engine optimization efforts. Now a website is competing not only with other websites, but also with all the other media included in the results an average searcher sees. Universal Search also makes it difficult for Google itself to determine relevance across different types of media, since the factors that determine a web page's relevance are very different from those that would determine, say, a video's relevance.

What Can You Do Now to Make Sure Your Site Is Ready to be Found in a Universal Search?

Clearly, Universal Search will change how an SEO campaign is run if it catches on. But that is a real "if" - users' search habits are hard to change overnight, even if you are Google and you essentially define what searching is and how it works. If it does catch on, you'll need to analyze your industry and figure out which types of media might matter most for you. For example, if you are a real estate firm, images of the buildings and homes you are selling might become a very important part of your site, so you will want to add alt tags to each image so that not only does your site show up for certain keyphrases, but your images do as well (see the sketch below). If you are a business services firm, you might instead focus on news items produced by your company - press releases and white papers - and make sure those are available to search engine spiders and optimized for critical phrases.
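As a simple illustration, an image tagged with search in mind might look like this (the file name and text are hypothetical):

    <!-- The alt attribute describes the image to crawlers,
         and to browsers that cannot display it -->
    <img src="oak-street-colonial.jpg"
         alt="Three-bedroom colonial home for sale on Oak Street">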

If you are working with an outside search engine optimization company already, now is the time to ask what they plan to do in regard to Universal Search. Your search engine optimization company should at least have an awareness of the magnitude of this new way to search on Google and should be able to present you with some sort of plan of attack, even if they plan to wait to embark upon the plan until they know for sure that Universal Search is going to catch on. If you are looking into hiring an outside search engine optimization company to launch a new campaign, the same holds true - ask your contacts at the firm how they are planning to handle Universal Search. They should at least be familiar with the concept and have a general outline to present to you.

Conclusion

If you thought that it was just Google that was working on what it calls Universal Search, think again. Yahoo, MSN, and Ask, as well as several minor search engines such as A9, are all working on their own versions of a universal search that will display different media types. These versions are currently still in the testing phase, but they could be rolled out at any time. What all this means for you and your search engine optimization company is that the face of SEO will be changing dramatically over the next several months - or it won't. Only time will tell.

(1) http://www.birds-eye.net/directory/statistics/2007.htm - Accessed June 2007