Monday, April 30, 2007

How To Compete With The Big Boys

By Jerry Bader (c) 2007

Every business needs to do everything it can to stand out from the crowd, to differentiate itself from the competition. This is a major challenge for companies that sell substantially the same thing as their competitors.

The average business does not have the resources of a multinational corporation that often uses its substantial marketing muscle to buy market share or to drive competition out of the marketplace. Big business also uses its deep pockets to flood various media with advertising, making them a pervasive presence.

The Web has always been an egalitarian environment where smaller companies could present themselves using the same techniques as the big boys, and if these companies did it well they could stand side-by-side with their competitive behemoths.

One thing that small and medium sized businesses should take some comfort in is that many large corporations are notoriously poorly run, relying on brawn rather than brain to get the job done. Many survive because over time they have acquired huge resources, become oligopolies, or they use predatory marketing practices to stifle competition.

As the Web becomes more and more a multimedia environment, corporations are starting to use their financial resources, and inventory of commercial assets and programming (not to be confused with computer programming), to deliver their marketing messages. The question is can smaller businesses compete, and if so, how?



Slipstream Marketing

Dr. Max Sutherland, a marketing psychologist and professor at Bond University, has written about a concept he refers to as 'slipstreaming.' Anyone familiar with motor racing or even bicycle racing understands that slipstreaming is a drafting technique where a racer tucks in behind a front-running rival, reducing wind resistance and saving fuel and energy; with a quick move, the challenger can slingshot past the race leader.

The clever implementation of slipstream style marketing campaigns can allow you to blow by your competition by using the momentum of well-known and instantly recognizable campaigns.

Slipstreaming references a collective audience memory, a kind of shared consciousness. Skillful execution draws instant recognition and an "Oh I Get It!" reaction without a lot of wasted setup or groundwork.




"Give Me The Same Thing, But Different!"

The key, of course, is how you make your version different. What's the twist? Blake Snyder, a Hollywood screenwriter and author, writes about entertainment executives' constant refrain, "Get me the same thing, but different." What Snyder has learned, and what he preaches, is that movie moguls understand it's easier to get people to go to a movie they understand and that was already a success; the trick is making the new version different, that is, the same but different.

If you think slipstreaming is an easy way to be creative, you're wrong. Being different but the same is not as simple as it sounds, but success can depend on it. Done poorly, slipstreaming comes off as lame and imitative; done correctly, you appear clever and cutting-edge, and more importantly you deliver the marketing message in a way your audience will remember.

There is an endless variety of things you can slipstream: personalities, icons, slogans, music, advertisements, news events, pop culture phenomena, movies, television shows, commercials, and sporting events.



Personalities

One of our favorite personality slipstreaming techniques is the use of voice-over. It can be implemented as part of a video campaign or as a stand-alone feature. We have used sound-alike actors to portray Rod Serling, Sam Elliott, Steve Irwin, Paul Winfield, Tom Brokaw, and many others.

What makes this approach so valuable is that most people will relate to the voice as someone they know, or are familiar with, but not immediately recognize.

This method captures people's attention with the familiar sound of a famous voice but without the cost of hiring the celebrity. Often the voice does not even have to be that close to the original; it's the cadence, delivery, tone, and script that make people sit up and take notice.

Cutting through the jungle of advertising noise is a challenge for everyone in business and this technique is a very effective method of getting heard and being remembered.




Television Shows

Another slipstream technique we've used is to play upon the audience's knowledge of and familiarity with certain television shows. We have created Web videos, written scripts, added dialogue and composed music that remind people of the old 'Twilight Zone' series and the popular A&E show, 'City Confidential.'




Commercials

One of our most successful Web-promotion campaigns was the 'Multimedia Versus SEO Campaign,' where we took advantage of the well-known Macintosh Versus PC television commercials. Nobody needed an explanation or setup to understand what was going on in the commercials. We basically slipstreamed Apple's television campaign.




Slogans

Slogans are another resource for slipstreaming and if you think only small companies slipstream, think again. The A&E Network used the slogan "Time Well Spent" for many years, while The Comedy Network slipstreamed it with their own twisted version "Time Well Wasted" - the same thing, but different.




Music

With the popularity of Hip Hop music, the milk marketing board developed a series of commercials with dairy farmers rapping to a catchy Hip Hop tune while prancing around their farm animals. Hip Hop was also slipstreamed by Smirnoff in their Raw Tea campaign and 'Tea Partay' viral video.




Pop Culture

With the popularity of poker and the World Poker Tour, we developed a character in the style of Mike Sexton, host of the television show, for one of our projects. We've even created nostalgia radio-style audio pitches that hark back to the golden age of radio plays.




Movies

We created an entire campaign for a client based on the idea, "Life Deserves A Sound Track" where everyday situations were described in dramatic style with familiar voice-over announcers, which was our take on Will Ferrell's hit movie 'Stranger Than Fiction.'




Sports

We've created presentations using the personas of famous sports figures like Hall of Fame pitcher and broadcaster Dizzy Dean and Mel Allen. We created scenarios and scripts using the voices and personas of World Champion racecar driver Jackie Stewart, and one crazy script fashioned in the style of college basketball analyst Dick Vitale.




Conclusion

As you can see from these examples, there are an endless number of ways to take advantage of the public's shared experience. So the next time you need to come up with a new Web marketing campaign for your company, think like a Hollywood mogul: Come Up With Something That's The Same, But Different.

Let Google's Algorithm Show You The Traffic

By Titus Hoskins (c) 2007


Recently Rand Fishkin of Seomoz.org brought together 37 of the world's top SEO experts to tackle Google's algorithm, the complex formula and methods Google uses to rank web pages. This ranking formula is extremely important to webmasters, because finding out which factors Google uses to rank its index is often considered the Holy Grail of site optimization.
Google's ranking factors affect how and where you are listed in their search engine results or SERPs. Since obtaining top positions for your targeted keywords often spells success for your site, knowing Google's ranking factors can be very beneficial.

Every experienced webmaster knows Google is the main supplier of search engine traffic on the web; getting listed on the first page, or anywhere in the top 10 positions, for popular keywords will result in plenty of free, quality, targeted traffic.

Briefly listed below are some of the main ranking factors you should be optimizing your web pages for in your marketing. The majority of these ranking factors will be very familiar to most webmasters who take full advantage of any and every SEO tactic which will give their site an edge over their competition.


Here are some of the main ranking factors to consider:



1. Keywords In Your Title And On Your Page

Place your keyword or keyword phrase in the title of your page and also in your copy. Many webmasters use variations of their keyword on the page and also include it in the H1 headline.
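For instance, a minimal sketch of a page optimized this way might look like the following. The keyphrase 'red widgets' and the copy are made up purely for illustration:

<html>
<head>
  <title>Red Widgets</title>
</head>
<body>
  <!-- keyphrase in the H1 headline and again in the body copy -->
  <h1>Red Widgets for Every Budget</h1>
  <p>Our red widgets are hand-finished and ship worldwide.</p>
</body>
</html>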



2. Keywords In Your URL

Keep your page on topic and place your keyword in the URL. Use your keyword in the H2, H3... headlines. Place it in the description and meta tags, and place it in bold/strong tags, but keep your content readable and useful. Be aware of the text surrounding your keywords; search engines will become more semantic in the coming years, so context is important.
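Continuing the hypothetical 'red widgets' example from above (the URL, description and copy are all invented), the same keyword can be echoed in the URL, the meta description, a subheading and a strong tag:

<!-- hypothetical page address: http://www.example.com/red-widgets.html -->
<head>
  <title>Red Widgets</title>
  <meta name="description" content="Red widgets in every size, shipped worldwide.">
</head>
<body>
  <h2>Choosing Red Widgets</h2>
  <p>Most <strong>red widgets</strong> are sold in packs of ten, so compare pack sizes before you buy.</p>
</body>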



3. Create High Quality Relevant Content

Have high quality relevant content on your pages. Your content should be related to the topic of your site and updated regularly depending on the nature of your site.



4. Internal Onsite Linking

Internal linking is important to your overall ranking. Make sure your linking structure is easy for the spiders to crawl. Most suggest a simple hierarchy with links no more than three clicks away from your home/index page.

Creating traffic nodes, or clusters of related links within a section of your site, has proven very effective for many webmasters, including this one. For example, creating a simple online guide on a subject related to your site's topic can prove very beneficial. Keep all the links connected and closely related in subject matter, and don't forget to have occasional external 'anchor keyworded' links pointing to these internal pages instead of to your homepage. In other words, build deep links.
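As a rough sketch of the guide-style cluster described above (all page names are hypothetical), each guide page links to its siblings and back to the guide's index, and outside sites deep-link to the internal pages rather than the homepage:

<!-- navigation block repeated on each page of the hypothetical guide,
     e.g. on /red-widgets-guide/cleaning.html -->
<ul>
  <li><a href="/red-widgets-guide/">Red Widgets Guide (index)</a></li>
  <li><a href="/red-widgets-guide/buying.html">Buying Red Widgets</a></li>
  <li><a href="/red-widgets-guide/repair.html">Repairing Red Widgets</a></li>
</ul>

<!-- an external site deep-linking to one internal guide page with keyworded anchor text -->
<a href="http://www.example.com/red-widgets-guide/cleaning.html">cleaning red widgets</a>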



5. Only Linking To High Quality Related Sites

Don't forget to link to high-quality, related sites with good PR. Linking to high-quality sites shows the search engines your site is useful to your visitors. Build relationships within communities on the topic of your site. Be extremely careful not to link to bad neighborhoods, link farms and spam sites... when in doubt, don't link out!

Unless your site has been around for years and is well established and trusted by Google, careless outbound linking can have an adverse effect on your site's overall ranking. Linking only to high-quality content sites will give your site an edge over your competition.



6. Global Linking Popularity

One of the major ranking factors is the global linking popularity of your site. You should try to build plenty of inbound links from quality sites. One simple and effective way to do this is by writing articles and submitting them to the online article directories. Related sites will pick up and display your articles, complete with your anchor text links back to your site. These are often one-way links.

But don't just write articles to get links, write quality content that will help the reader first and the links will come naturally. Also remember an article is an extremely good way of pre-selling your products and gaining trust with your potential customers.




7. Anchor Text Is Very Important

Anchor text is an important factor you must not forget to use. Perhaps more importantly, these inbound links should be related or relevant to your site's topic, which will play an important role in your rankings. Don't ignore the text surrounding your links, and use varied anchor text links to avoid keyword spamming.

Keep in mind, as search engines become more semantic, the whole text of your article will probably be considered your anchor text, thus making articles even more important to your rankings.
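As a simple illustration, here are three hypothetical inbound links pointing at the same page, each with a different but related anchor text so the link profile does not read as keyword spamming:

<a href="http://www.example.com/red-widgets.html">red widgets</a>
<a href="http://www.example.com/red-widgets.html">where to buy red widgets</a>
<a href="http://www.example.com/red-widgets.html">a guide to choosing widgets</a>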



8. Number And Quality Of Your Inbound Links

Your inbound links should also come from related sites with high global link popularity. The more links you have from these popular, related sites, the higher your rankings will be. Many SEO experts suggest you should have a steady stream of new sites (inbound links) added each month to keep your rankings growing. These links will age and increase your rankings after 4 or 5 months. Both quality and quantity are important.



9. Reliable Server And Service

Like any business, Google is serving up a product (SERPs) to its customers, and this service must be continuous and available at all times. Make sure you have a good, reliable server, because any extended downtime when your site is inaccessible to the bots may be detrimental to your rankings. If it is down for over 48 hours, you could be dropped from the index. Ouch!



10. Duplicate Content Is A NO NO!

Make certain you don't place duplicate content on your site. This may affect your rankings and get your pages thrown into the supplemental index. Be careful not to use duplicate title or meta tags on your pages, as this will lower and dilute your internal page rankings, resulting in poor optimization.

Your overall SEO strategy should be to provide valuable, relevant content and links for your visitors and the search engines. Furthermore, as mentioned earlier, be extremely careful who you link out to from your site. Avoid spam sites, link farms and selling links. Although it is a bit outdated, the Google Toolbar will still give you a general overview of a site's PR, or PageRank.

These are some of the most common and important ranking factors Google uses to rank and display their search engine results. Optimizing your site or keywords for these factors can prove very beneficial and rewarding.

There are many more factors so you should use the link in the resource box below to get all the gory details. For any novice or experienced webmaster it makes for a fascinating read and is extremely helpful in tackling Google's complex ranking system or algorithm. Conquer it and an endless supply of free organic traffic is yours for the taking.

Green With Envy in the Google Game

By Bill Platt (c) 2007 Links And Traffic

Beginning on April 14th, 2007, a firestorm blew through the Internet community, with the search engine optimization (SEO) community burning the hottest. The embers were warm and waiting for a strong wind to blow and kick up the flames, but it took Matt Cutts, the Google engineer extraordinaire, to fan the flames with an off-the-cuff comment about "paid links."

The flames raged, and in most forums the wind quickly shifted, moving the firestorm back towards Cutts and Google. Thread Watch offered the most biting rebuttal to Cutts' comments: http://www.threadwatch.org/node/13925 and http://www.threadwatch.org/node/13941 .

Aaron Wall at Thread Watch is a respectable fellow, and he tore into Google with a ferociousness that I had not anticipated. Matt Cutts tried to answer some of Aaron's questions, but it seemed that Cutts' rebuttals only added more fuel to the fire.

I would not have wanted to be in Matt Cutts' shoes that week. Oh my, it was brutal!

Even on Cutts' own blog, where the "paid link" comment originally surfaced, Danny Sullivan posted a question that went unanswered, so Sullivan commented about it on his site.

Search Engine Watch even mentioned this issue and linked to additional forums where the debate was raging.




What Most Readers Took From Cutts' Comments

There were only a few readers who took Matt Cutts' comments as brotherly advice.

The vast majority of people were screaming that Google intended to exercise their "monopoly control" over the Internet to run all of their competitors out of business.

Generally, I am not a "reactionary" type of person. But for about an hour, even I had a ball in the pit of my stomach. It passed when I read a post that mirrored an opinion I have openly written about numerous times before: How does Google determine the "intent" of a person making a link? They can't!




Understanding The Nuances Of Similar Items

Some people suggest that I should be ashamed of myself for speculating about the future of Google's algorithms. There is even one clown who has suggested that I should fear mentioning Matt Cutts' name in an article, because I am bound to draw Cutts' ire against me and my businesses. But I am not worried.

I am simply laying out my "speculative" opinion about what Cutts' comments might mean to my business and yours. You are free to use your own brain to judge the value of my words.

Am I playing a double standard when I say that Google cannot determine the intent of the person placing a link, and then I comment on how I interpret the future of the Google search algorithms? I don't think so, and let me tell you why.

Google uses algorithms (software programs) to make distinctions about what a web page is about, how they value that page, and to judge the nature of a link.

I use my intellect (or as some would suggest, my lack thereof) to make a judgment about what Google has told us we should expect from them in the future.

I trust software to a certain extent, but software cannot always read the nuance that separates two very similar items. So, how can the Google algorithm be expected to determine the intent of a person who placed a link?

It has always been my contention that humans are "required" in any process that must make an interpretation of nuance. In my businesses, we refuse to trust computers to make judgments of nuance, because they can't. That is the reason we employ human beings to process orders.




What Is Google's Intent Behind The Paid Links Issue?

The whole of Cutts' argument seems to hinge on nixing "paid links" that are designed to manipulate or "game Google's PageRank" and to a lesser extent, their organic search results. Google seems to be really agitated that webmasters are "selling links based on the PageRank value of a page."

The problem is that webmasters are selling an intangible asset that is wholly owned by Google and maintained for "Google's benefit." Webmasters are selling this Google asset, but Google will not receive any of the proceeds from that sale.

As a result, Cutts suggested that webmasters should use some method that Google's spider can use to recognize and distinguish "paid links" from "given links." Since Google's algorithm is based on the theory that links are given to websites that deserve those links, the paid links on high PageRank pages can really skew Google's PageRank values and its organic search results.
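The machine-readable method most often mentioned in this debate was the rel="nofollow" attribute, which tells Google's spider not to pass PageRank through a link. A paid link flagged this way might look like the following (the advertiser URL is invented for illustration):

<!-- a paid link marked so it does not pass PageRank -->
<a href="http://www.advertiser-example.com/" rel="nofollow">Advertiser Name</a>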




Here Is Where It Gets Ugly

Both honest and dishonest people inhabit this Internet.

Google wants webmasters who are selling links to distinguish paid links from given links, so that Google can ignore "links purchased to influence PageRank."

If honest people distinguish paid links in a way that Google can recognize, then the market demand for those links will dry up. Once the PageRank value of a link is taken away from the buyer, the buyer will be forced to purchase links based only on the traffic that the specific web page receives. If all paid link decisions were based only on a web page's traffic, then the market value of a link would be decimated.

Once a webmaster tells his link-buying customers that his links will no longer carry PageRank value to the buyer's website, the value of those links will drop, in most cases by 80% or more. Why would a webmaster want to reduce the market value of his links by 80%?

Although Google's links do not pass PageRank to the websites that are in their index or paid listings, we have to ask ourselves one thing. Would Google be willing to take a step that would reduce the market value of their own links by 80%? They certainly would not do anything that would cut their own bottom line that deeply, yet they are asking webmasters to do just that.

This is the reason people are teed off at Google. At least 80% of the market value of a link is driven by the PageRank value of the web page where the link will be placed.

Dishonest people don't care to play by the rules; they will continue to sell their PageRank value, as long as they continue to have buyers. Only the honest will suffer.




Link Buyers Are Green With Envy

Link Buyers are envious of the PageRank value given to other web pages, and they want a bit of that value passed over to their own websites.

Link buyers are green with envy, because they can see that little green bar at the top of their browser that tells them how much value Google gives a web page in its algorithms.

If Google were to keep PageRank as a private value, known only to them, then "paid links" would not be an issue for them to manage.

If the public cannot see what a page's PageRank value is, then link buyers would not be able to use PageRank to influence their link buying decisions, and webmasters would not be able to market their PageRank value to other websites.




How Simple Is That?

All Google has to do to solve this problem of theirs is take away the indicator people use to buy and sell PageRank.

Someone suggested to me that Google would never do away with the PageRank indicator in their toolbar, because Google feels that it is the only thing that ensures people will keep the Google toolbar in their browser. Personally, I will continue to use the Google toolbar for my searches even if the PageRank indicator were not there, because I like the search results Google gives me. But that is just my opinion, and I am only one person out of millions of Google toolbar users.

What it boils down to is this. If Google is serious about nixing schemes to buy and sell PageRank, then they should simply take their PageRank indicator away from us. But will they take it away? Only time will tell.

Thursday, April 26, 2007

Google's Last Dance! Could Semantic Search Mean The End Of Google?

By Titus Hoskins (c) 2007

As a full-time online marketer and webmaster I try to keep my eyes peeled for what is happening with the search engines. These complex creatures control the Internet. They truly are the heart, soul and brains of the web.
Unfortunately, they also control the fate of many struggling webmasters who are clawing their way to the top of the SERPs in organic search. Being listed on those first-page results for your chosen keyword phrases is the ultimate goal, and it is often the determining factor in the success of your site.

Recently, I have noticed some strange movements in my closely watched keywords, especially in Google. This shouldn't alarm anyone, because there are often sudden movements and adjustments as Google tweaks and refines its algorithm, the complex series of formulas it uses to determine which pages and sites get featured.

(Side note: An excellent resource on Google's algorithm and ranking factors can be found at: http://www.seomoz.org/article/search-ranking-factors#f41 )

It's way too early to jump to any conclusions, but the big question is on everyone's mind: Is Google moving towards semantic search?

Or, more precisely, will Google have to move to semantic search if it is to have any chance of surviving in our 'here today, gone tomorrow' search world? Most of us old folks can easily recall a pre-Google web. Is a post-Google web possible?

That's very hard to swallow, but stranger things have happened on the net. The real question should be: will Google have to embrace semantic search or perish?

Wikipedia defines Semantic Search or Semantic Web as the evolving process of taking all the content on the world wide web and "expressing it not only in natural language, but also in a form that can be understood, interpreted and used by software agents, thus permitting them to find, share and integrate information more easily."

As can be imagined, finding the formats and framework by which all this data can be processed into meaningful responses directly related to a search enquiry is mind-boggling. Technologies such as RDF (Resource Description Framework), data interchange formats (e.g. RDF/XML, Turtle, N3, N-Triples), RDFS (RDF Schema) and OWL (Web Ontology Language) will all probably play a role. Many believe microformats will be very important in this evolving semantic web.
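To give one concrete flavor of this, a microformat such as hCard marks up ordinary HTML with agreed-upon class names so that software agents can read it as structured contact data. The details below are invented for illustration:

<div class="vcard">
  <span class="fn">Jane Example</span>,
  <span class="org">Example Widgets Inc.</span>,
  <a class="url" href="http://www.example.com/">www.example.com</a>
</div>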





New Semantic Search Engine


We now have our first search engine supposedly based on semantics, or meaning: Hakia. Is it the first in a whole new wave of search engines generated and powered by the Semantic Web, which is now tagged as Web 3.0? More importantly, can it compete against a more text-based search engine such as Google?

Hakia has some great features, such as highlighting potential answers to your posted question. For example, ask it a question like "What is the population of Seattle?" and you will get an answer. But you will also get a gallery page featuring all the relevant information about Seattle: how to get there, local hotels, restaurant guides, local weather...

Of course, do the same search in Google and you will also find your information along with images and maps of Seattle. However, using Hakia will show you the relevant information faster because it is highlighted and easier to find. And in my opinion having a whole gallery page of information somehow makes your search more relevant and useful.




Can Google Compete?


Is this a better mousetrap? Maybe, maybe not... but it is definitely pointing the way to a better method of searching on the web.

Granted, this type of search engine has a long way to go to match Google's massive resources and obvious dominance in the search market. But even the most devoted Google user, like myself, must admit Google's method of ranking pages and content on the web is not without some flaws. Take, for example, the issue of Google bombing, where webmasters influenced the listing for the keyword 'miserable failure' to point to President Bush.

Google has now solved that problem, but Google is basically an elitist system where sites and content are judged by the PR ranking system and its algorithm and filters. One would like to believe it is a democratic system where the best and highest quality content rises to the top. One would like to...

Information is one thing, but opinions and the quality of those opinions are something entirely different. Will the new semantic web/search be able to judge quality content and rank it as well as Google presently does?





Problems For Webmasters


Regardless of how the whole Semantic Web scenario plays out, it may have some consequences for webmasters and marketers, at least in the initial stages, until you can adjust or optimize your sites for this new 400-pound gorilla on the block.

One major concern is how the literal translation or semantic meaning of your site's title and URL will determine your placement in a semantically themed search engine. Most webmasters know to place their major keywords in their site's domain name, but if you cover many topics within your site, this is not always possible.

Plus, does a Semantic Web mean everything will probably have to be tagged to the nth degree, as we are seeing in blogs, social media and Web 2.0? Thankfully this can be done easily with free software such as WordPress, which has tagging already built in.

If we do get truly semantic search, wouldn't on-page factors play an even greater role in ranking? Special care would have to be taken with your keywords and keyword variations. Great care will also have to be taken with page titles, meta tags and your URLs.

I notice I am listed in Hakia for certain keywords, but those pages have the exact phrases in their URLs.

Keen observers will also note that Google is now listing five or six related links in the number one SERP position for certain keyword phrases. All these links come from the same site, but are they more semantically related to the search enquiry than the traditional links we have seen in Google? Or are they more in line with the gallery pages we see in Hakia?

Of course, jumping to any conclusions based on just one or two examples is foolhardy to say the least. Especially where search engines are concerned.





Brave New Internet


No doubt, Semantic Search or a Semantic Web poses some difficult obstacles and challenges as we seek a more human response from all those bits and bytes. For example, will semantic search mean we will have more closely focused sites, strictly sticking to the topic of the URL or domain name? Will the semantic web be more restricting than liberating?

When it is all said and done, will we really be able to devise a computer/machine/system that will be able to truly interpret the vast stored knowledge and give us the right meaningful answers to our questions? Will it be able to be programmed so it's human enough to not only understand but also interpret the subtle differences and meanings we have for different words in the whole context of a webpage?

Perhaps the most intriguing question is this: can someone take the present-day 'www index,' apply microformats or even new technology to this massive data, and build a supplemental, exclusive extension of the present-day web? Could it be turned into a more semantic, 'natural language search' accessible index? If such a gigantic feat were even feasible, you would also have to wonder who could have the resources to make such a creature possible!

As we have seen from Google, a dance is not necessarily a dance and a slap is not necessarily a slap. Could an index be more than an index? It may be too early to tell, but Google will probably be better equipped than anyone to adjust quickly to this new Semantic Web, whatever shape or form it takes.

SEO - Is A High Page Ranking Overrated?


A discussion of page ranking, its relationship to search engine optimization practices, and whether or not it is overrated when it comes to making sales with a website.

One of the raging debates about search engine optimization in general is whether or not you need a high page ranking. People will spend thousands just to get a high page ranking on Google, but in the end, does a high page ranking actually translate into high sales? Many expert SEO gurus say no, because sometimes all it brings you is grief in the form of lots of emails to answer, window shoppers, and no sales! The bottom line is that the only kind of page ranking that matters is the one that brings you buying customers.

So how can you end up in the situation described above in the first place? Usually this happens when you subscribe to a link farming service or when you have manually submitted your site to too many low-quality sites.

The only thing that truly matters is that when people type certain keywords into a search engine box, they can find your information quickly and efficiently, and that they find what you offer before they find what your competitor has to offer. This means making sure that you have quality keywords that are not stale tagged to your site and used in the copy of its text. It also means not settling for linking to just anybody who comes along, no matter how complementary your two sites might seem.

Yet another way to make it easy for people typing into search engine boxes to find you is to get your HTML correct. Make sure it is all correct, as the spiders may simply avoid reading HTML that looks like gobbledygook. Using a CSS style sheet to build your site and sticking to its guidelines is one way to accomplish this. Another way is to hire an SEO expert or an HTML expert to clean up your HTML for you.

About The Author:
Christopher Angus is a SEO and Website Marketer. He can be contacted at: Sales (at) Brilliantseo.com http://www.herringshoes.co.uk http://www.crcwritingservices.com http://www.mobility-direct.co.uk

Is Your Web2.0 Marketing A Goldmine Or Black Hole

As an online business owner, I regularly subscribe to and read many ezine newsletters. Some of them I find rather silly, some hyped and some just plain annoying. However, it takes a lot to get me to unsubscribe, because there is usually something to learn, no matter how small, from most newsletters.

My favorite newsletters of late are those that lead me on a trail to places where others share my interests and are on the same path as I am, whether it be a community of dieters, hair style fanatics, a mom-friendly place or internet marketing communities.

These newsletters have their finger on the pulse of what Web 2.0 Marketing is and how to make the most of it.

Just recently, I got an email from one of my favorite affiliate programs, which led me to the owner's blog. Once I arrived at the blog, I found a lively discussion initiated by the owner about his most recent opt-in test results, along with comments from other members. Unfortunately my two cents was not on topic, but I felt my comments still deserved to be seriously considered. Before I placed my comment, being the good netizen that I am, I prefaced my post by saying just that: "my comment may not be on topic."

Unsure if my comments would end up in a big black hole or be deleted, I forged ahead with the suggestions that I felt would help his affiliate members.

Believe me, the suggestion I gave for his membership program was a win-win for everyone. I did not say this in my post, but any smart business owner could conclude that on his or her own.

Imagine my delight when I received his regularly scheduled newsletter one week later, announcing changes to the basic membership program, including a change that actually used my suggestion. Mind you, the membership I had at the time was his free program, and the suggestion had to do with making changes within that free basic membership program.

Needless to say, I was delighted to see someone take action. This owner not only has Web 2.0 marketing in his business, he is actually using it smartly. As a result, I have since upgraded to a paid membership.

Trust me when I say this owner will see his sales continue to increase, because he has a few Web 2.0 marketing principles at work for him that I've noticed:


1. A Business System - He has a pretty good Web 2.0 marketing system in place for both front-end and back-end sales for his business.

2. Integrated Blog - There is nothing worse than having a blog out there all by itself with no one to visit it. He has a good system in place to generate traffic to his blog.

3. Engaging - How many of your clients and prospects have you engaged in communication with lately? The biggest turn-off for people online is when they are unable to interact with you. Don't be afraid to put your thoughts out there and allow your subscribers or website visitors a chance to talk back.

Are you a control freak? Well, if you are, it's time to move on and let that go.

Stop talking to yourself and allow others to enter the conversation whenever you send out communications to your subscribers.

If you do, you just might talk yourself into a Web 2.0 marketing business goldmine.

If you would like to see positive results from Web 2.0 marketing in your business, I would encourage you to download a free report written with the help of my friend and mentor Henry Gold.

This report will help you, as it did me, to clearly understand Web 2.0 and how it will affect your business.

I encourage you to go ahead and download the rest of the report not for me or Henry but for yourself. It changed my life and I am confident it will change yours.

Google Adsense and How You Can Earn More From Adsense Ads

Any web site owner or webmaster who is trying to earn a profit from their sites is likely familiar with Google Adsense.

Google Adsense is a great and easy way to make money from your site if it is done right. Adsense allows any person with a blog or an informative site to earn money simply by placing a little code on their site pages. Rather than trying to figure out exactly what ads to put on their web pages, Adsense gives web site owners the ability to concentrate on their sites' content.

Many webmasters are able to make a living from Adsense; however, there are also quite a few who spend all their time just trying to figure out the "magic trick" used to earn from Google Adsense. Earning a living from Google Adsense ads can seem difficult, but it's not impossible.

If Google Adsense is going to be your only source of income, you will want to do more than just take some Adsense code and place it on your site. That is just not enough; you will need to do some experimenting with placements, formats and choice of keywords.

You really should take care to build your page around a specific topic or keyword that is relevant to your site concept. This will ensure that any Adsense ads placed on the page are appropriate and useful to visitors who want to know more about the topic, making them more likely to end up clicking on an ad.

You will want to take care where you place your ads. It has been shown that visitors often look first to the top left of a website when they arrive. Because this is where your visitors' attention is likely to go first, it is one place where you might want to consider placing some ads. You can read the Google help on the Adsense website to learn more about the best locations for placing your adverts.
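As a layout-only sketch of that idea (the real ad code is generated inside your AdSense account and is not reproduced here), a top-left placement might be structured like this:

<body>
  <!-- top-left column, the area most visitors look at first -->
  <div style="float: left; width: 300px;">
    <!-- paste the ad unit code from your AdSense account here -->
  </div>
  <div style="margin-left: 320px;">
    <!-- main page content -->
  </div>
</body>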

Another consideration when placing your ads is to put them on high-traffic pages. You can identify the pages visited most on your site by taking a look at your logs or your Google account, where you will get the page-by-page details of your visitors.

Although the skyscraper and banner ads may look good on your site, you may want to avoid using them. Oftentimes, banners are ignored. For example, have you clicked on any banners on sites that you have visited lately?

You will want to blend your Adsense ads into your web page by using the Adsense formats. Google supplies a variety of palettes allowing you to change font colors, borders and backgrounds. There really isn't much point in putting an ad on a page if it doesn't blend with your site.

A very important resource that many webmasters ignore is the Adsense preview tool. This tool will allow you to preview the ads that will go on each of your pages and gives you sample ads and formats. Here the destination of your ads can be checked, as well as geo-targeted locations.

Remain focused on what it is that you want to achieve. However busy you may be, you must take some time to experiment with your Adsense ads so that your click-through rate (CTR) can improve. No matter what the experts say, just follow the basics; that's the real magic to making more from Google Adsense.

Wednesday, April 25, 2007

Search Engine Optimization

Search engine optimization, known far and wide as SEO, is an important factor in the success of websites small and large. No matter how popular you think your site is, it's vital that you and your team put some thought and energy into optimizing your search engine rankings and your visibility within search results. But where to begin?

Webmonkey Bryan Zilar recently sat down with strategy consultant and SEO guru Jason McQueen to talk about all things search. They discuss trends in the SEO world and the philosophies behind "white hat" and "black hat" techniques. Jason also offers advice for webmasters who want to adopt an SEO strategy that produces results on a limited (or non-existent) budget.


Webmonkey: Today, I'm here with Jason McQueen, who is a strategy consultant for Mindshare Interactive Campaigns. Today, Jason's going to tell us about search engine optimization technology.

Jason, how do search engine algorithms work at a high level?

Jason McQueen: Search engine algorithms, by nature, are a well-guarded secret. Search engines do not give away the particulars of how they work. We know, as an SEO community, generally what they do and how they index web pages.

I would say that most crawler-based algorithms are particularly interested in location and frequency of keywords on the web page. That's probably the most focused area of the algorithm. Each engine has a set keyword saturation, meaning it allows a certain number or percentage of keywords within a particular block of content.

It's important as an SEO to know where the keyword density stands for a given page. That means the number of keywords per page and per relevant topic. If you have a page that's selling radios, you want to explain that in your content and talk about the benefits of having a radio. But you don't want to over-talk and stuff the page with keywords to the point where the algorithm will pick it up and eliminate it because of an "illegal practice." This would be keyword stuffing.

I also want to discuss linking with regard to the algorithm itself. Not only is keyword density ratio important as far as the algorithm is concerned, linking is as well. Google is particularly interested in a site's linking schema — meaning a site's inbound links. These are more important to Google than outbound links. It's not just about getting an inbound link, but a relevant inbound link for a site.

WM: You've also been talking about "white hat" versus "black hat" SEO. What is that?

McQueen: In terms of search engine optimization, you're talking about two different real techniques as far as how to perform it. Black hat SEO is the practice of using techniques deemed illegal or unethical. These can include using hidden text in your site. The search engine can read the text in the site, but it's not visible to the human eye. That's just an example of keyword stuffing. There are "doorway pages," which pull users into your site through one page and then push them to another thought or product or service — things the consumer didn't initially want to visit. Those are all examples of illegal techniques.

White hat is the opposite of that — basically ethical SEO, which is using established SEO practices to increase a site's ranking. White hat is a more long-term strategy, and black hat can be more immediate.

WM: What's your opinion on using either one? You say illegal, but come on, that's just a word.

McQueen: "Illegal" is a strong term. You get posed that question a lot. It depends on your business model and what you're going after.

I would never request or recommend that a company use black hat SEO, but there are some instances where it can be helpful. It just depends on what your marketing strategy is. The only driver behind using black hat SEO is that the results can be much quicker. You can see the jump in search engine rank happen relatively quickly. The downside to that is that you will eventually get caught and can be penalized by the search engine. You can even be banned and removed from a search engine. That can be detrimental to retail clients specifically.

WM: What would I want to do if I don't have a dedicated SEO resource and can't afford a consultant to help me out? What are some techniques I can use to boost my page count?

McQueen: I get asked that question a lot. Not everyone has an SEO budget, and not everyone is a technophile who understands SEO on that level. I think it's mainly about using common sense. You have to assume that search engines are most concerned with relevant content.

All the major search engines — Google, MSN, Yahoo, et cetera — are specifically interested in providing relevant content. So if you're providing content on your page that's fresh, relevant and useful to the user, you're pretty much going to be OK. That sounds very obvious. However, when you start looking at content delivered on home pages and read it from an SEO or content perspective, you actually realize you aren't doing that.

I've had quite a few clients end up surprised when we started to dig in on the content on their home pages. They thought for years that they were delivering exactly what their clients needed, but (they weren't).

From a content perspective, I recommend rotating the content and keeping it fresh. It can be a huge boost to your site ranking. You'll see immediate results if you do that. I usually suggest that we switch out content every other month and continue that trend.

WM: Speaking of content, blogs often get brought up regarding search results. It's very easy for a blog to interfere. Do you feel that blogs are skewing the web?

McQueen: There's a polarized view about that. I personally like it. The more information you can bring up about a relevant search topic, the better. The reason that blogs are showing up is simple. Blogs that are sitting on a searchable URL satisfy two of the main search ranking criteria. First, they are constantly changing content — daily, sometimes twice daily. Second, they are filled with inbound linking. They're getting ranked. Some of the major sites that have been sitting at the top of search results for years are getting irritated that a bunch of blogs are showing up and skewing (or diluting) the results.

WM: What about things like video, audio and images? How are they getting cataloged? Do you use tags?

McQueen: I think you answered the question. There's still a lot of "wild, wild Westing" going on in those areas. They're still new in terms of search. Google's caching videos. My company's doing a lot of that. It's interesting to see the type of traffic generated from that.

WM: What are your favorite resources for SEO news?

McQueen: I always follow Danny Sullivan's site Search Engine Watch. He's kind of the godfather of SEO. He's been around since the beginning and speaks at all the SEO events. He has always verbalized or set the standard in the industry. He's a good person to watch, because he believes in white hat SEO and he forces the industry to go there. Also, he speaks clearly and helps even a beginner understand.

I have my own blog, appliedseo.com. Me and a couple of other SEOs are on there giving advice and talking about issues in the industry.

I also just have Google News alerts for anything coming out for "SEO." A lot of clipsy articles are coming out and talking about the new emphasis on SEO. There's a lot of chatter about it and you don't have to hunt that far. At the least you can find the defined SEO best practices online.

(Research) is really important if you're a decision maker in a company that's looking to hire an SEO or a firm to do work for you. You need to know a little bit more about the technology to make sure the person you're hiring knows what they're doing. If your company uses SEO the wrong way, the search engines can penalize you, and it can be a nightmare to get out of.

Monday, April 23, 2007

Microsoft Still Needs Help Understanding Search - It's Been Done Before (Page 4 of 4)



As I mentioned in the introduction, this isn't the first time that Microsoft has gone after searchers with some kind of reward to get them to use its search engine. Last year it ran "MSN Search and Win," a contest of sorts where users of MSN Search could win prizes instantly ranging from a Starbucks gift certificate to a Panasonic high-definition TV. The five-month-long promotion, according to Adam Sohn, director of global sales and marketing for Windows Live, "drove tens of millions of queries and for a relatively small amount of money."

Microsoft isn't the only company to consider this kind of scheme. About the same time Microsoft came out with MSN Search and Win, Yahoo surveyed its Yahoo Mail customers to see what incentives would entice them into designating Yahoo as their primary search engine. The ten-item list included things such as free music downloads, a Netflix discount, donations to charity, and PC-to-phone calling credit. Yahoo ultimately decided not to create a rewards program.

Still, such programs are time-honored traditions in other industries; just think about airlines and frequent flier miles. There are a number of reasons why Microsoft's version might not work as well as the airlines' programs, however. Just looking at frequent flier programs among business travelers, it's easy to see that employees benefit from earning the miles, but their companies pay for the product. As Mara Lederman, an assistant professor of strategic management at the University of Toronto points out, "If the fare for your preferred airline is $100 more, you don't care because you don't pay. You just want the points because you want to take your family to Hawaii."

Internet search doesn't cost the searcher anything whether they're searching on Google, Yahoo, Microsoft Live, Ask, Searchles, or any of hundreds of search engines. But Lederman notes that "there's no reward going directly to the individual carrying out the search." In other words, the business as a whole might have an incentive to earn money from Microsoft, but the employees within the business will look at it, think "What's in it for me?" and not see any point to using Microsoft Live rather than Google.

Indeed, there's still plenty of reason to prefer Google in a work environment. If Google delivers more relevant results, it makes employees more efficient, because they can find what they need more quickly than they would with Microsoft Live. Microsoft Service Credits for Search may have a certain short-term effect, but the only way Microsoft is going to increase its market share in search is by delivering a better product. To put it bluntly, as blogger and former Microsoft employee Robert Scoble did, Microsoft needs to "Ship a better search, a better advertising system than Google, a better hosting service than Amazon, a better cross-platform Web development ecosystem than Adobe, and get some services out there that are innovative (where's the video RSS reader? Blog search? Something like Yahoo's Pipes? A real blog service? A way to look up people?) That's how you win." Microsoft seems to have forgotten this a long time ago.

Microsoft Still Needs Help Understanding Search - The Need for Product Improvement (Page 3 of 4)



It's not just about market share though - or at least, not just about buying it outright and wholesale. As Windows Live spokeswoman Whitney Burk explained in a statement, "Currently, we are conducting a trial program through which Microsoft is providing service or training credits to a select number of enterprise customers based on the number of Web search queries conducted by their employees via Live Search. These customers, in turn, are providing valuable feedback to Microsoft on the use of Web search in an enterprise environment. As search evolves into more of a productivity tool, and revenue sharing becomes more commonplace across the industry, we are engaging in mutually beneficial partnerships such as this and our recently announced deal with Lenovo to more easily enable customers to choose Live Search." So it's at least partly about getting the information necessary for improving Microsoft's search engine.

Apparently Microsoft can't pull this off with just the web surfers who choose to use its search engine. There are a number of modern applications that "learn" how to be more relevant the more they're used; consider Amazon's "those who bought X book also bought Y" feature, or how it suggests new items of interest to you when you log in based on what you have searched for and/or purchased before. The more you use it, the better it gets.

This may be why Microsoft's search engine isn't improving fast enough to keep up with Google. As the New York Times explains, "While the quality of results among different search engines is hard to judge objectively, Google enjoys the benefits of a network effect. Its software is tuned to learn from the clicks of its users, and the more users it attracts, the smarter the software evolves, the more users are drawn in, and on the virtuous cycle revolves."

Still, I can't help but think that Microsoft is going about this in the wrong way. Sure, the company can leverage its cash and its installed base to give its search a boost, but will the information it gains be enough to permanently improve its market share? Microsoft is probably hoping that being forced to use the product in the workplace will change some search habits in the long term. John Battelle thinks it's more likely to lead to a form of corporate protest against an imposed policy: "How would you feel if, to save a few bucks, the CIO and CFO dictates that you now have to use IE7 preset to Live Search? I can imagine a backlash where usage of Firefox goes way up in large corporations so as to avoid that 'Browser Helper Object' installed in IE7..."

Microsoft Still Needs Help Understanding Search - Buying Search Market Share (Page 2 of 4)




The news was originally broken by John Battelle on his search blog, and later picked up by just about everyone. Microsoft's new program is called Microsoft Service Credits for Search. It's aimed at changing the search behavior of employees at large companies. Here's how it works: companies that sign up for the program earn a $25,000 enrollment credit that can be redeemed for Microsoft products and training. But that's just the beginning.

If the company promotes use of Microsoft Live Search, it can earn anywhere from $2 to $10 a user annually, depending on how much Microsoft's search engine is used. Battelle had a PowerPoint slide on his site that explained how it could work. It showed that a company with 10,000 PCs that did a lot of web searching could earn $120,000; a company with 50,000 PCs that didn't search the web quite as much could earn $200,000. Those numbers can add up very quickly.

So how does Microsoft know you're really with the program? Every computer at an enterprise participating in the program will have IE7 installed with a "Browser Helper Object." (I know, I couldn't help but imagine Clippy popping up and saying "It looks like you're doing a web search!" either.) This lovely addition to IE7 will track search queries and send that information back to Microsoft. The tracker doesn't work with any other browser - not Firefox, not Opera, not even earlier versions of IE. According to the New York Times, Microsoft wants to sign up 30 companies, each with at least 5,000 computers, that are willing to install this software.

Microsoft is also hoping to convince enterprises to do the marketing for them by promoting the program internally to their staff. Some of the suggestions it has made for promoting it include in-house training sessions on how to improve your search skills (featuring Windows Live Search of course), setting the home page to Live Search, removing all other tool bars from the browser, and even having the CEO send a "message of encouragement" to the staff. Get them to sign up AND get them to do your marketing - talk about killing two birds with one stone!

Microsoft Still Needs Help Understanding Search (Page 1 of 4)


When a company resorts to bribery not once, but twice in an effort to get customers to try their product, you have to consider that maybe there’s something wrong with the product itself. That’s especially true if the product is free – even if rival products are also free. What does that tell us about Microsoft’s recent actions concerning its search engine?
To my way of thinking, it tells us that Microsoft still doesn't understand search nearly as well as Google or even Yahoo. But you don't need to believe me when the figures speak for themselves. A little over two years ago, in February 2005, market research firm Nielsen/NetRatings reported that Microsoft had 14 percent of all web searches to itself, compared to Google's 46 percent share of the market. Two years later, after rebranding its search engine, Microsoft could only claim a 9.6 percent share of web searches, compared to Google's 56 percent share.

Those percentage points aren't small potatoes. They represent a loss to Microsoft of nearly 300 million searches per month. This loss is happening at a time when search advertising is set to explode. According to Piper Jaffray, revenues in this field will reach $44.5 billion in 2011. That's more than double the $15.8 billion they reached in 2006. Even Microsoft can't ignore those kinds of figures. In the first six months of last year, it made less than $1 billion on search advertising, as opposed to nearly $6 billion on sales of the Windows operating system. But Google recently spent more than $4.5 billion just to acquire two companies, YouTube and DoubleClick. It's no secret that Google gets most of its money from search advertising. If you were Microsoft, wouldn't you want a piece of that action?

Clearly Microsoft does want a piece of that action. So far it has been unsuccessful at stealing market share from Google with its own advertising campaigns. And though its product shows signs of improvement, most users seem to agree it still isn't as good as Google. So rather than out-market or out-compete the search giant, Microsoft is now trying to out-leverage Google with a little bribery.

Wednesday, April 11, 2007

The Essential 2007 Code Optimization Tutorial for SEO

The Essential 2007 Code Optimization Tutorial for SEO
By James Kinsley (c) 2007

Do you want to get the traffic you deserve flooding into your website? Code optimization is an essential component of the search engine optimization process, and if you aren't technically minded it can be difficult to get your head around. This guide is meant for beginners and more advanced webmasters alike.
A basic knowledge of HTML coding is useful, but it is not necessary. Optimizing your code can be done by simply opening your HTML document in a text editor and changing different parts as shown below. Follow these steps carefully and your code will become 100% search engine optimized and ready for promotion and link-building campaigns.

The steps below assume you have chosen the keywords which you want to optimize the page code for. If you have not done that, go and do that now and return to this guide later.





HTML Code Optimization

The optimization of your HTML code for search engines is vital. It is the foundation of your SEO campaign. Your code must be optimized in a number of ways in order to improve its relevance for a chosen keyword. Follow the advice below as closely as possible: the more closely you follow it, the higher your pages will rank.

Remember: keywords are the words people type into search engines. Including a keyword in your site content (and optimizing your site for it) helps your site get returned as a search result for that keyword. You can choose to optimize your page for a keyword or a keyphrase (a group of related words, e.g. 'free red hats'). Using a keyphrase is more advantageous (as discussed later), but for simplicity I will refer to both keywords and keyphrases simply as keywords.

TIP: Try to optimize each page for just one keyword. This will stop your keywords from competing against each other for weight, and you will rank higher for the chosen keyword.





The TITLE Tag

Location: just after the opening <head> tag

<title>Web Promotion, Affiliate Marketing, SEO</title>, for example

1. The title tag should not contain any of the words Google disregards. These are words like 'and', 'not', 'a', 'the', 'about', etc., which are too common for Google to take any notice of. Using them will dilute the importance given to your keyword in the title. These words are known as 'stop' words.

2. Include your keyword in the title of your page. Including other words in your title that are not your chosen keyword(s) will be detrimental to your ranking, because it makes your keyword seem less relevant to the title of the page. This relevance is known as 'weight'. The more weight your keyword has for a given criterion, the better.

3. Don't include the name of your website in the title of your page: for example, 'Share The Wealth - Affiliate Marketing'. Doing so will dilute the prominence of your keyword (in this example, 'affiliate marketing'). It is tempting to include your site's name because it may look better, but it is not that important, as people don't pay much attention to the title.





The Meta tags

Location: just below the title tag.

Meta data appears as follows:

<meta name="Description" content="Free articles and guides on affiliate marketing and SEO">

<meta name="Keywords" content="Affiliate Marketing, SEO">

1. This is where you specify your keywords:
<meta name="Keywords" content="keyword1, keyword2, keyword3">
Also, weight is given to how near your keyword is to the beginning of your keywords list, so you should try to have your most important keyword in the place of 'keyword1' in the above example.

<meta name="Description" content="Free articles and guides on affiliate marketing and SEO">

1. The above line is where the description shown in Google results is written. It goes after content=". Do not worry about keyword weighting here, as search engines no longer take it into consideration.





The BODY of your HTML

Once you have written the content of your page, you can begin SEO on it. Complete the page so it is ready for publishing, then apply the following rules to ensure it is 100% optimized for the top search engines.

1. Your keyword should appear in bold at least once on your page. This signals to the search engines that the word is important to the subject of your page and therefore relevant to searches for that keyword.

2. Your keyword should have a weight of 2% on your page; on a 500-word page, for example, that works out to roughly 10 occurrences. This is the ideal percentage: if it is much higher, a search engine may penalize your page for spamming. Spamming is a term used to describe the actions of webmasters who trick search engine page ranking systems (SEPRS) into thinking their pages are relevant in order to get a high ranking. These pages will not usually be relevant at all and simply cash in by selling advertising space against the high traffic they receive. Spamming is increasingly becoming a thing of the past as search engine ranking algorithms become more sophisticated. To work out the percentage weight your keyword has, visit www.live-keyword-analysis.com .

3. Use heading tags (<h1>heading</h1>, etc.) and put your keyword into the heading. The usual weighting rules apply: have your keyword as close to the beginning of the heading as possible, with as few other words in the heading as you can. Position this heading as close to the top of your page as possible for increased relevance.

4. Put your keyword in up to three of the alt attributes for images, and include it in one of the first three alt attributes in your code. Alt attributes are the alternative text given to images, shown if an image fails to load. They are a handy place for your keyword because users do not usually see them. Don't spam, though; stick to three alt attributes. They are used as follows:

<img src="imagename.gif" alt="alt-text-here" width="image-width" height="image-height">

5. Keep your page content between 100 and 1400 words. This is for a number of reasons, including the size of Google's page cache (amount of data from a page Google stores). If you have too much content, you could try splitting the page into two separate pages and perhaps having a 'page 2' link at the bottom of the content.

6. Your keyword should appear at the beginning of your content and at the end (the first and last 50 words).





Code Optimization Checklist

- No stop words in your title tag
- Keyword included in title
- Website name not included in title
- Keyword in meta keywords list
- Keyword placed as close to the beginning of the meta keywords list as possible
- Keyword appears in bold at least once in the content
- Keyword has a 2% weight
- Keyword is in the first heading tag and is at the top of the page content
- Keyword is in the first 50 words and last 50 words of the page
- Page content is between 100 and 1400 words
- Keyword is in one of the first three alt image attributes and is in three of them in total
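
To make the checklist concrete, here is a minimal sketch of a page that applies every item above, assuming the page is being optimized for the hypothetical keyword 'affiliate marketing'. The file names and copy are placeholders, not a template you must copy literally.

<html>
<head>
<title>Affiliate Marketing</title>
<meta name="Keywords" content="affiliate marketing, affiliate programs, referral commissions">
<meta name="Description" content="A plain-English introduction to affiliate marketing for new webmasters.">
</head>
<body>
<h1>Affiliate Marketing</h1>
<!-- keyword in bold within the first 50 words -->
<p><b>Affiliate marketing</b> lets a website earn commissions by referring visitors to a merchant's products...</p>
<!-- keyword in the first of no more than three alt attributes -->
<img src="diagram.gif" alt="affiliate marketing diagram" width="400" height="300">
<!-- 100 to 1400 words of content here, with the keyword at roughly 2% weight -->
<p>...</p>
<!-- keyword again in the final 50 words -->
<p>With the basics covered, you are ready to launch your first affiliate marketing campaign.</p>
</body>
</html>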





Tips and Advice

Try to optimize each page for just one keyword. This will stop your keywords from competing against each other for weight and prominence, and you will rank higher for the chosen keyword.

Not every page of your site will be able to be optimized for every criterion. Don't worry; just try to hit each criterion as best you can. Sometimes you won't be able to reach 100 words of content, on a contact page for example. Issues like this are of little importance, because not every page needs perfect optimization; visitors will find contact information through a link on the home page anyway.

Constantly check your competition. If you don't feel it is possible to get onto the first page of Google for a certain keyword or phrase, choose a less contested keyword.

Gear Up Your Site For Social Media Marketing

Gear Up Your Site For Social Media Marketing
By Deepak Dutta

The year 2006 saw the emergence of social media. If you are engaged in operating a website, you must realign your site to exploit the popular social media sites for increased traffic.
You should also introduce social media components to your site, because web users are experiencing these new forms of interaction on more and more sites and may expect the same from yours.

If you want to attract repeat visitors and want them to stay longer, your focus this year should be on the social aspects of your site.

Social media uses technologies like RSS, blogging, podcasting, tagging, etc. and offers social networking (MySpace, Facebook), social video and picture sharing (YouTube, Flickr), and community-based content ranking (Digg, MiniClip) features.

The central theme of these sites is user-generated content shared among end users. The social aspect of these sites is that they allow users to set up communities, invite friends, and share common interests.

You don't have to change your site completely within a month or so to take advantage of these new technologies. Introduce small changes incrementally throughout the year and you will be well on your way to meeting these new challenges.

The first step is to declare who you are to the online community. People should be able to relate to you. Unless they know more about you, you will be just an unknown identity, and most people don't like to deal with unknowns. Create an About Us page and list your achievements and skills.

Create a MySpace page and link your bio in the About Us page to the MySpace page. Also provide a link back from the MySpace page to your website. Spend an hour every week to develop your online social network in MySpace. Invite a few of these new friends to write blog articles at your site about your products or services.

Install free blog software and start publishing at least one article in your blog. Provide an easy bookmarking feature for social bookmarking sites like del.icio.us. This is done by providing an action button for each article on your site; the action button takes users to the submission page of the bookmarking site.

Also, provide an action button for direct posting of blog articles to Digg. Digg is a popular news ranking site. A well dugg article will bring thousands of visitors to your site.
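
As a rough illustration, an action button can be as simple as a link that passes the current article's URL and title to the service's submission page. The URL patterns below follow the formats del.icio.us and Digg publicized at the time, but treat them as assumptions and copy the exact markup from each site's own instructions; the example.com address and title are placeholders.

<!-- bookmark this article on del.icio.us (URL and title should be URL-encoded) -->
<a href="http://del.icio.us/post?url=http://www.example.com/article.html&amp;title=My+Article+Title">Bookmark on del.icio.us</a>
<!-- submit this article to Digg -->
<a href="http://digg.com/submit?url=http://www.example.com/article.html&amp;title=My+Article+Title">Digg this article</a>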

Provide a forum at your site for users to discuss your products and services. Don't delete negative comments, because they provide insights into the improvements needed to serve your visitors better. However, censor hate speech and meaningless bantering. Register your forum at BoardTracker, a forum search engine.

If you are offering products, allow users to review and rate your products. This will help you in inventory management because you may want to discontinue low rated products.

Provide RSS feeds for your new products, blogs, forum postings, etc. An RSS feed provides teasers of your content. Users will use RSS readers to scan your teasers and visit your site for more information if a teaser draws their attention.
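
Besides a visible subscribe link, you can advertise the feed with a standard autodiscovery tag in the head of each page so browsers and feed readers detect it automatically; the feed location below is a placeholder for wherever your software actually publishes the feed.

<!-- RSS autodiscovery tag, placed inside the <head> section of each page -->
<link rel="alternate" type="application/rss+xml" title="Example.com - latest articles" href="http://www.example.com/feed.xml">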

Publish all your feeds through FeedBurner. FeedBurner provides media distribution and audience engagement services for RSS feeds, and it also provides an advertising network for your feeds. If you have quality content, you will be able to monetize it using their services.

Create short how-to or new-product videos and post them on social video sharing sites like YouTube and Google Video. Add a few frames at the start and end of each video to introduce your site and show your site URL. Post these videos with catchy titles, teasing descriptions, and appropriate tags to make them discoverable.

Provide embedded players for your posted videos on your site. This will save your bandwidth and storage space because the videos themselves reside on the video sharing sites.
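
YouTube shows the exact embed snippet on each video's page, so copy it from there rather than typing it by hand; at the time of writing it looks roughly like the object/embed markup below, where VIDEO_ID stands for the identifier of your uploaded clip. Treat the dimensions and parameters as assumptions that YouTube may change.

<!-- approximate YouTube embed markup; copy the real snippet from the video's own page -->
<object width="425" height="350">
<param name="movie" value="http://www.youtube.com/v/VIDEO_ID">
<embed src="http://www.youtube.com/v/VIDEO_ID" type="application/x-shockwave-flash" width="425" height="350"></embed>
</object>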

Besides videos, use social photo sharing sites like Flickr to share pictures related to content on your site. Use the same title, description and tag techniques discussed earlier for social video sites.

Provide a Send to Friend feature for all products and services you offer. This feature is a link that sends the article, product description, etc. to a recipient via e-mail.

For starters, Yahoo provides a service called Action Buttons that adds links to your website for users to share, save, and blog about your website. The Yahoo action buttons use del.icio.us for social bookmarking and the Yahoo blog site for blogging. It also has a print feature.

Social media is not a fad. It is here to stay and will bring profound changes to web surfers' experiences. Now is the right time to implement features that will make your site social media friendly. By using marketing techniques that leverage popular social media sites, you will also be able to bring more traffic to your own.

The Impact of Personalization on SEO

The Impact of Personalization on SEO
By Claudia Bruemmer

Personalization of search has been a growing topic of interest for a while, but has stayed under the radar for most people until now. With Google's widespread integration of personalization into standard search results, search marketers' attention has finally been firmly riveted on the issue. Up until recently, Google provided two personalization options:

- You could customize your Google Personalized Homepage for quick access to information of your choice (email messages, news headlines, etc.).

- You could get automatic personalization from your search history.



Recently, Google started combining the above two options for users who sign up for services through their Google accounts. When you sign in, you get access to tailored results utilizing information from your search history and your Google home page. If you don't wish to see results based on your past searches, you simply sign out of your Google Account or turn the option to track your history off in your Account settings.

To quote Danny Sullivan, "...anyone who signs-up for any Google service using a Google Account (such as Gmail, AdSense, Google Analytics among others) will automatically be enrolled into three additional Google products: Search History -- Personalized Search -- Personalized Homepage." In the past, Google Accounts required you to manually enable Search History. However, with the recent change, personalized search has been enabled for all accounts, new and old alike. All accounts also automatically get home pages generated based on account information.





Widespread Personalization

We don't know for sure how rapidly search personalization will take hold. However, a 2006 ChoiceStream Personalization Survey shows that consumer interest is strong, with 79 percent of respondents indicating a willingness to receive personalized content, and more than half of the 18-24 year olds surveyed expressing an interest. The study also saw an increase in the number of people who would be willing to trade privacy for more tailored results.

These findings can likely be generalized to search users because the information required for search personalization is less intrusive than the content participants were questioned about in the survey.





Benefits and Drawbacks for Users and Site Owners

Personalization benefits users because it can help make their searches more relevant based on past search behavior. It also can help Web site owners who have excellent content and well-written Titles, since the Web sites with the "stickiest" content will be weighted more favorably. However, in both cases there is also the possibility of closing out potentially useful resources because they do not fit a user's previous history.

In addition to good content, Web pages need good Title and Description Meta tags. Because these are displayed on the search results page, they represent the way human users will judge the site and decide whether or not to click through.

You can also gain by getting yourself onto the Google personalized homepages of many search users. One way to do this is to offer users a feed, a Google gadget, or Add to Google buttons on your pages so they can subscribe to your content. Another tip is to put Google Bookmark buttons on your pages, such as those provided by AddThis. The more a visitor relies on your site, the better ranking it will receive when that user performs searches related to your keywords. The winners in personalized search are those who make a connection with their users, because the results reward loyalty.
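
As one hedged example of the subscribe route: an Add to Google button has commonly been implemented as a plain link that hands your feed URL to Google's subscription page. The URL format below is the one widely circulated at the time; treat it, and the placeholder feed address, as assumptions to verify against Google's own button instructions, or use a ready-made widget such as AddThis instead.

<!-- "Add to Google" style subscription link; verify the exact URL with Google's button generator -->
<a href="http://fusion.google.com/add?feedurl=http://www.example.com/feed.xml">Add to Google</a>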





Implications for SEO

Increased personalization in search results has obvious implications for anyone performing search engine optimization, since results will now differ from user to user based on search history and user profile. Naturally, all queries will show a change in ranking positions between personalized and non-personalized results. Practitioners have analyzed this effect and found that personalized and non-personalized results can vary by as much as 90 percent. Clearly, on-page elements, particularly content and Meta data, will become extremely important again.





Rank Checking

The area most affected in the search optimization process is rank checking. In an article in Revenue (January/February 2007), Mike Moran states, "Widespread personalization will doom traditional rank checking." Moran also asserts, "It's the biggest change in search marketing since paid search."

Extensive personalization will affect the traditional rank checking process because site rankings will differ based on users' idiosyncratic search habits. SEO analysts will be looking at average rankings rather than absolute rankings. This will force a change in search engine optimization techniques. Currently, SEO requires decision-making based partly on researching targeted keyword phrases used by leading competitors. With personalization, it becomes difficult to identify the leading competitors because all search results will differ.

Therefore, new methodologies for making search engine optimization decisions will have to be devised. Traditional SEO and on-page optimization will still be very important, and SEOs will need to continue to improve pages, making them superior to other pages for specific targeted keyword phrases. This will require more thorough analyses of competitor on-page and off-page factors.

The process of SEO competitor analysis will require data collection, quantitative and qualitative analyses, as well as multivariate analysis. Multivariate analyses can help determine the relative importance and influence of multiple factors compared to each other, yielding the competitive landscape for your targeted key terms. The strengths and weaknesses of this landscape will help practitioners make the SEO decisions needed for targeting the right terms for optimization.

In-depth competitor intelligence will give SEO practitioners more accurate readings of how their client's Web pages compare to their competitors' pages, and the result will be more accurate information than we currently get with rank checking.






The Challenge of Competitor Intelligence

In-depth competitor intelligence can reveal what's working and what's not for a site's strongest competitors. It can reveal which sites are competitively strong (or weak) compared to the client's site, regardless of what the respective ranking numbers would show with rank checking.

New-age competitor intelligence will tell you which optimization factors are most important for specific competitive landscapes. Technicians will learn the true competitive nature of a keyword phrase rather than just the number of results returned for a specific query. They will know exactly which SEO factors to work on in order to strengthen their client's position, rather than guesstimating based on general guidelines.

In-depth competitor intelligence will tell practitioners how to prioritize the SEO factors to be optimized, revealing semantic relationships between the client's content, the competitors' content, and the semantic nuances of a keyword phrase related to search personalization of user results. Optimization in the era of personalization requires robust competitive intelligence, and this will pay big dividends to those who master analyzing the competitive landscape.

It is undoubtedly true that search will change dramatically once personalization is widely adopted. However, SEO is an art that is extremely flexible and will adapt with widespread use of search history to affect rankings. SEO practitioners have always been creative, and we will develop new techniques to achieve search visibility for our clients as personalized search becomes more prevalent.

The Importance of Search in Internet Marketing

The Importance of Search in Internet Marketing

By Claudia Bruemmer (c) 2007

For long-time search marketers it will come as no surprise that search has become a media darling. Pay-per-click advertising is the most popular online marketing strategy, and organic search engine optimization provides top click-through and conversion performance.
Search is more popular than display ads and email marketing because of its excellent performance and ROI. Internet marketers should take note of developments in the search marketplace in order to better focus their advertising budgets.




Search is Evolving

As the Web grows exponentially, search engine databases suffer from information overload. As a result, technology and consumer search behavior adapt and change. One of the early attempts to make search easier was the use of metadata in search. The search engine Clusty came out in 2004 to "deliver groups or clusters of similar results rather than millions of search results in one long list." The clusters were supposed to help users see search results by topic so they could more easily find what they were looking for.

Another of the early changes to search was the development of multiple databases within general search. Whether you go to Google or Yahoo, you'll see category choices such as Web, video, images, local, news, etc. These are all different databases that can help you target specific queries. Google aptly named this concept its OneBox solution; you could access many different databases from one box.

Along with the development of multiple databases on niche products or subjects came the vertical search engines. These verticals are particularly useful for B2B companies. The latest trend in the evolution of the market is social search engines, which give consumers the ability to interact with search queries, putting the human touch in search results. Social search engines seek to connect people through personalization and human understanding, using community knowledge to increase relevance.




Consumers Are Key Drivers

Social search highlights an important point to remember: consumer behavior has become a key factor in driving the search economy. Consumers are performing more searches as the Web becomes known for finding information quickly and effortlessly. Search used to be second to email among Web activities, but in 2006 MarketingSherpa reported that search surpassed email to become the most popular online activity. comScore reported that the number of searches in the U.S. grew by 28 percent year-over-year in August 2006.

While search behavior is changing, the proliferation of Web 2.0 platforms and applications such as social networking, RSS, and blogging is impacting search, making it even more complex. The information universe is becoming too vast and complex to catalog by keywords alone. This has resulted in the expansion of search into local search, vertical search, and social search. Consumers are often slow to adopt new search resources: local and vertical search took several years to gain a foothold, and social search is still in the early stages of development.




Local Search

Local search is a key growth area. Borrell Associates estimates that local paid search spending reached $1 billion in 2006 and will reach $1.7 billion in 2007. It will continue to rise, reaching $4 billion by 2010, when it will account for 47 percent of local online advertising.

The U.S. Government estimates the number of small businesses at 24 million, all of which are in a great position to leverage the power of local search. While many small businesses still don't have Web sites, the promise of local search is there for the asking. These businesses spend $90 billion annually on local advertising, mostly in traditional media. This reflects the potential for online advertising growth as businesses shift money from traditional to online advertising because of its effectiveness and ROI.

Nielsen/NetRatings shows that Google is catching up with Yahoo on local searches. Verizon SuperPages and SBC's YellowPages are also big players. As users continue to use the local search option on major engines, local search continues to gain in popularity and advertising revenue. Now is a great time for a small business to get into search engine marketing on a local level. The field is relatively open and not nearly as competitive as the general search engine results.




Vertical Search

Another good option for niche businesses is optimization focused on vertical search engines. Vertical search engines, along with the new social search engines, are beginning to lure consumer and B2B searchers away from the general search engines as the desire for more targeted answers and the ability to pose more focused queries increases. This is an indication that general search leaves many questions unanswered, resulting in lower productivity.

Vertical search engines can provide the targeting that general search engines lack, which is why they are becoming increasingly popular. The market leaders in search (Google, Yahoo, MSN, AOL) are all focusing efforts on the vertical space in order to respond to the needs of users. Social search is also on the rise, with social search engines like Collarity and Rollyo allowing users to limit irrelevant results and benefit from the collective intelligence of previous searches.




Social Search

The interesting thing about social search engines is that they change search algorithms to include the human factor rather than depend solely on computer data. They not only include consumer-generated content, they can also include human intent. Collarity delivers search results with consumer-driven answers to queries and allows searchers to select various aspects of a search query. Rollyo allows users to create their own search engine roll, serving information from a preselected list of sites and/or from other users' rolls. The social dimension of the Web and search engines is a fast-growing phenomenon, and the major search engines are also experimenting with social search. We've had Yahoo Answers and Google Base in beta for a while, and Microsoft is reportedly negotiating with Eurekster for social search technology.

As search technology moves forward, new search models will continue to be launched. As the mobile Web and mobile search continue to expand, search queries will drive commerce around the world, anytime and anywhere, across platforms. This can only enhance the role of search in Internet marketing.

Monday, April 9, 2007

Advanced Link Building

Advanced Link Building
Hosted Content, The Quest for the Perfect Link
By Frederick Townes

Ask Google: search engines love links. Of course, they love some links more than others. For example, a simple link exchange (reciprocal link) doesn't have as much value to search engines, and so it doesn't receive the same weight as a non-reciprocal (one-way) link. The theory is that a one-way, inbound link is a recommendation from a site owner to visit the linked site; the link itself is a testament to the quality of the site being referred.


Article Syndication

In recent years, many sites have employed article syndication to develop links. These site owners write (or have written for them) articles of interest to a particular audience, then offer these articles to other relevant sites for free in exchange for a link back to the originator of the content in the "about the author" section of the article. In this way, a single site owner can submit dozens of articles for syndication, receiving an inbound link from each article in return for the free use of the content, and can watch other sites post the content virally to keep their pages fresh.

Sites need fresh content so many will happily display your article and provide a link to your site. It's a tried and true link building tactic. However, search engines are programmed to seek out the most natural, and therefore valuable, links they can find.

Articles are syndicated through sites like goarticles.com and ezinearticles.com. The standard display format is: headline, article body, then a small blurb about the author with a link back to the author's site. Since those links appear in the body of the page, they appear more valuable than most purchased or reciprocal links, which often sit at the bottom of a page column or in a footer surrounded by lots of other links. This is somewhat effective, but not necessarily the best way to acquire inbound links.

In addition, syndication leads to duplication when a single article appears on 10 sites all at the same time. This diminishes the quality of the text and the back link to the author's site. It's still more valuable than a plain link exchange, but search engines are placing less emphasis on syndicated content. So, what's a site owner to do?




Hosted Web Content

It goes by many different names: content swapping, advertorials, pre-sell pages, and hosted content are all basically the same idea.

The way hosted content works is that you, the author, pay a site owner to display your article. However, instead of the back links to your site coming at the end of the article, you embed those links in the body of the text, surrounded by your target keywords and content that is actually useful to the reader. In the "eyes" of a search engine, these are among the highest-valued back links.

Hosted content is basically renting a page on another site, with links to your site embedded in the main body of the article. The web site that hosts the content receives payment from the author plus fresh content, the author gets a valuable back link, and visitors to the hosting site get useful content.

This strategy isn't new. It's simply doing what search engines want us to do: produce content that's useful and beneficial and that appears on quality sites. Not only does a quality piece of content receive more visibility when hosted on an authoritative site, it also delivers increased benefit to the author, and the page may even rank on its own for target key phrases. When a major site hosts your content, you gain from its page rank as well as from strong testimonials and referrals. Whether or not site owners should monetize their sites by allowing approved authors to post content is the same debate as whether or not links should be bought and sold. However, publishing high-quality, unique, and useful content, rather than just inflating link popularity with diminishing returns, is a proven SEO tactic.




Designing a Hosted Content Page

You're paying for the placement of this content so you want it to be good. In the eternal quest for successful link bait, you also want the content to be ranked by search engines because it provides real value to the reader and is hosted on an authoritative site.

Design the hosted content page using standard SEO conventions: a keyword-savvy title, a header <h1>, subheads <h2>, and a keyword density of less than 5%. Any higher and search engines may consider the content spammy, regardless of where it appears.

Now comes the most important part. As you write the article, carefully place links to topically relevant pages on your own site within the body of the article's text. These are high-value links that will improve your SEO. However, it's also important to place your articles on sites that are topically related to your piece (and that probably already rank for related topics). The authority of the site hosting your content, the topical relevance of that site, and the back links themselves all make your site look stronger as far as search engines are concerned. Also remember that the quality of the pages you link to matters: link to strong pages (those with quality back links) on your own site as well. Your article should reference other authoritative, relevant articles so that search engines see that your piece was written to offer real value to readers.
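
For illustration, the body of a hosted content page might look like the sketch below, with the contextual links woven into the copy rather than appended at the end. The domain, page names, anchor text, and keyword are hypothetical placeholders.

<!-- hosted article with contextual, in-body links back to the author's site -->
<h1>Choosing an Affiliate Tracking Platform</h1>
<p>Most affiliate programs stand or fall on their reporting. A good primer on
<a href="http://www.example.com/affiliate-tracking.html">affiliate tracking software</a>
explains which metrics matter, and this overview of
<a href="http://www.example.com/commission-structures.html">affiliate commission structures</a>
shows how payouts are typically calculated.</p>
<h2>What to Look For</h2>
<p>...remaining article body, keeping overall keyword density under 5%...</p>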




It's Not Quantity, It's Quality

It's no longer simply a matter of how many links point to a site. There are many cases in which sites with 50 quality links outrank sites with hundreds of links. It's not the quantity but the quality of the links that improves ranking in the SERPs.

Editorial links (links in hosted content) are more "natural" from a search engine's perspective and, therefore, more valuable because the article has, at most, two or three targeted links pointing to your site's pages. Just like quality link bait, which is unique, original and useful content, quality hosted content on respected sites will also naturally develop its own back links - the ultimate validation and the desired outcome of placing quality content. Finally, because these links are found on pages optimized with your keywords, search engines will consider them extremely relevant to the subject at hand.




Start Your Hosted Content Campaign Today

It's being done every day, successfully building small sites into larger ones, providing free advertising for the thought-leader/author, delivering less duplicate content to search engines, and bringing more new content (plus revenue) to the hosting site. Perhaps most importantly, hosted content actually delivers useful, relevant information to readers, which is exactly what search engines want to rank in the first place. As with any link-building technique, hosted content can be abused, but topically authoritative sites are not going to accept content that does not meet their high standards, so everyone wins when the goals are white hat.

Start searching for websites that might be interested in hosting your next article, or start looking for a site owner interested in content swapping. Create content that's unique, useful, and well written, and you may find that you won't even have to pay a site owner to share your content with their readers, which is exactly how it should be.

Friday, April 6, 2007

Sitemaps Improve Site Value

Sitemaps Improve Site Value
By Lisa Barone

Getting your pages indexed is your most important SEO goal, and perhaps the one most vital to the success of your SEO campaign. However, many search engines have trouble finding links buried deep within the structure of your site. So how do you make sure your pages are easy for the search engines to find? With a sitemap. A sitemap provides the search engines with a one-stop shop for all of the pages on your site, and if designed correctly it can also be a valuable resource for lost visitors trying to understand your site structure.

What is a Sitemap?

A sitemap displays the inner framework and organization of your site's content to the search engines. Your sitemap should reflect the way visitors would intuitively work through your site. Years ago, sitemaps existed only as a boring series of links in list form. Today, they are thought of as an extension of your site. Use your sitemap as a tool to provide your visitors and the search engines with more content: create details for each section and sub-section with descriptive text placed under each sitemap link. This will help your visitors understand and navigate your site, and will also give the search engines more to feed on. You can even go crazy and add Flash to your sitemap, like we did with the interactive Bruce Clay sitemap! Of course, if you do include a Flash sitemap for your visitors, you will also need to include a text map so that the robots can read it.



A good sitemap will:

- Show a quick, easy-to-follow overview of your site.
- Provide a pathway for the search engine robots to follow.
- Provide text links to every page of your site.
- Quickly show visitors how to get where they need to go.
- Give visitors a short description of what they can expect to find on each page.
- Utilize important keyword phrases.



Why Are They Important?

Sitemaps are very important for two main reasons. First, your sitemap provides food for the search engine spiders that crawl your site. The sitemap gives the spider links to all the major pages of your site, allowing every page included on it to be indexed. This is a very good thing! Having all of your major pages in a search engine's database makes your site more likely to come up in the results when a user performs a query. Your sitemap points the search engine to the individual pages of your site instead of making it hunt around for links. A well-planned sitemap can ensure your Web site is fully indexed by search engines.

Sitemaps are also very valuable for your human visitors. They help them understand your site's structure and layout while giving them quick access to your entire site. A sitemap is also a lifeline for lost users: visitors who find themselves lost or stuck will begin to look for a way out, and a detailed sitemap will show them how to get back on track and find what they were looking for. Without it, they would simply close the browser or head back to the search engines. Conversion lost.




Tips for Creating a Sitemap

Your sitemap should be linked from your homepage. Linking it there ensures the search engines find it and then follow it all the way through the site; if it is only linked from deeper pages, the spider may hit a dead end along the way and simply quit.

Small sites can place every page on their sitemap, but larger sites should not. You do not want the search engines to see a never-ending list of links and assume you are a link farm. Most SEO experts believe you should have no more than 25 to 40 links on your sitemap, which also makes it easier to read for your human visitors. Remember, your sitemap is there to assist your visitors, not confuse them.

The title of each link should contain a keyword whenever possible and should link to the original page. We recommend writing a short description (10 to 25 words) under each link to help visitors learn what the page is about. Having short descriptions will also contribute to your depth of content with the search engines.

Once created, go back and make sure that all of your links are correct. If you have 15 pages listed on your sitemap, then all 15 links need to point to the right pages; otherwise both visitors and search engine spiders will find broken links and lose interest.
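
Putting these tips together, a bare-bones HTML sitemap might look like the sketch below. The page names, link titles, and descriptions are placeholders, and in practice you would keep the list to roughly 25 to 40 keyword-rich entries.

<h1>Site Map</h1>
<ul>
<li><a href="/affiliate-marketing-guide.html">Affiliate Marketing Guide</a><br>
A step-by-step introduction to choosing programs, tracking commissions, and promoting merchant offers.</li>
<li><a href="/seo-code-optimization.html">SEO Code Optimization Tutorial</a><br>
How to optimize title tags, meta data, headings, and keyword placement for better rankings.</li>
<li><a href="/contact.html">Contact Us</a><br>
Phone, e-mail, and mailing address for sales and support questions.</li>
<!-- one entry per major page, each with a 10-25 word description -->
</ul>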



Remember to Update!

Just as you can't leave your website to fend for itself, you can't neglect your sitemap. When your site changes, make sure your sitemap is updated to reflect that. What good are directions to a place that's been torn down? Keeping your sitemap current will make you an instant favorite with visitors and search engines alike.