Monthly Archives: November 2012

Put people first and the keywords will follow

On the road to better search rankings, the temptation to chase keywords promising high volumes of search traffic is alluring.  But while that might seem to offer the quickest route to the prize of more customers online, time and time again these goals prove unrealistic.  Numbers-based search campaign planning pays no attention to the intent behind those keywords, i.e. it lacks an understanding of the people who would actually buy from you, which is crucial if you want a positive return on your marketing spend.

Put yourself in their shoes

The first step in improving your search performance is to get a better understanding of the people who want to find you from their search bar.  As marketing managers have always known, this calls for figuring out exactly which segment of people you are trying to reach.  Only with an understanding of who these people are can you target your content to appeal to their needs, and only with demographic data such as age, sex and location do you have the background to understand their search intent.

To understand your client base’s thinking, you have to know what it’s like in their shoes. Demographic profiles help you work out who your audience’s role models are likely to be and what media they consume. Whether that’s girly magazines, the nine o’clock news or Creative Review, your direct competitors are likely to be positioning their wares to that audience. This exposure, whether it’s advertorial or an editorial feature, influences how people think, discuss and search for the products or services that you sell.

Misleading shortcuts

With knowledge of how and where your target customers are likely to talk about your brand, you not only have a much deeper understanding of the types of phrases used by your core buyers, you will hopefully also know what it is they value about your products and services, and why they prefer your brand over the competition.  It is high time to dump the notion that a quick trip to the Google Analytics API is in any way sufficient; you will simply be competing with all the other traffic chasing the same set of keywords, and that is a race you cannot win.

It’s a fair bet to surmise that Google itself, through its numerous service updates, has been making the point that a focus on quick and easy content or keywords is in no way sustainable.

Where is my business best placed?

Over the long haul, the clever money is on creating content that fits with what your customers are already in tune with, and putting it in the right place.  Where you don’t have a first-to-market advantage, look at what your competitors are writing and the phrases they are using, and ask yourself whether you could do better. Only with that in mind do you know the key phrases and topics you should be writing about.

The future of SEO and distribution sites

Google’s bid to become the only gateway to the web is hitting some serious snags from unethical SEO companies. These firms try to manipulate search rankings using a number of illicit practices, including the creation of “shadow” domains that funnel users to a site via deceptive redirects. Another trick is to place “doorway” pages loaded with keywords somewhere on the client’s site.

There are also newer schemes that duplicate content across a wide range of sites. Google, however, is fighting back firmly with its Panda and Penguin updates, which aim to improve the relevance of search results.

Having read a number of conflicting articles, I have come to the conclusion that the outcome of the SEO debate can be predicted using good old-fashioned non-tech common sense and simple economics. Each day around 150,000 new URLs are added to the web, and Google has about 900,000 servers as I write this blog. Imagine the electricity bill on nearly a million servers. Getting back to the maths: as with any plc, you are looking to grow revenues but also manage costs, and having thousands of servers churning through spam and duplicate content is a waste of time and money. It also makes your product less valuable to the user.

Google wants to make its search engine as relevant as possible by displaying content that is genuinely interesting and that the user actually wants to find, while SEO firms want to position their clients at the top of Google searches because they are paid to do so. The two objectives are incompatible; the only sustainable solution is a long-term commitment to building engaging content that is genuinely worth sharing.

The dark arts of SEO have a limited shelf life, and there is a real danger that when Google is ready, those sites that have immersed themselves in numerous backlinks on spam press-distribution sites could pay a heavy price. In this video, the vice president of PR Newswire talks about the recent lowering of his own website’s rankings following the Google Panda update, which affects only 14% of searches. My belief is that these businesses are no longer sustainable, not because they are not clever but because they don’t make sense.

In most instances these newswires are distributing paid content disguised as valuable content and uploading it onto the pages of news sites, usually tucked away out of sight. Some of these are not even proper news sites but a collection of template sites styled as news sites, all owned by the same business: usually, you’ve guessed it, the same business that sold you the press distribution in the first place. Because some of these sites have enmeshed themselves with more popular sites, they may have some protection in the short term. But in the long term, proper news sites are not going to want to carry such large volumes of duplicate content if they are themselves to have a sustainable future.

Duplicate content itself is probably another signal Google will use to identify and remove such content from its searches. The second problem SEO firms face is that they don’t actually know how Google’s algorithm works. Google does not hold secret meetings with tech people to explain its algorithm and then keep it secret from ordinary people. Again, using old-fashioned common sense: if you were a billion-pound organisation and your business was built around an algorithm, would you disclose 100% of how it works? Google does publish a webmaster guide on its own site to optimising your site for Google, and for the most part it’s fairly simple.
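Google has never published how it spots duplicate content, but a common textbook technique is to break pages into overlapping word “shingles” and compare their overlap. The sketch below is purely illustrative, not Google’s actual method; the function names and the 0.8 threshold are my own assumptions.

```python
# Hypothetical sketch of duplicate-content detection using word
# shingles and Jaccard similarity. This is the generic textbook
# technique, NOT Google's actual (unpublished) algorithm.

def shingles(text, k=4):
    """Return the set of k-word shingles (overlapping phrases) in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Similarity of two shingle sets: |A intersect B| / |A union B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def is_duplicate(page_a, page_b, threshold=0.8):
    """Flag two pages as duplicates if their shingle sets mostly overlap."""
    return jaccard(shingles(page_a), shingles(page_b)) >= threshold
```

A press release syndicated verbatim across dozens of template “news” sites would score near 1.0 against the original, which is exactly the pattern a search engine could cheaply filter out.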

Google uses around 200 different signals as part of its overall search calculation, one of which is PageRank, which is determined by the number of links to your site from relevant sites. In my view, Google must be aware of the free distribution sites out there, and it’s only a matter of time before they start being excluded from search results, or worse still, the people posting to them get removed from search results. The logical thinking here is that recent links from independent sites producing unique, often-shared content will be seen as most influential. So a link from The Economist’s website to your own within the last six months will be far more valuable than hundreds from free sites that just upload duplicate content.
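To see why one link from a well-connected site can outweigh hundreds from low-value ones, here is the standard published power-iteration form of PageRank on a tiny made-up link graph. The graph and damping factor are illustrative assumptions; the real weighting inside Google’s 200-odd signals is not public.

```python
# Toy PageRank via power iteration (the published textbook form,
# not Google's production implementation).

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outlink passes an equal share of this page's rank.
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank
```

Because each page splits its rank among its outlinks, a spam site that sprays hundreds of links passes almost nothing per link, while a single link from a page that hoards rank is worth far more.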

The future of getting your site more hits will rely on creating good content often and getting that content onto other popular sites, but only if it is likely to be shared by your target audience.  If you think back to the recent Red Bull-sponsored jump from the edge of space and how often it was shared, it is clear how powerful good, original content can be. This publicity stunt was especially interesting from a marketing perspective because the platform became secondary to the content, and it produced some of the most shared tweets and videos of 2012.

The moral of this blog is not to be baffled by tech speak. If you are using an SEO firm, make sure you ask them how they intend to work on your site, so you can distinguish between a genuine SEO expert and an SEO cowboy.