Searchmetrics is a brilliant SEO tool; the amount of insight it gives on client and competitor sites is incredibly useful. One of my favourite uses, with some manipulation in Excel, is to run a quick rankings comparison report on your competitors so you can gain insight into what they’re ranking for, more importantly what they’re ranking for that you’re not, and how your site matches up against a full range of industry keywords.
For this sample report I’m going to take a look at some of the bigger sites in the insurance sector.
Other good insurance companies do exist, along with quite a few terrible ones.
Run each domain through Searchmetrics, generating a long-tail keyword report for each of the sites that you wish to compare.
Export and download each of these reports.
In Excel, create a separate sheet for each of the exports, along with a first sheet named ‘comparison’ – this is where all of the magic happens and where your data will be pulled in.
Paste each site’s data onto its own sheet, as well as cumulatively into the ‘comparison’ sheet.
Then under Data > Remove Duplicates remove duplicated keywords on the ‘comparison’ sheet.
Then delete the following columns in the ‘comparison’ sheet - URL, Pos, Title, and Traffic Index. This should leave just Keyword, Search Volume and CPC.
Next add columns for each of the sites that you wish to compare. This should leave you with a sheet that looks something like this.
Then, using VLOOKUP, you’ll need to pull the ranking data from the other sheets into the comparison sheet. So, for example, all of the rankings for Aviva will appear in column C.
The formula you’ll need is =VLOOKUP(A:A,Aviva!A:G,3,0). The easiest way to generate this is to use the insert formula function:
Lookup value – the value that you’re looking up, in this case column A, the keyword.
Table array – the table you’re finding the value in, which is the Aviva sheet; click in the table array entry field, then go to the Aviva sheet and highlight all of the columns.
Col index num – the column containing the data that you wish to import, so column 3, the ranking position.
Range lookup – enter FALSE or 0 here to find an exact match. This will cause #N/A to be returned if the site isn’t ranking for the keyword.
Repeat this for each site, then expand the selection by dragging the corner of the box down to apply the formula to each of the cells in the sheet.
Tidy the sheet up by formatting as a table, and (hopefully) you should have something that looks like this.
If the #N/A results are annoying, you can easily remove them by wrapping the VLOOKUP formula in IFERROR, e.g. =IFERROR(VLOOKUP(A:A,Aviva!A:G,3,0),""), which returns a blank cell instead.
You can also colour-code the rankings using conditional formatting.
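If you would rather script the whole process, the same comparison can be built outside of Excel. Below is a minimal pandas sketch; the CSV file names and the ‘Keyword’, ‘Search Volume’, ‘CPC’ and ‘Position’ column headers are assumptions for illustration, so adjust them to match your actual Searchmetrics exports.

```python
# A minimal sketch of the comparison sheet, assuming one CSV export
# per site with (hypothetical) "Keyword", "Search Volume", "CPC"
# and "Position" columns.
import pandas as pd

sites = ["aviva", "directline", "churchill"]
frames = {site: pd.read_csv(f"{site}.csv") for site in sites}

# Build the 'comparison' sheet: every keyword from every export,
# with duplicates removed (the Data > Remove Duplicates step).
comparison = (
    pd.concat(f[["Keyword", "Search Volume", "CPC"]] for f in frames.values())
    .drop_duplicates(subset="Keyword")
    .set_index("Keyword")
)

# Pull each site's ranking in, VLOOKUP-style; keywords a site
# doesn't rank for are left as NaN rather than #N/A.
for site, df in frames.items():
    comparison[site] = df.drop_duplicates(subset="Keyword").set_index("Keyword")["Position"]

comparison.to_csv("comparison.csv")
```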
If you would like to download this example sheet I have added it here - CompetitorReport
If you are at all familiar with the concept of classical conditioning, then you should understand why roughly half the webmasters in the world wince every time Matt Cutts (head of Google’s webspam team) mentions a change to the algorithms. We’ve been burned too many times by the likes of Penguin, Panda and the page layout (‘above the fold’) algorithm, and as such most of us treat his announcements a little like a trip to the dentist – with a lot of trepidation.
Well, if your website is called ‘www.oompadoo.com’ then you can breathe a sigh of relief – this time Cutts is overlooking you and giving you a bit more time to lick your wounds. This time Google is interested in targeting the owners of ‘www.buycheapfuronline.com’ and ‘www.bestbodybuildingarticles.com’. That’s right – ‘exact name domains’ or ‘exact match domains’: URLs designed to precisely mimic the phrases people are searching for. According to a tweet from Cutts this will only affect 0.6% of English queries – though, as we know, these innocuous-sounding statistics can send out fairly devastating shockwaves.
Why This Change?
Of course the reason for this change is that many sites that use ENDs do so in lieu of actual good content. This is an easy way for a site to get to the top of the SERPs and so in many cases the quality content simply isn’t there to back it up. At the same time this strategy lends itself to sites that don’t have very diverse content but rather simply focus on answering a single question in order to get AdSense revenue.
In fact this is something that has been on Google’s agenda for a while now, and not so long ago a foreboding announcement came that Google would be favouring websites that focussed on building a brand for themselves with a recognizable name and image rather than one-hit wonders. Of course this direction wouldn’t favour ENDs.
What Does This Mean?
It’s worth noting that Cutts’ tweet also stated that the change targets low-quality exact match domains – but of course there is likely to be some collateral damage, and some perfectly good sites are likely to see their rankings drop too. Some sites, of course, use ENDs simply because they were available, and some business names happen to be great keyphrases.
That said, this will likely call a stop to people buying up keyword domains and selling them on, and it might level the playing field for those sites that have more obscure and original URLs (that said, ENDs will still have some value due to direct traffic, which Google can’t control). For every person who will be angry at the changes, there will be a new opportunity created for webmasters to jump in and fill a void at the top of the SERPs. Whatever else you say about Panda and Penguin, they do seem to have reduced the number of spam sites that come up, and this does make for a better browsing experience…
So looks like this time ENDs haven’t made the most recent Cutts. But the real question still lingers… could bad puns be next? (Then I’m in trouble…)
The author of this article, Jeet, is an avid blogger and SEO analyst who often writes guest posts in the SEO niche. He founded GetLinksPro, a link-building and SEO company, and shares his SEO knowledge and tips on Twitter – you can follow him @getlinkspro.
September 28th, 2012.
“Good visual hierarchy isn’t about wild and crazy graphics or the newest photoshop filters, it’s about organising information in a way that’s usable, accessible, and logical to the everyday site visitor.”
Brandon Jones on Sep 28th 2011 www.webdesign.tutsplus.com
The basis of design is communication – relaying a message or inducing a reaction, a call or a click. In traditional media advertising, designers were fixated on the 3-second rule, whereby an advert has 3 seconds to get its message across above the other adverts on the page.
The 3-second rule dictates that media adverts should incorporate bold, simple and clear messages and images. When implemented, this often led to aggressive layouts that were not pushing the envelope of design and could actually be classed as a de-evolution of design. The finished result was an advert that was not necessarily aesthetically pleasing, but one that brings into focus the importance of design hierarchy.
What does the user need to see in order?
- Sector / Product / Service – Industrial and Commercial Flooring (heading and image).
- Key information usually displayed as bullet points for ease of reading.
- Area(s) the company covers or is based in (Loughborough).
- Local Contact Number (01509 000000).
What do you want them to do?
- Recognise the sector, service or product being supplied by the company.
- Gain reassurance that the company is local and supplies what they are looking for.
- Call the telephone number.
This hierarchy principle continues today and forms the basis of all media advertising.
Similar time frames have been quoted for viewing websites, but you very rarely see multiple websites shown side by side advertising the same sector.
Website design has increasingly required multiple messages or design prompts that navigate people around a site to a desired location or outcome, meaning hierarchy has become an even more vital part of design.
Once a website is opened it competes against itself, with its own messages and design prompts. A site will succeed or fail according to how well the design functions for the user. This is also dependent on what the user requires from the site. Getting relevant traffic to your site overrides all visual stimuli, but that’s a whole new subject that I won’t get into now.
Good hierarchy in web design should dictate what you want the user to read, in what order, and where you want them to go. The destination or conclusion depends on the requirements and reasons of the user, and most websites contain information for multiple products or services, each with its own required action – a call, a click, or simply renewed confidence in the company, for example.
“The human brain has innate organizing tendencies that ‘structure individual elements, shapes or forms into a coherent, organized whole.’”
Jackson, Ian. “Gestalt—A Learning Theory for Graphic Design Education.” International Journal of Art and Design Education. Volume 27. Issue 1 (2008): 63-69. Digital.
The human brain does not view individual items on their own merit; instead it organises them against the items around them. Items are instantly judged and ordered, and size, shape and colour inform us that some items have more dominance than others. Understanding these principles allows us to form hierarchy within designs and use it to control the user’s pathway through the design.
Brandon Jones explains 5 steps for analysing hierarchy in his post on www.webdesign.tutsplus.com:
Take a website you want to analyse and follow these 5 steps:
- List the key information points that visitors are likely seeking.
- Assign values (1-10) according to their importance to the average visitor.
- Now, look at the actual design again.
- Assign values (1-10) according to the actual visual importance as you see it in the live design.
- Consider: Does the expected importance match up with the actual designed importance?
In most cases, the answer will include shades of “no”. There are lots of reasons for this – client demands, inexperienced designers, design-by-committee – or a hundred other reasons that you’ve probably read.
Brandon Jones on Sep 28th 2011, www.webdesign.tutsplus.com
The designer’s role is to take the required information and break it down into visually relevant and easily digestible portions, while taking into account the goals and messages of the design. It is not enough to just lay out the information; it has to work.
In general, service or product sites tend to work around USPs (unique selling points) and CTAs (calls to action). Two or three USPs encourage the user to have confidence in the design, and one or two CTAs encourage goal conversions. These have to be displayed around a main-message USP that reassures the user that the site holds the information or product they are looking for.
Unfortunately, design input rarely comes from one direction. In an ideal world sites would be carved from pure creativity while using god-like functionality and subconscious triggers all based around the hierarchy mentioned earlier. In reality, the client will always want the logo bigger and will always ask “can you just make these 12 things all stand out more?”
The struggle continues…
“This post was written by James O’Flaherty on behalf of Adtrak”
February 15th, 2012.
The star ratings that you often see in Google ads are known as seller extensions. These are now likely to appear in the paid, organic and shopping results. The ratings are generated when product reviews are submitted either on third-party sites such as ReeVoo or TrustPilot, or when Schema.org mark-up is used to tag internal/on-site reviews.
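For the on-site route, the sketch below shows one common way to express review mark-up using Schema.org’s AggregateRating type, rendered as JSON-LD from Python purely for illustration; the product name and rating figures are entirely hypothetical.

```python
# A sketch of Schema.org review mark-up for on-site ratings, expressed
# as JSON-LD. The product name and figures are hypothetical.
import json

review_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128",
    },
}

# The snippet would sit in the product page inside a script tag:
print('<script type="application/ld+json">')
print(json.dumps(review_markup, indent=2))
print("</script>")
```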
It is often cited that these star ratings can improve click-through rates by as much as 30%. This will not only increase both organic and paid visitors; an increase in PPC click-through rate is also likely to reduce your overall cost per click, since click-through rate feeds into Quality Score, and a higher Quality Score lowers the bid needed to hold the same ad position.
Now, while the effects of these are obviously positive when dealing with generic searches, consider the impact on organic brand traffic when seller extensions appeared for one of our client’s brand searches.
As you can see, organic brand traffic fell by around 49%. Overall brand traffic remained at around the same level; the client was now just paying for a much larger proportion of it via their own PPC ads.
The obvious solution in this case is to turn off the PPC ads for brand search terms. However, in this specific case the situation is compounded by other (legitimate and non-legitimate) companies bidding on the brand term. This includes Amazon, an approved distributor who also benefits from seller extensions in their own PPC ad, so turning off the client’s brand ads would probably result in a large share of their own brand traffic diverting to the Amazon result.
So what can be learnt from this?
- Seller extensions can deliver a dramatic uplift in click-through rate
- Protect your brand/trademark results from unauthorised bidders
- Prevent affiliates from bidding on your trademarked terms
- Google are making a lot of money from selling companies their own brand traffic
October 24th, 2011.
Google’s claim that 16% of the more than a billion queries entered every day have never been seen before may sound hard to believe, but perhaps a closer look at how people search online is warranted first. Google has handled 450 billion new, unique queries since 2003. All of this raises the question: what are users doing that results in such a large number of new and unique queries each day?
Firstly we need to look at how people actually use search engines. In their early experiences with search portals, users tend to enter short, generic terms. As users become more skilled at searching for the items or information they want, their search terms become more specific and descriptive.
Instead of using short, generic keywords when searching for a pair of shoes, for instance, the user might be inclined to describe the type of shoes they are looking for using far more adjectives – e.g. light brown, leather, high-heeled ladies’ court shoes – in the hope of getting exactly what they want.
It is also worth considering the search buying cycle as this especially impacts upon conversions.
Firstly, think about how you yourself might behave online when researching a product to buy.
Take a typical online purchase of something like a television. You might start with a search query for a very general phrase like TV or television. You’ll see that there are several results that are irrelevant for our purpose, such as those for the BBC and ITV, but using the informational properties such as Wikipedia, or the Google shopping results, you may then decide that you’re looking for a plasma TV rather than an LCD TV.
Of course you may also decide to visit one of the commercial websites listed for these queries, or buy from the PPC listings, but it’s more likely you’ll want to research a bit more first.
Next you’ll probably search for plasma TV. This is looking a bit more promising: there are several relevant shopping results, some review websites and a few more relevant commercial sites appearing. After reading a few of the sites you decide that the Panasonic 50PZ800B looks fairly impressive and you want to find out a bit more about it.
Of course you search for it, possibly adding terms like review, test or comparison to bring up the more informational resources.
It’s about now that you feel happy with your choice: you have compared it against other makes and models, you’re satisfied that it’s what you’re looking for and you want to go ahead and purchase.
To find online shops selling that specific model you may use buying-trigger search terms such as buy or cheap, or possibly even add geographic terms such as London or UK.
As a site owner you need to be prepared to target as many of these longer-tail phrases as you can with your main site – no easy task when you don’t even know what they are!
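To get a feel for just how quickly these variations multiply, the short sketch below enumerates phrase combinations in the way described above; the seed terms and modifiers are purely illustrative.

```python
# An illustrative sketch: expanding seed product terms into long-tail
# variations. The modifier lists are hypothetical examples, not a
# definitive taxonomy of buying-cycle terms.
from itertools import product

products = ["plasma tv", "panasonic 50pz800b"]
intents = ["", "review", "comparison", "buy", "cheap"]
places = ["", "uk", "london"]

phrases = sorted(
    " ".join(part for part in (intent, term, place) if part)
    for term, intent, place in product(products, intents, places)
)

print(f"{len(phrases)} phrases from just 2 seed terms:")
for phrase in phrases:
    print(phrase)
```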
Try to develop good (great) content on your site, category and product pages warrant special attention for this. Getting this right will result in high levels of targeted, focused, converting visitors.
April 13th, 2011.
Using non-standard characters in the page title and meta description tags seems to be a growing trend in many industries. The idea is that eye-catching non-standard characters draw readers’ attention to your result first, even in preference to results that may sit above it.
The practice of optimising search results to maximise click-through rate is not a new one and has been used in PPC advertising to good effect for years; but where PPC ads have to go through an approval process (in which many techniques are outlawed), meta descriptions and organic results do not, so the boundaries can be pushed much further.
Who Is Doing It?
How Do I Use Special Characters In My Title And Description?
The use of many characters seems to be a matter of trial and error. John Campbell is a man with far more patience than I, and he has tested the indexing of many special characters.
Special characters can be created using HTML entity references to Unicode characters, such as:
© is created with: &#169;
® is created with: &#174;
™ is created with: &#8482;
A full list of Unicode characters can be found on Wikipedia.
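So, to take a hypothetical example, a page title displayed as Acme Widgets™ would be written in the HTML as <title>Acme Widgets&#8482;</title> – the entity code in the source, the symbol in the search results.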
What Are The Effects?
Currently the evidence for any increase in click-through rate is largely anecdotal. It would be difficult to test definitively, as there are several other variables to factor in.
Shaun Anderson at Hobo is well respected within the industry for running extensive tests on theories rather than relying on guesswork, and he is running one of the tests shown above:
It’s incredibly hard to test the impact of this on SERPS in an accurate manner. I am currently running some tests on pages on my site. You need a page with stable rankings, and a stable flow of traffic to get exact results, and that’s kind of difficult with the ever-fluctuation of Google SERPS and how changes to the UI (based on query or geo-location – for instance) impact your rankings and clicks on a daily basis – over time – in a natural way. Special characters in snippets certainly get noticed and commented upon, that’s for sure. Once you rank, GETTING CLICKED is what it is all about – every little thing that might help, should be tested on for size. You can get away with a lot in terms of getting special characters in your snippet DESCRIPTION – but not so much in your TITLE link description (Google strips out some special characters from this element if you try it).
I was also lucky enough to hear from Craig Parker at Soula.com who has conducted some tests of his own.
In a short test I ran on a UK based e-commerce site I found implementing special characters in title tags had a small positive effect on click-through but this was not statistically significant, after around a week it caused a small negative change in [Google] rankings.
Implementing special characters in the meta was difficult to get indexed/displayed on the SERPs and provided a very minimal increase, again not statistically significant.
The Bigger Picture…
The largest problem with this technique is that the more people use it, the less effective it becomes. How long until our search results pages look like this, and nobody derives any benefit from it?
The new and improved version
What are your thoughts on this?
February 24th, 2011.
The current trend of newspaper sites publishing their content behind paywalls seems to be gathering speed. The recent announcement of Google’s OnePass payment system can only accelerate the process by making payment technology available to a wider audience.
I thought it would be interesting to look at how the move to paywalls has affected the news sites’ backlink acquisition rates.
So far, the main newspapers to have added paywalls are:
- The Financial Times – 2002
- Moneyweek – 2005
- The Times and The Sunday Times – April 2010
- The News Of The World – November 2010
- The Telegraph is set to add a paywall in September 2011
Taking the two most recent examples, The Times and The News Of The World, and using the excellent Majestic SEO graph functionality, we are able to see the changes in their backlink acquisition rates.
The results for The Times are similar, but less dramatic. This is slightly more confusing, as the paywall coincided with a domain change from timesonline.co.uk to thetimes.co.uk. We can see clearly that link gains to the old URL start to decline without the new domain ever really gaining links at a comparable rate.
Where I see some really interesting data is in the acquisition rate of competitor sites that chose not to implement a paywall. A close online and offline competitor to both The Times and the NOTW is The Daily Mail.
Their acquisition rate starts to climb sharply from the date The Times paywall went live, and their highest ever month coincides with the NOTW adding its paywall. It’ll be interesting to see whether the following two low months, December and January, are a result of incomplete link data or some other trend.
It will be interesting to see whether the final few content producers within a market start to perform far better, in terms of finance and popularity, than those that eventually choose to follow the paywall route.