October 24th, 2011.
Google's claim that 16% of the more than a billion queries entered every day have never been seen before may sound hard to believe, but a closer look at how people search online makes it more plausible. Google has handled 450 billion new, unique queries since 2003. All of this raises the question: what are users doing that produces so many new, unique queries each day?
Firstly we need to look at how people actually use search engines. In their early experiences with search portals, users tend to enter short, generic terms. As users become more skilled at finding the items or information they want, their search terms become more specific and descriptive.
When searching for a pair of shoes, for instance, rather than a short, generic keyword the user might describe the type of shoes they are looking for with far more adjectives, e.g. light brown, leather, high-heeled ladies' court shoes, in the hope of getting exactly what they want.
It is also worth considering the search buying cycle as this especially impacts upon conversions.
Firstly think about how you yourself might behave online when you’re researching buying a product.
Take a typical online purchase, such as a television. You might start with a very general search query like TV or television. You'll see several results that are irrelevant for your purpose, such as the BBC and ITV, but using informational properties such as Wikipedia, or the Google shopping results, you may then decide that you're looking for a plasma TV rather than an LCD TV.
Of course you may also decide to visit one of the commercial websites listed for these queries, or buy from the PPC listings, but it’s more likely you’ll want to research a bit more first.
Next you'll probably search for plasma TV. This looks a bit more promising: there are several relevant shopping results, some review websites and a few more relevant commercial sites appearing. After reading a few of the sites you decide that the Panasonic 50PZ800B looks fairly impressive and you want to find out a bit more about it.
Of course you search for it, possibly adding terms like review, test or comparison to bring up the more informational resources.
It's about now that you feel happy with your choice: you have compared it against other makes and models, you're satisfied it's what you're looking for, and you want to go ahead and purchase.
To find online shops selling that specific model you may use buying trigger search terms such as buy or cheap, or possibly even adding geographic search terms such as London or UK.
As a site owner you need to be prepared to target as many of these longer-tail phrases as you can with your main site, which is no easy task when you don't even know what they are!
Try to develop good (ideally great) content on your site; category and product pages warrant special attention here. Getting this right will bring high levels of targeted, focused, converting visitors.
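The buying-cycle stages above can be sketched as a simple query-intent classifier. This is only an illustration: the trigger-word lists are my own assumptions, not an exhaustive taxonomy of search intent.

```python
# Rough query-intent classifier based on the buying-cycle trigger
# terms discussed above. The keyword lists are illustrative only.
RESEARCH_TRIGGERS = {"review", "test", "comparison", "vs"}
BUYING_TRIGGERS = {"buy", "cheap", "price", "deal"}

def classify_query(query: str) -> str:
    words = set(query.lower().split())
    if words & BUYING_TRIGGERS:
        return "transactional"          # ready to purchase
    if words & RESEARCH_TRIGGERS:
        return "informational"          # still comparing options
    if len(words) == 1:
        return "generic"                # early, broad search
    return "navigational/specific"

print(classify_query("tv"))                         # generic
print(classify_query("panasonic 50PZ800B review"))  # informational
print(classify_query("buy panasonic 50PZ800B uk"))  # transactional
```

In practice you would mine your own analytics data for trigger terms rather than hard-coding a list, but the principle of segmenting queries by buying-cycle stage is the same.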
October 20th, 2011.
Trusted Stores is an ecommerce certification program that Google launched early in October. The idea behind the program is to give people more assurance when buying from online retailers. The program is still in beta; ecommerce stores that attain Google qualification will be able to add a badge to their site proclaiming them a Google Trusted Store. More interestingly, the program is backed by a consumer purchase protection package worth $1,000.
Retailers interested in applying to become a Trusted Store will need to furnish Google with certain consumer information, as the company is of the opinion that retailers' data is more trustworthy than customer surveys. To qualify for Trusted Stores status, internet retailers will need to demonstrate good customer service and a record of shipping goods on time. In terms of customer service, retailers must show evidence of resolving customer issues and disputes in a timely manner.
When customers move their mouse over the Trusted Stores badge, they will see the store's customer service and shipping grades. Unlike Google Checkout, the company states, there is no connection between the new program and Google AdWords. Google further reiterated that the program is still in its early stages and that it is too soon to speculate on how it might be enhanced and expanded.
The purchase protection package mentioned earlier appears to work in a similar way to credit card companies extending manufacturers' purchase warranties. Google, however, does not offer guarantees; rather, the $1,000 is potential money back where a retailer fails to resolve a problem. Customers can only benefit from this package if they have chosen the free purchase protection option. The consumer should contact the retailer first when there is a problem; if it is not resolved, the customer can call on Google to deal with it, or claim money back. Google, of course, is well placed to push retailers towards quick resolutions.
While Google have stated that their motive for introducing the program was to increase buyer confidence in online retailers, some may suspect the company of having hidden motives. Notions of a future tie-in with Checkout or AdWords are, at the moment, pure speculation. As yet it's unclear precisely what data Google will be capturing, but if customers choose the purchase protection, Google is more likely to end up with a record of the transactions.
August 25th, 2011.
I've been playing about a bit with Google Plus posts this morning, and with the recent share of Vic Gundotra's Icon Ambulance post I know a lot of people have been viewing the same page. That led me to dig a little deeper into Google Plus pages.
Take a look at the source code of the cached version of this page: scroll down and you'll notice a lot of names appearing in the source code within the <span class="To"> tag. This tag appears to contain the names of almost everyone who has shared the post, and in this particular case that is a lot of names. On the page this either appears as:
or in some cases:
I'm not yet able to determine why some pages display some of this text and others don't; it doesn't appear to be influenced by the number of shares, comments or the age of the post from what I've seen. In any case the page still contains a list of names hidden from view:
In order to determine whether Google Plus pages were ranking for people’s names included in the hidden text I decided to run a small experiment. I took this Google Plus post from Matt Cutts and decided to check the rankings of the first 2 pages of Google UK for 38 of the names included in this span tag:
Out of the 38 names I tested, only 2 ranked this URL within the top 20 results. That isn't a massive feat, but I'm sure we'd see more results if we rolled this out across the thousands of post URLs indexed, or expanded the depth past the second page of search results.
This goes to show that the usernames contained in the hidden text can (and do) rank, which may be a violation of Google's guidelines on hidden text and links.
Now I'm 100% positive that this isn't deliberate; I think it is simply a classic case of developer oversight, and another classic example of why SEO needs to be baked into the development process from the very beginning, no matter how big an organisation you are!
…a wheelbarrow in an open field that you drag along every day filling it with this and that – each thing you add to it has some significance and some use.
Now imagine you never empty the wheelbarrow. Each day, not only do the things you found the week before now lie at the bottom covered by the newest additions, but the device also becomes increasingly heavy to pull until eventually, it becomes almost impossible.
Now think of the wheelbarrow as your website, and think of its contents as the factors affecting its speed – Let’s explore these factors…
- Empty spaces between code (whitespace adds to file size and therefore download time)
- Missing tags (Causing internal errors & bugs in the site)
- Bulky HTML (such as using unnecessary markup where something leaner would work better, e.g. the deprecated <font> tag rather than just <small> or a simple CSS rule)
- Background colour being the same as text colour (making all text unreadable)
- Hyperlinks that fail (Devaluing your site in terms of credibility, and possibly increasing bounce rates)
- Missing images
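As a rough illustration of the first point in the list, here is a minimal whitespace-stripping pass in Python. This is a sketch only: real minifiers do far more, and a naive regex like this can mangle content inside <pre> or <textarea> blocks.

```python
import re

def strip_inter_tag_whitespace(html: str) -> str:
    # Collapse runs of whitespace between tags. Real minifiers are
    # considerably more careful (e.g. preserving <pre> content).
    return re.sub(r">\s+<", "><", html).strip()

page = """
<ul>
    <li>one</li>
    <li>two</li>
</ul>
"""
print(strip_inter_tag_whitespace(page))  # <ul><li>one</li><li>two</li></ul>
```

Every byte removed is a byte the browser doesn't have to download, which is the whole point of the wheelbarrow analogy.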
An overload of HTTP requests:
Whenever your web browser fetches a file from a web server, for example when it loads a picture, it does this by using HTTP which stands for “HyperText Transfer Protocol”.
HTTP is the mechanism by which your computer requests a particular file. One example is a request for 'home.html' (the homepage of a particular website). The web server then sends a response that says something like "Here's the file you asked for", followed by the actual file itself.
Understandably, if your server is receiving a very high volume of requests for a range of different things, such as pictures, graphics, photographs, music players and video rendering, it can take its toll and end up really slowing your website down.
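One rough way to gauge this is to count how many separate files a page asks the browser to fetch. The sketch below uses Python's standard html.parser module to tally the tags that typically trigger an extra HTTP request (the sample page is invented for illustration):

```python
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    """Counts tags that typically cause an extra HTTP request."""
    FETCH_TAGS = {"img": "src", "script": "src", "link": "href"}

    def __init__(self):
        super().__init__()
        self.requests = 0

    def handle_starttag(self, tag, attrs):
        wanted = self.FETCH_TAGS.get(tag)
        if wanted and any(name == wanted for name, _ in attrs):
            self.requests += 1

page = """
<html><head>
  <link rel="stylesheet" href="style.css">
  <script src="app.js"></script>
</head><body>
  <img src="logo.png"><img src="photo.jpg">
</body></html>
"""
counter = RequestCounter()
counter.feed(page)
print(counter.requests)  # 4 extra requests beyond the HTML itself
```

Combining stylesheets and scripts, or using image sprites, brings that number down and speeds the page up.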
Too many cookies:
HTTP Cookies are used mainly for personalization and authentication purposes. A series of saved information is exchanged between the web server and the browser in order to remember things about how you are using the internet. For example if you are shopping online and exit the website returning at a later date, a cookie will enable the site to remember what you had in your shopping cart so you don’t have to spend time finding the same items again.
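At the protocol level a cookie is just a pair of headers. Python's standard http.cookies module shows the shape of the exchange described above (the cart value is a made-up example):

```python
from http.cookies import SimpleCookie

# Server side: set a cookie remembering the visitor's cart.
response_cookie = SimpleCookie()
response_cookie["cart"] = "panasonic-50PZ800B"
response_cookie["cart"]["path"] = "/"
print(response_cookie.output())  # the Set-Cookie header sent to the browser

# Browser side: on the next visit the browser sends the value back,
# and the server parses it out of the Cookie request header.
request_cookie = SimpleCookie("cart=panasonic-50PZ800B")
print(request_cookie["cart"].value)
```

Each cookie is re-sent with every request to the site, which is why piling up large cookies adds weight to every single round trip.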
Bad web hosting:
Web hosting is the business of providing storage space and access for websites. Bad web hosting happens when that storage space is overloaded with too many websites; yours is added to the list and so runs slowly. Other issues caused by a bad web host include:
- Search engines being unable to crawl your site resulting in a fall in Search Rank
- Your website being "down" (not working, returning errors or timing out)
- Not being able to contact your web host to fix the issue (since the service is so bad the system has probably crashed)
Excess of external media:
Embedded YouTube videos, in fact embedded anything coming from another website, can potentially slow yours down. When you embed something from another site, you are relying on that site's web server, that site's speed, and that site's ability to keep the embedded item working properly there, so that it works properly on your site. Often, even when it works just fine, it can add an extra few seconds to a page's loading time... a few seconds a potential customer may be unwilling to wait!
Too much spam:
Spam is so much more than just a bunch of annoying emails. It slows down the internet and it increases consumer fees.
The internet is a network, and spamming affects everyone who uses it. Pushing spam around the internet relies on a process: it begins with global networks passing the spam along to its destination, and ends with the message being received by the recipient.
Simultaneously, time, money and resources are spent trying to catch spammers and prevent them from infiltrating mail servers, resulting in higher costs to the consumer, because providers are forced to add more security to their servers and hire more staff to manage and prevent the problem.
Be sure to spam proof all web forms by adding “captchas” or similar.
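Captchas are one option; another common lightweight trick is a honeypot field, an input hidden from humans with CSS that simple spam bots fill in anyway. A minimal sketch of the server-side check (the field name "website_url" is an arbitrary choice for illustration):

```python
def is_probably_spam(form_data: dict) -> bool:
    # "website_url" is a decoy field hidden with CSS; real visitors
    # never see it, so any value here suggests an automated submission.
    return bool(form_data.get("website_url", "").strip())

print(is_probably_spam({"name": "Ann", "message": "Hi", "website_url": ""}))
print(is_probably_spam({"name": "Bot", "message": "Buy!", "website_url": "spam.example"}))
```

Unlike a captcha, a honeypot adds no friction for genuine visitors, though determined spammers can learn to skip the decoy field.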
A missing favicon:
A 'favicon' is an image (as shown above) that sits in the root of your server. It's definitely needed, because even if you don't care about favicons, the browser still requests one. If there isn't one, the server responds with a 404 error (meaning not found). Any extra response, such as a 404 error or a 301 redirect, is an extra round trip that adds time to the loading of a site.
This image, or the lack of it, affects the loading sequence: the browser requests the favicon alongside the page's other components, so if there isn't one, one of those early downloads is simply a wasted error response.
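A toy model of the server's file lookup shows how a missing favicon turns the browser's automatic request into a pointless 404 round trip (file names and contents are invented for illustration):

```python
# Toy model of a server's file lookup. Browsers request /favicon.ico
# automatically, so a missing file costs an extra error response.
def respond(files: dict, path: str) -> tuple:
    if path in files:
        return (200, files[path])
    return (404, "Not Found")

without_favicon = {"/home.html": "<html>...</html>"}
with_favicon = dict(without_favicon, **{"/favicon.ico": b"icon-bytes"})

print(respond(without_favicon, "/favicon.ico"))     # (404, 'Not Found')
print(respond(with_favicon, "/favicon.ico")[0])     # 200
```

Dropping even a tiny favicon.ico into the server root avoids the repeated 404s.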
Too many advertisements:
Any time a site uses advertisements, you are adding to the processes the site goes through in order to function correctly. Programs like Google AdSense and Microsoft adCenter are external, and reputable; however, it is logical to apply the same rule as with external media: everything in moderation. Besides, sites with too many ads look un"site"ly!
If any of these apply to you, take active steps to protect your website against sloth! Speed be with you!
July 8th, 2011.
1. You didn’t explain exactly what it was that you wanted…
Did the SEO agency you chose actually understand what it is you do? Did you assume they would? I bet you did! That was a rookie error: just because they know SEO, it doesn't mean they'll automatically know all of your business goals and aspirations. It certainly doesn't mean that, through SEO, all of your dreams will come true overnight. Covering things such as budget and goals is essential for us to devise the appropriate strategy for you.
2. The SEO’s weren’t told what already worked (or didn’t work) for you…
Were you clear about what the best features of your online endeavours are so far? Did you talk about what proved successful, or things you tried and that were unsuccessful?
All conversions can be tracked, which shows any progress the SEOs have (or haven't) made. However, if you don't inform the SEOs of what already works or doesn't, you can't complain if mistakes are repeated.
3. You didn’t indicate the importance of having one main person oversee the account…
Any reputable SEO agency isn't just one person behind a desk and computer handling every enquiry; it is a team of people, varying in size, that helps manage your account. Much like a game of Chinese whispers, your goals, aims and dreams are likely to be somewhat diluted for anyone who didn't speak to you directly.
For example, when person 1 explained the information to person 2, who made brief notes and handed those to person 3, person 3 wasn't following your direct instructions and might not have fully understood the notes. However, you don't have to accept this: if you only feel comfortable with one person in particular handling your account, request that only that person has access to it. This way, any changes made by you won't come as a surprise to the SEO.
4. You didn’t understand the amount of work needed and so were surprised when costs were higher than expected…
Good Search Engine Optimization will get your site discovered in online search results. There is however, more to it than that. Many people in an SEO agency work to get your site to its optimum, and you need to be aware of just how much work goes into this.
This team will mainly be in charge of making sure that SEO is being carried out for all your online needs.
This team works alongside the SEOs to help get you publicity online.
Usability & Design:
This team will have the job of creating a smooth user experience for all users that come across your website.
This team will develop, build and ensure things work – such as buttons on your site, conversion tracking and more.
Providers of Content:
This team will ensure that good content is maintained, and optimised so that people can find it.
5. You didn’t maintain a good relationship with the agency…
Chances are, you started off all guns blazing, before slowly falling into a pattern of laziness, assuming the agency would take care of everything the way you wanted – meaning you wouldn’t have to worry about it.
Further, you were unavailable for meetings, you didn't specify what kinds of reports you wanted, and changes were (or were not) made that caused you disapproval. When (on your say-so) these changes were reversed, rankings and conversions fell, and this caused even more tension between you and the agency.
Remember, rankings and conversion rates can see-saw, and any changes made to your site can take time to show their positive effect. Try not to ignore advice about possible redesigns or new pages that should be added to your site. Other things to consider are using services to monitor your online reputation and testing improved versions of your website to get the best results.
April 15th, 2011.
PPC is a system of bidding on keywords, ideally low-cost, undiscovered but high-traffic ones, in an attempt to get your ads appearing as high up in Google's SERPs as possible.
PPC is often underestimated: users create campaigns which run okay. Maybe they break even, perhaps their site is getting more exposure and, if they are lucky, they might even get some conversions. One quick search and the internet overflows with hints, tips and tricks on how to create PPC campaigns that maximise your ROI, and everyone lives happily ever after...
Unfortunately this isn't the reality for everyone. Sometimes campaigns can take an awful turn for the worse, and instead of those fluffy guides that explain how to be a PPC mastermind, I often wonder whether the company owners and PPC newbies who suffer have done so because they read a different, slightly darker guide that misled them. This is how I imagine such a guide would read:
Spend wisely and try to set a reasonable budget that you will be able to pay.
Invest copious amounts of money into every campaign almost breaking the bank. It doesn’t matter if you have other bills to pay or budgets to keep to, now that you’ve read a little here and there, it’s guaranteed that this will pay off – the more money invested the better!
Avoid the main keywords for your brand; there is likely to be high competition for these, which will result in high CPC rates!
Try your very best to beat-out the competition by going head to head for the most competitive keywords for your brand. Be generic and avoid specific. For example, if you’re selling sportswear, bid on “shoe”, “trainer” and “clothing” so that when somebody searches for those terms, your ad will appear somewhere in the results as long as you followed that first rule about money!
Try to use long-tail keyword phrases that have lower search volumes but also lower CPC rates. Using a variety of broad and "phrase" match terms can help with this too.
Be extremely precise by using [exact match] for everything. Long keywords are for suckers, get to the point with one word terms, be honest who has the time to think up long-tail keywords anyway? Instead, spend the time you have saved and go shopping or catch up with an old friend!
Carry out keyword research so you can get an idea of the kinds of things people are searching for. This might also help you to think of alternate keyword variations that people might not have thought up, but will get the desired result.
Do everything as quickly as possible! You don’t have the time to hang around when people are selling the same product as you! Use your intuition and instinct, the first words that pop into your head when you think of your product are the ones you should go for. Get them in and bid ASAP!
Monitor your ads throughout the day, this will help you to discover what is getting clicks and impressions and what isn’t. If something isn’t working, change it.
Time is money. Once you have quickly set up one campaign leave it to simmer and create the next one. If you have followed this guide so far then everything should be a-okay!
Don't worry if you aren't getting a good enough ROI to begin with. Treat whatever results you have as a learning experience and improve what you need to. Use helpful features like the Opportunities tab, or the many reporting tools, to make a difference.
Money is everything. If you check and your campaigns aren’t doing well, you’re doomed and should probably give up. Shame on you!
Follow this guide and be a professional failure now!
April 13th, 2011.
Using non-standard characters in the page title and meta description tag seems to be a growing trend in many industries. The idea is that eye-catching non-standard characters draw readers' attention to a result first, even over results that may appear above it.
The practice of optimising search results to maximise click-through rate is not a new one and has been used to good effect in PPC advertising for years. But where PPC ads have to go through an approval process (in which many techniques are outlawed), meta descriptions and organic results do not, so the boundaries can be pushed much further.
Who Is Doing It?
How Do I Use Special Characters In My Title And Description?
Use of many characters seems to be by trial and error. John Campbell is a man with far more patience than I, and he has tested the indexing of many special characters.
Special characters can be created using HTML character references based on their Unicode code points, such as:
© is created with: &#169; (or &copy;)
® is created with: &#174; (or &reg;)
™ is created with: &#8482; (or &trade;)
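Any character can be turned into a numeric reference from its Unicode code point; a quick sketch in Python:

```python
def numeric_ref(ch: str) -> str:
    # HTML numeric character reference for any Unicode character.
    return f"&#{ord(ch)};"

for ch in "©®™":
    print(ch, "=", numeric_ref(ch))  # &#169; &#174; &#8482;
```

This is handy for checking what to paste into a title or meta description tag when a named entity doesn't exist for the symbol you want.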
A full list of Unicode characters can be found on Wikipedia.
What Are The Effects?
Currently there is largely anecdotal evidence for the benefits of an increase in click through rate. It would be difficult to test definitively as there are several other variables to factor-in.
Shaun Anderson at Hobo is well respected within the industry for running extensive tests on theories rather than relying on guesswork, and he is running one of the tests shown above:
It's incredibly hard to test the impact of this on SERPS in an accurate manner. I am currently running some tests on pages on my site. You need a page with stable rankings, and a stable flow of traffic to get exact results, and that's kind of difficult with the ever-fluctuation of Google SERPS and how changes to the UI (based on query or geo-location – for instance) impact your rankings and clicks on a daily basis – over time – in a natural way. Special characters in snippets certainly get noticed and commented upon, that's for sure. Once you rank, GETTING CLICKED is what it is all about – every little thing that might help should be tested for size. You can get away with a lot in terms of getting special characters in your snippet DESCRIPTION – but not so much in your TITLE link description (Google strips out some special characters from this element if you try it).
I was also lucky enough to hear from Craig Parker at Soula.com who has conducted some tests of his own.
In a short test I ran on a UK-based e-commerce site, I found implementing special characters in title tags had a small positive effect on click-through, but this was not statistically significant; after around a week it caused a small negative change in [Google] rankings.
Implementing special characters in the meta description was difficult to get indexed/displayed on the SERPs and provided a very minimal increase, again not statistically significant.
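To move such a test beyond anecdote, a click-through-rate change can be checked with a standard two-proportion z-test. The impression and click counts below are invented purely for illustration:

```python
import math

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """z-statistic for the difference between two click-through rates."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 10,000 impressions each, CTR 3.0% vs 3.4%.
z = two_proportion_z(300, 10000, 340, 10000)
print(round(z, 2))  # |z| below 1.96 means not significant at the 5% level
```

With these made-up numbers the uplift looks real but falls short of significance, which matches the kind of inconclusive result Craig describes.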
The Bigger Picture…
The largest problem with this technique is that the more people use it, the less effective it becomes. How long until our search results pages look like this and nobody derives any benefit from it?
The new and improved version
What are your thoughts on this?