April 25th, 2015.
Last year I published three(!) “biggest link building success” articles about 2013. Can I top that for 2014? Let me try. I didn’t write three “biggest” success articles last year to brag. In fact
I tend to overlook my successes while I dwell on my failures.
Even once I recognize a success, it’s hard for me to assess it. I succeeded more than once? Which success was the biggest? I can’t really say. Thus I cover more than one of them as possibly the biggest. Judge for yourself!
Success in numbers vs subjective success
The single most shared article of my decade of business blogging was published in 2014, with more than 4,000 shares by now. No, it wasn’t on a mainstream publication. There I usually got below 100 shares.
So I covered this already in my first post this year dealing with the potentially biggest success. That one was obvious, after all the share numbers are public. Then I thought about the success that made me feel good the most.
Getting my “decade of SEO mistakes” post shared thousands of times was a bit embarrassing, to say the least.
There was one kind of success I had that was not evident to outsiders, while for me personally it was very uplifting. Is subjective success less important than the kind measurable by public numbers? I think subjectively felt success is sometimes even better. You can’t flaunt it as much, but it helps you develop your strengths.
Is writing real work just like SEO?
For years I have written for client blogs about blogging, social media and search, often for very low rates. I was glad I could write for a living because I love writing, especially blogging, and at the beginning I didn’t consider it real work. Thus I was satisfied with almost any amount paid to me as long as I was able to do what I love.
Last year I finally raised my rates to what seems to be industry standard. I looked and asked around what other writers of similar quality and renown charge.
Yet writing for third party blogs, even for marketing publications, still doesn’t make you rich. Last year I still didn’t earn enough from blogging, so I needed to work additionally for actual SEO clients to finance my writing “hobby”.
In a way I had to subsidize my writing with my other proper client work.
Meanwhile I was a bit jealous of all the professional blogs that had custom illustrations for their posts. I knew that I couldn’t afford to hire an illustrator myself though. Instead I used free images with a Creative Commons license, as I had done for years.
Then one day a guy from Freepik.com emailed me with an offer. At first I thought it was just another outreach message. Outreach messages are perfectly fine, but in most cases they require work on my part while I get very little in return. Yet Alejandro from Freepik offered to work for me for free.
All he wanted in return was a credit for the image. I always credit my image sources with a link so that would be no exception for me. I didn’t even have to think about that offer. You have seen Alejandro’s images on my posts ever since. I only occasionally used Creative Commons imagery after that.
How Creative Commons images can backfire
Don’t get me wrong. Creative Commons images (I usually get them on Flickr by way of Compfight) are wonderful once you find a good one. Compfight helps you find free images, though it mainly sells stock photography. Personally I ignore most stock photography or imagery. It’s mostly bland and full of stereotypes. I’d rather feature real world images from photographers around the globe than ridiculous stock photography clichés.
There are also issues with Creative Commons images though. In the several years I have been using them, some of the owners have changed their licenses back to good old copyright. I have found several such images on my blog and had to replace or at least remove them.
Another issue is that some Creative Commons images have a “non-commercial” license. Different people interpret that license in different ways.
As I have no ads on my SEO 2.0 blog and I do not earn money from the blog directly, I thought it was not a commercial use, until one image owner actually wrote me an angry message. He did not like the metaphoric context I put his photo into and tried to wield the license as a weapon to make me remove it.
Apparently for some people commercial use is already given when you publish a business blog dealing with business topics. Others consider ad revenue sufficient to call it commercial use. So at the end of the day you basically can’t use Creative Commons images with a non-commercial license on your blog. You don’t have to resell the images or anything like that. It’s enough that you’re not a charity.
The intrinsic value of link building
Long story short: in 2014 I took another big step toward becoming a professional blogger. I was able to charge rates equal to my client work in SEO and beyond (as SEO still sells best despite all the rumors of its death). The reason why is also significant: blogging clients approach me now.
Until 2013 I had to apply to job listings like everybody else despite my “big name”, large audience and all the experience.
I got my work out there consistently, and through modern day relationship building, colleagues who have often known me for years approached me. When people seek you out, they are of course willing to pay more. You don’t compete with others that much anymore. You create your own league.
Now that so many people in the industry know me, it’s far easier not only to get recognized as a person but also to convey the value of my work. It’s not just the relationship building though. It’s good old literal link building as well. Alejandro of Freepik approached me because a recommendation and link from a post of mine have so much value that they don’t even charge me for their work.
I “only” write for 3 to 4 blogs at once, including my own, so it’s not only about building links in the traditional “domain popularity” way. Freepik instead got several links from the same site. Nonetheless they were willing to provide me with custom illustrations to get the eyeballs as well, not just the link juice.
So there still is intrinsic value in link building. It’s not only about relationships and all that hippie stuff I usually preach. It’s still also about getting direct visibility online. The link will be clicked by actual people, highly relevant audiences full of bloggers and Internet marketing professionals. They all need images.
I know this is not exactly the type of link building success story you may have been looking for, but it was important for me to tell it. It’s not just about getting hundreds of links using the latest tool or technique. Sometimes it’s a cooperation like this that makes the success big in a subjective way.
* The “biggest success” illustration has been made by my supporters from Freepik.com
April 8th, 2015.
There are numerous ways by now that allow the competition to hurt your site in the results of the market-dominating search engine.
It’s really pitiful but I have to tell the world about it, especially as the search giant uses this situation to discredit the whole discipline of SEO as “negative”.
What is SEO? No idea? You’re not alone!
“77% of respondents could not identify what SEO means.”
Of those who know that it stands for Search Engine Optimization, the majority rather assumes that it’s about SPAM or at least the “manipulation” of search engines. It comes as no surprise to these people that SEO is negative. It has never had a positive connotation for the majority in the first place.
SEO experts calling SEO negative
Then there are the experts who read this blog and not only know what SEO is, but also practice it themselves without resorting to black magic. Even these specialists tend to repeat hearsay from others who tell them that links are unnatural or that SEO is negative.
Yes, most SEO practitioners even spread the word about how negative SEO is and try to prove that you can harm other websites in search results not only by links but also by other means. People oblivious to the topic who only scan such articles will only know one thing after consuming them: SEO is potentially dangerous.
Of course the skilled professionals refer to Google sabotage, formerly known mostly as Google bowling by old school SEO practitioners.
There is no such thing as negative SEO like there is no hot ice or dry water. True, you can sabotage competing sites in Google in manifold ways but false, you can’t negatively optimize for search engines.
Either you optimize and improve or you don’t. You can’t improve negatively. So why are SEO experts using such a paradox (in linguistics it’s called an oxymoron)? Well, most of us also say “unnatural links” as if natural links would grow on trees organically.
“Negative SEO” had its 15 minutes of fame in 2007
True, the term “negative SEO” has existed at least since 2007, when Forbes wrote about it in an article called, understandably, The Saboteurs of Search. So unlike the other paradoxical term used to demean SEO and commonly used by Google, “unnatural links”, this term seems to be an invention of some self-proclaimed SEOs.
Other than in that article, I have never heard of them despite a decade of reading about SEO. Remember that our industry has no rules on using the term. Everybody can say s/he’s an SEO and nobody can prevent them from doing so.
Barry Schwartz confirmed the existence of the term a day later on Search Engine Land. He was still using quotation marks to distance himself from the term “negative SEO” though.
You won’t believe what happened next! Well, what happened? Almost nothing. Most people forgot about it. Google bowling has been mentioned ever since here and there but nobody really cared. Why? It was marginal at best. Also it was much easier to truly optimize sites and build links instead of trying to hurt your competition on Google.
So how can you sabotage your competition?
I won’t explain in detail (for obvious reasons) how you can harm competing websites on Google, but even the 2007 Forbes article already lists 7 techniques. I didn’t even know some of the terms they used, but I know, and can imagine, even more tactics to hurt your competitors.
Google bowling, or pointing SPAM links at your competition’s website, is the best known and most obvious one. It has boomed ever since Google started sending out warnings of “manual action” because of unnatural or artificial links by way of Google Webmaster Tools.
You don’t need a PhD to engage in such practices. Just reply to one of those numerous spam mails trying to sell you dozens, hundreds or thousands of links for a measly sum of a few dollars. It’s certainly cheaper than optimizing your site the legit way.
Outing – Forbes introduced this widely used negative SEO technique as tattling, but by now it’s commonly referred to as outing. You simply have to catch your competition in the act of breaching the Google Webmaster Guidelines and report them in one way or another.
You can snitch directly to Google or you simply tell the press so that they raise enough hell that Google has to act.
It works well, sometimes even in the case of larger brands. Simply reporting paid links that lead to your competition might take a while or amount to nothing though. Of course you can combine buying links for Google bowling with outing the competition afterwards.
Google insulation – I’m not even sure that term has caught on, but it’s about search result saturation. For example, back in the days when I ranked #1 in Google.de for the German phrase for “search engine optimizer” and got covered by the largest economic weekly, other SEO practitioners got pissed off and started saturating the results with blog posts aiming to outrank me.
As it was just a niche and not that competitive before it worked after a while, especially as the huge number of leads from that magazine largely prevented me from optimizing my site further. I also used this technique for a good cause once, to outrank spammers collectively with other bloggers.
This is not even clearly sabotage; it depends on the context and intent.
In essence you just optimize third party sites to outrank your competition, which is legitimate. This technique is often used in online reputation management campaigns aimed at subduing bad press about a brand or person.
Copyright Takedown Notices – You can make whole sites disappear wielding the Copyright axe. Major sites like WordPress.com, Tumblr or Blogger (owned by Google) won’t ask many questions but instead simply delete your whole blog because of one or two copyrighted images.
You can even buy rights for an image afterwards and claim it’s yours.
Even if the site comes back up later, a few days offline are enough to hit you severely in Google and make you seem unreliable for the foreseeable future, which results in a downranking of a few spots. This might be enough for your competition to outrank you.
Copied content – Website scraping and republishing, or manually creating duplicates by copying content, can lead to so-called duplicate content issues on Google. The search engine still struggles in many cases to credit the original publisher as the source.
Sometimes sites copying your content outrank you in Google as if they were the original and you are the copy.
Google doesn’t like to see the same content more than once in its search results so that copied content may quickly damage your site’s rankings. Even the BBC got a page specific penalty because of content scraped from their site by third parties.
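Detecting that your content has been scraped is, at its core, a near-duplicate detection problem. A minimal sketch of the word-shingling idea often used for this, where the 5-word shingle size and 0.5 threshold are arbitrary assumptions of mine, not anything Google publishes:

```python
# Rough sketch: estimate how much of one page's text appears verbatim in
# another, using overlapping word n-grams ("shingles") and Jaccard similarity.

def shingles(text: str, size: int = 5) -> set:
    """Return the set of overlapping word n-grams in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def overlap(original: str, suspect: str, size: int = 5) -> float:
    """Jaccard similarity between the two pages' shingle sets."""
    a, b = shingles(original, size), shingles(suspect, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

article = "the quick brown fox jumps over the lazy dog near the river bank"
scraped = "the quick brown fox jumps over the lazy dog near the river bank today"
unrelated = "completely different words about gardening tips and tomato plants here"

print(overlap(article, scraped) > 0.5)    # near-duplicate
print(overlap(article, unrelated) > 0.5)  # unrelated
```

Running this against the plain text of suspect pages gives a quick signal of which copies are close enough to be worth a takedown or canonicalization request.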
Denial of Service – A so-called DoS (Denial of Service) or DDoS (Distributed Denial of Service) attack, that is someone bombarding a site with requests from numerous computers, may result in slowing down your site or temporarily, even permanently, blocking access to it.
The longer your site loads slowly or stays down, the more risky and unreliable you look to Google. By now Google can react quickly and show alternative sites even based on temporarily high loading times. After all, Google does not want to make people wait.
Hacking – Website hacking, in the sense of wrecking your site with malicious intent, as in infecting sites with malware, may lead to getting blocked by Google. Sometimes Google will downright warn users against entering your site, saying “this site may be hacked” right in the search results. When your site is down or infected for a longer time, it might disappear from the radar altogether. Just like DoS attacks, hacking effectively damages your algorithmic reputation.
Spamming – Google does not value sites that seem abandoned because they have a lot of SPAM comments or forum entries, for example. So you can harm a site by literally spamming it using comment spam bots.
You can also invite SPAM comments simply by covering a topic that is often associated with SPAM. So if you mention gambling, generic pharmaceuticals or NSFW topics, you can bet that spammers will find you automatically and insert their comments.
The Fire this Time
That’s it for now. There are more techniques to sabotage sites in Google. I won’t mention all of them. I already feel bad about telling you so many. Most of them have been in the original Forbes article from 2007 so I didn’t start the fire.
There needs to be a public awareness of the current state of affairs on Google.
It’s not fun, and Google could fix it with ease. Instead of labelling SEO as negative, you rather need to call out Google for it. Why do they penalize websites for third party actions those sites often have no control over? Why penalize the victim of sabotage?
* Creative Commons image by Les Chatfield
Changes made by governing bodies and organisers to complex structures almost unavoidably have an impact on uninvolved bystanders; such is the nature of any system.
Modern economists make frequent use of the law of unintended consequences to explain how decisions at a governmental level have significant ripple effects further down the food chain.
Examples of the law in practice include wind farms that actually harm the environment by killing birds, laws promoting green vehicles which, with the help of enterprising salesmen, resulted in free golf carts for businessmen, and the Australian law making cycle helmets mandatory, which actually increased the risk of death and serious injury to cyclists.
So what does all of this have to do with Google?
In the search engine world, there’s little doubt about who makes the rules; and with its recent Penguin update, Google has left some innocent websites suffering in its campaign for good SEO practices.
Some Early Examples
Now of course there is some history here. There are many early examples of unintended consequences arising from decisions and courses of action that Google has taken:
- Using Pagerank to dictate that links had value resulted in the link economy, blog networks, comment and forum spam, and a proliferation of low quality web directories.
- Google AdSense for publishers led to an explosion in content scraping, copyright theft and MFA (Made For Adsense) sites.
- The introduction of rel=nofollow led to Pagerank sculpting and siloing.
So Where Does The Penguin Update Fit In?
The Google Penguin update was introduced in April 2012 as a means of identifying and demoting websites that had previously benefitted from aggressive SEO techniques.
According to Matt Cutts (Head of Google’s WebSpam team), the update targeted ‘all those people who have sort of been doing, for lack of a better word, “over optimization” or “overly” doing their SEO, compared to the people who are just making great content and trying to make a fantastic site.’
To put it bluntly, it was designed to demote websites that appeared to be benefitting from undeserved backlinks.
The principles behind Penguin meant that it was welcomed by most web users. It would ensure that websites that engaged in link-spamming and other underhanded black-hat techniques would drop down the rankings. Google estimated that the first update would only have an impact on 3.1% of English search queries and 3% of searches made in German, Chinese, and Arabic.
The Penguin Update was largely successful, resulting in the demotion of hundreds of thousands of websites that had been ranking unfairly. Unfortunately it also affected some sites that hadn’t knowingly engaged in shady link-building practices.
For example, the specialist WordPress site WPMU.org was crushed by the update, dropping from 8,580 daily visits to a paltry 1,527 after it was introduced.
Despite the site’s owner, John Farmer, claiming that there had been no keyword stuffing, no link schemes and no quality problems, Matt Cutts came forward with the claim that the site had been penalised due to a few bad links pointing to it.
It was largely felt that, due to the nature of the site (a WordPress resource), there were bound to be links to it indicating authorship and design of blogs, often of a lower quality: links that were keyword heavy and placed in footers, blogrolls and often sitewide.
Even so, the damage had been done and it was left for WPMU to rebuild their rankings.
One major unintended consequence of the Penguin update was the ‘bad-by-association’ effect on some sites. If one site was penalised, sites associated with it were shown to be affected negatively. In an online discussion Michael Martinez of SEO Theory said, ‘what they are seeing is a Cascade Effect where the websites that link to them suddenly lose value. So the real problem lies 1 or 2 tiers back. These are not false positives, although they are collateral damage.’ However we define this problem, it is clear that some websites have experienced a drop in traffic and money-earning potential through no fault of their own.
The Target Changes…
Some leading SEO experts have revealed that some more aggressive (and less ethical) SEOs have posed as their rivals and petitioned sites with fake requests for the removal of perfectly good links. Such tactics have been adopted as a means of reducing competitors’ website rankings. Thankfully, as yet this problem doesn’t seem to be widespread.
When fake emails aren’t enough, there have also been some reports of unscrupulous webmasters building spammy links to rival websites hoping to see them penalised as a result.
There have even been examples of blackmail, with websites threatened with black-hat SEO and the resulting penalties.
Members of the specialist SEO Forum Traffic Planet revealed how devastating this tactic could be by test-targeting two websites with ScrapeBox blasts. This involved the creation of thousands of anchor-text based backlinks and resulted in a substantial ranking drop for the sites targeted.
The Traffic Planet case study was just one way of outlining the effects of a wide reaching problem. Danny Sullivan, the Editor in Chief of Search Engine Land, pointed out that, ‘As for not accepting there’s no negative SEO, I’ve repeatedly said that it is possible … perhaps it [is] more viable now because it’s cheaper now. That’s exactly the opposite of refusing to accept that links could be cheaply and trivially pointed at any site. What remains unclear is how serious a threat it is to the vast majority of sites out there.’
The cautionary message here appears to be: as much as Google’s addressing of black-hat SEO may make for a quality content-driven user-experience, it’s by no means flawless. When in doubt, leave it out and play it safe.
September 3rd, 2014.
We’re always telling our clients that following Google’s best practice is the best strategy to ensure longer-term success in the search results.
So it’s no surprise that every small announcement of changes to their algorithm now gets picked up quickly and generates a rush to ‘comply’, regardless of the detail.
In our opinion site speed is really important for the user. Whilst Google will also look at this metric, it is way down its list of important factors affecting page/site rankings. Matt Cutts himself stated that these changes would affect less than 1% of all queries:
You’ll notice that the current implementation mentions that fewer than 1% of search queries will change as a result of incorporating site speed into our ranking. That means that even fewer search results are affected, since the average search query is returning 10 or so search results on each page. So please don’t worry that the effect of this change will be huge. In fact, I believe the official blog post mentioned that “We launched this change a few weeks back after rigorous testing.” The fact that not too many people noticed the change is another reason not to stress out disproportionately over this change.
Site owners seem happy to panic about site speed and security before addressing more fundamental (and infinitely more important) aspects such as page mark-up, site structure and hierarchy, and on-site copy.
It goes without saying that a fast site will improve usability and help conversions, and this is as good a reason or better for addressing it than a Google announcement.
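If you want a first rough number for your own site, server response time is easy to measure with the Python standard library alone. This sketch times a request against a throwaway local server so it runs anywhere; point the same timing code at your real URL for a real-world figure (note this measures only the HTML fetch, not the full page rendering Google's tools report):

```python
# Minimal sketch: time a single GET request. The local server below is
# just a self-contained stand-in for your actual site.
import http.server
import threading
import time
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>hello</body></html>")

    def log_message(self, *args):
        pass  # keep the example's output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

start = time.perf_counter()
body = urllib.request.urlopen(url).read()
elapsed = time.perf_counter() - start
print(f"fetched {len(body)} bytes in {elapsed * 1000:.1f} ms")
server.shutdown()
```

Averaging several such requests at different times of day gives a more honest picture than a single measurement.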
Recently Google announced that they will treat sites served over HTTPS better than sites that aren’t using a secure certificate.
As with site speed, this is way down on Google’s list of priorities in the ranking algorithm, and we do not consider it a particular bandwagon to jump on with the aim of improving search results. Google themselves admit that
“For now it’s only a very lightweight signal—affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content—while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.”
Google certainly has a history of implementing changes which website owners feel obliged to comply with, only to backtrack on them later.
So our advice is to implement HTTPS on your site only if you feel it is required for reasons other than search (e.g. customer confidence). We have outlined the possible pros and cons for you below.

Pros:

- Your site is protected by an additional SSL/TLS layer
- A minimal algorithmic gain in rankings, though implementation is unlikely to result in any visible ranking changes
- An increase in trust from site visitors, especially important on sites handling more sensitive data or transactions

Cons:

- The initial connection to your site may be slower, especially if you have to implement 301 redirects from non-HTTPS pages
- The cost of an annual SSL certificate (if you don’t already have one)
- In some cases you may lose referral data
- You’ll need to ensure all HTTP pages are correctly 301 redirected to the new HTTPS versions
- You’ll need to update your internal link structure unless you’re using relative URLs for internal links
- You may find that many products don’t support the use of HTTPS
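The single most important mechanical step in such a migration is the blanket 301 redirect from every HTTP URL to its HTTPS equivalent. A minimal sketch of verifying it, using a throwaway local server so the example is self-contained (example.com stands in for a real hostname; swap in your own host to check a live site):

```python
# Sketch: check that a plain-HTTP request answers with a single 301
# pointing at the HTTPS version of the same path.
import http.client
import http.server
import threading

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # one permanent redirect per URL, preserving the path
        self.send_response(301)
        self.send_header("Location", "https://example.com" + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# http.client does not follow redirects, so we see the raw 301 itself
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/about/")
response = conn.getresponse()
print(response.status, response.getheader("Location"))
# prints: 301 https://example.com/about/
conn.close()
server.shutdown()
```

Looping a check like this over your sitemap URLs catches pages that redirect with a 302, chain through several hops, or don’t redirect at all.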
June 5th, 2014.
Are you currently designing a new website for your business? What are you looking to achieve? Are you confused with how to balance an amazing user experience with higher organic traffic?
Questions like the above are asked all the time and if you’ve been following design trends over the past couple of years, then you should be no stranger to Parallax scrolling.
Let’s be upfront: parallax design is beautiful. It gives voice to brands and delivers an experience that wows most users. However it would seem that SEO and parallax are still no closer to a perfect romance, and website owners often sacrifice one for the other.
My job is to help get SEO and Parallax into the same bed with a mission to start a new breed of super sexy search savvy slick sites. (I’m a sucker for sibilance).
The parallax design list of compatibility problems
Design – In many cases a designer using parallax will opt for a single page design, which makes it difficult to optimise a site for a wide variety of search terms. Having a good spread of topical pages not only gives users access to deeper content and knowledge but also helps increase your overall search visibility. Given that, you should plan out your sitemap to allow for supporting pages with deeper content around most of your keywords.
Analytics – Another challenge is gaining valuable and accurate analytics for your website. How accurately can you measure engagement on a single page website? One problem is that parallax pages take a long time to load, so users can often get frustrated and click elsewhere.
The second problem is identifying the stronger pieces of content on the page as the “time spent on page” metric will become more ambiguous. The only workaround I can see from this problem would be to use heat mapping software which might highlight the pieces of content that users were most drawn to.
Finally what to do about setting conversion goals? You would require another page if you want to measure users who were interested in a part of the content but didn’t follow through on their intent.
Page Speed – Another sacrifice of having a beautiful parallax design is the heavy load on the home page. This is generally caused by having very high quality images and videos all located on one file. Although some of these issues can be resolved using faster servers, tidy code and CDNs. Google still dislikes slow page speeds as this represents a poor user experience in their eyes. Use Google’s page speed insights to check your site doesn’t offend.
Mobile – There still isn’t a perfect solution for using parallax scrolling on mobile devices. Webmasters will have to create separate versions of the site specifically for mobile. A popular example of this is Google’s “How Search Works”.
How to make the two compatible
Yes it’s possible. The two can co-exist in a way that pleases the eyes of both users and Google.
1) A “One page” design with parallax scrolling using jQuery.
This was first written about by Kevin Ellen of iProspect. The solution uses the HTML5 History API’s “pushState” function (typically wired up with jQuery). This allows a parallax scrolling page to be cut into various sections which can be identified by Google in the SERPs. The great thing about using this function is that each section gets its own unique URL and meta data, so one single page can be indexed multiple times for different content. This matters because parallax designs implemented without specialist SEO advice sometimes result in a severe lack of indexed pages and hence poor search visibility.
– pushState is a great function for an existing one page parallax scrolling website that needs to be optimised for search.
– The function is perfect for smaller businesses or a branded mini site, which are perhaps more interested in UX than SEO. This solution offers the best of both worlds on a very small scale but could never work on a bigger scale, such as an ecommerce site.
– It is a bit of an analytics fail. Scrolling through each of the sections will send signals to your analytics package that users are bouncing or exiting content very quickly. Again, this might not be a huge issue for small businesses or mini sites that use the function for branding or simply referring traffic.
2) Adopt an SEO architecture allowing multipage parallax scrolling.
Put simply, you start with SEO page architecture pyramid and then place the parallax scrolling design effects on each URL. This is perhaps the perfect compromise between UX and SEO, neither outweighed by the other.
– Beneficial for analytical software because each URL has its own content. Tracking can be placed across all the pages making setting up conversion goals and understanding user behaviour much easier.
– It doesn’t exactly follow the parallax scrolling trend and is therefore not as effective at telling a brand’s story.
– The design might seem attractive; however, having more pages also means more maintenance is required, which could lead to higher costs from your design team.
3) Parallax scrolling on homepage and regular SEO architecture.
This technique places parallax scrolling on the homepage and then includes other URLs that are SEO-friendly but do not have parallax scrolling. This is the method adopted by brands such as Spotify. It allows them to have an attractive home page which helps communicate the brand’s voice, while the other non-parallax pages let users dig deeper into the website’s content if they’d like to find out more about a particular service.
Another idea would be to add a blog to the site. The addition of a blog to your parallax site can add tremendous value when trying to attract visitors. It’s also another way to showcase your industry knowledge and authority.
– Keeps the website light and flexible, making it easier to maintain and design whilst being more affordable than the previous two options.
– Creativity is kept in a box, and the website as a whole fails to be a super interactive UX experience.
Can Parallax actually help your SEO?
Can a parallax design actually help your SEO? The answer is ‘yes it can’, but the best way forward would be to consult your search agency before you think of going gung ho on a funky new design. Seeing a mesmerising new design for your business can be exciting but don’t let it blind your judgement. A short consultation with your search agency should ensure that you can benefit from beautiful design without risking a huge drop in traffic because you dropped most of your web pages.
As SEO has started to become increasingly integrated into the normal marketing process and more specifically content marketing, using a parallax design is the perfect excuse to delve deep into the world of producing great content. Giving users an intuitive brand experience whilst delivering a compelling story can help attract more backlinks, repeat users and referrals. “No other recent web design technology has done more to impact the way we tell stories online than parallax”.
And if you are connecting more with your audience, then your content is likely being shared more widely, thus increasing your visibility in social media whilst hopefully grabbing the attention of industry influencers. All of which means you can create content for conversion.
March 20th, 2014.
Again, Google has got the backs up of companies investing heavily in its services, though this time it’s not through an algorithm update or a change in the webmaster guidelines. Rather, it’s Google’s comparison feature that has sandbagged the major comparison shopping firms.
If you’re involved in travel, finance or insurance, you need to be aware that Google is interested in controlling these verticals within its own search engine as much as possible. The opportunity for profit is huge, as are the tonnes of valuable data that will be collected.
The Google comparison feature was released soon after Google acquired the comparison site “Beat That Quote” back in 2011. The feature meant that Google would appear for generic, competitive industry-related keywords such as car insurance or mortgages. This is still the case today.
This move was understandable: Google’s desire to keep growing and monopolising the internet means that creations such as this are going to become more and more common. At the end of the day, Google has reached mass-market penetration in the UK, and the only way to please the shareholders is to diversify into other lucrative industries.
Brand Bidding is Bad, Unless You’re Google
However, it looks like there’s one rule for everyone except Google, who haven’t been following their own rules again. Their position as overlords of the internet has, it seems, entitled them to take advantage of the very companies that are paying them remarkable sums for Google AdWords advertising and other services.
Scratching your head?
Google’s comparison engine has gone a step further than simply appearing for the generic big industry keywords.
A branded search for any one of the top comparison website rivals will return this:
They’ve effectively done MoneySuperMarket’s job for them, how thoughtful…
MoneySuperMarket probably pays incredible sums of money to raise awareness of its brand name, supported by extensive above-the-line media advertising. Yet those efforts are being sabotaged by Google’s “Sponsored” comparison engine, which essentially hijacks users away from the MoneySuperMarket website. At the same time it tries to push users towards Google’s own engine instead, which features a list of rival insurance companies.
In a nutshell Google’s comparison engine seems to be a glorified affiliate site.
You thought Google only favoured the big brands…
So what do the big comparison sites do; how would you react? It would appear that they just have to accept it. Thanks Mike… groundbreaking revelation there.
This isn’t the first time, nor will it be the last, that Google has tried to force users onto its own platform over a potential rival’s. Sound familiar? Google is being hypocritical about its own guidelines and company mission statement.
We’ve all heard that providing a good user experience and unique, authoritative content are what Google rewards the most, which makes perfect sense. So why, when companies such as MoneySuperMarket provide awesome content, such as this, are they being pushed further down the SERP real estate?
Kevin Gibbons recently wrote a great piece on how to beat Google in a vertical search, making the point that relying on Google is always a risky game: it’s your biggest competitor, and it has your mindshare whenever you want to find something or buy a product.
Kevin goes on to give great examples of how MoneySuperMarket are beating Google hands down by ultimately using their marketing as an acquisition channel that rewards users for coming back. They’re running newsletters, social media, a blog, apps, SEO and remarketing to such an effect that a Google search is becoming more and more irrelevant.
And if all that doesn’t work, well, at least they have Snoop Dogg.
February 10th, 2014.
Are you still waiting to implement a winning strategy in 2014? Is the doom and gloom hokum surrounding Google updates preventing you from making the right decisions?
Let me take you back to the past and into the shoes of a university student who chose to follow the straight arrow path of Marketing.
Amongst the countless acronyms and matrix tables that flooded lecture handouts is the classic “SMART” formula. The formula exists to guide you towards defining better objectives.
Specific – Define what it is that you want to achieve. Answer those 5 W’s! Who, what, when, why and where.
Measurable – Quantify your objectives, how are you going to back up your results?
Achievable – We all like to overreach at times. When setting objectives make sure that they’re likely to be achieved by your team.
Relevant – Make sure the objectives are relevant to the business and in line with the overall marketing plan.
Time based – Set a date for the objectives to be complete (tricky in SEO).
A smarter SEO would also add a couple more letters; more on those later.
So I suppose you want a SMART example in SEO?
A fictitious Mexican food restaurant business based in the UK … “Guapo – Mexican”
“We want to target food lovers from the UK who enjoy tasting exciting Mexican dishes (specific) to raise awareness of our restaurant (achievable and relevant). We will aim to bring over 30,000 visits to our site (measurable) within 8 months (timely)”.
So, how will the SMART acronym apply to your 2014 strategy?
First you must begin to understand how search will change in 2014. So let’s take a look at the predictions.
2013 saw Google unleash countless updates. If you weren’t scared at any one point, then you’re a liar! We saw more frequent Penguin and Panda updates, Hummingbird, and the (not provided) debacle finally hitting its peak. Enough to send the SEO world into complete disarray…
It’s safe to say that the SERPS changed in a big way last year. We saw steps to include more localised results as well as better integration of the knowledge graph.
A basic search for “Mexican food” returns a mixture of local, knowledge-based and contextually relevant results.
To get the most out of your SMARTER objectives for 2014, I’d suggest doing the following:
Make the most of local
1) Get listed on Google local places, claim your profile and add all the bells and whistles (360 Photos and Videos) to make sure that your profile stands out amongst your competitors.
2) Encourage sentiment and reward customers who review the restaurant on Google, TripAdvisor, Yelp, Toptable and other platforms.
3) A mobile version of your site is a must. The majority of mobile searches are for local services, take advantage of this by making your menu and deals accessible and shareable on mobile devices.
4) Go social… Nothing new there, but certainly a necessary step to taking up more first page real estate. In the case of the above example, the love of food is universal. Therefore a restaurant is blessed with the amount of social media tools available at its disposal. Facebook, Twitter, Google +, Instagram and Pinterest can also be used to great effect. You could also top this off by adding a blog to your site. Adding a blog is an easy way to increase the amount of pages, helping you rank for a wider set of keywords.
5) Add separate pages for multiple locations. This helps Google deliver the best result to the searcher; it’s probably a good call to also add your contact details to those pages.
Move away from one type of Analytics
Rand Fishkin predicted in Moz’s 2013 predictions that marketers would need to stop relying on Google Analytics as the sole platform for web marketing. He was right to some degree: other platforms such as Mixpanel, Piwik, Omniture and HubSpot did grow significantly last year.
Google’s (not provided) alienated many web marketers who had put all their eggs in one basket. Being able to measure and report became tough, but that wasn’t the only issue: identifying opportunities for growth also became difficult. The market is becoming more competitive, and the margin for error is less forgiving.
Heavier correlation of G+ in search results
A Moz report in 2013 found a high correlation between search rankings and the number of Google +1s a page had. Cyrus Shepard reported the findings as surprising, although “correlation doesn’t necessarily mean causation”. The post created some controversy, which sparked Matt Cutts to respond to the debate via Hacker News to pour cold water on the findings.
Make the most of Google+ by building relationships with your audience and like-minded businesses in your niche; identify the industry influencers and connect with them. Take advantage of rel=”publisher” and connect your website to your Google+ brand page.
Incorporating Google+ tactics in your strategy will be more important than in previous years. The research done by Moz and Searchmetrics indicates the social network’s significance and correlation with higher rankings. You will also benefit from increased click-through rate, relevant and influential communities, and growing your brand’s authority.
Content Marketing continues to grow but who’s taking over the reins?
Content marketing has been the buzzword for the last couple of years now, and it’s taken some time for many businesses to adjust. But to give you an idea of how far it’s come, it is said that up to 92% of marketers are now practicing some form of content marketing. But do marketers really know best?
The Content Marketing Institute estimates that marketers will have to up their game if they want to remain relevant. Perhaps not surprisingly, it’s journalists who are leading the way, with great backgrounds in writing, storytelling and information design. They know how to orchestrate content that makes you care, is different to the competition, is new and surprises people. Beyond their storytelling abilities are tech and research skills and the ability to meet strict deadlines.
Marketers who can learn to think like journalists in 2014 will reap the benefits of good content marketing.
Don’t rely on any one tactic
This is nothing new… perhaps it could just be received as conventional wisdom. Relying on any one SEO tactic will result in one of two things:
You’ll get burned
Your gains will be short term and eventually … You’ll get burned
Ace job there…
Unfortunately, there will always be digital marketers who want to get the best returns with as little investment as possible. Herein lies the problem. The result of this cavalier attitude is low-quality content that is happy to be placed on any site that will accept it.
Those of you who pay attention to the latest SEO news might have read about Google unleashing a fire demon on guest blogging this year. My advice to you is: just up your game and you should be fine. Guest posting isn’t dead; Google just raised the quality bar. Matt Cutts has recently blogged about guest posting and its effectiveness as an SEO tactic. He says, “there are many good reasons to do some guest blogging (exposure, branding, increased reach, community, etc.) Those reasons existed way before Google and they’ll continue into the future.”
The bottom line is, if you love your brand … why risk its demise? Make sure that you comply with the search engine guidelines and stay up to date with best practices. Try to focus on contributing thought leading articles and information that give you exposure, branding and increased reach. When trying to find a blog to post on, ask yourself … Would you be proud to see your brand exposed here? Does this blog capture my audience? Are the blog’s users engaged in its content?
How to be SMARTER in 2014
Specific – Are you taking local and mobile into account? … Your audience probably is.
Measurable – Google analytics is great, but to stay competitive you’re going to need more data.
Achievable – Can your team do the job? Maybe it’s time to look at hiring journalists for your content marketing needs. Be aware of how the SERPs have changed this past year: the mix of real estate on the first page of Google keeps diversifying, with only 7 positions for some phrase types and 10 for others. Local listings, the knowledge graph and semantic markup such as reviews and ratings also mean that there is so much more to play for.
Relevant – Are your tactics still relevant to your business plan? Do local SEO, social media’s integration in search and an improved level of guest posting apply to your overall strategy and brand message?
Time Based – Setting a period in which to see results will always be tricky. However, you can set time periods for work to be completed. Reflect on the content marketing strategy: more and more journalist-style marketers are going into content marketing, not just because they know how to tell a story but because they also know how to meet challenging deadlines.
Ethical – Make sure you’re meeting Google’s guidelines. Relying on any one tactic will get you burned; you have to remember that your brand is at stake.
Recorded – Record the processes that you’re implementing throughout the strategy. Are the tactics working? Are they future proof? Are they following the plan?
February 6th, 2014.
We’ve been making infographics as a linkbuilding method for our clients.
If you don’t know why, see here.
Last week we launched a new piece for our friends at Love Reading. We’d researched the crimes committed by the most popular children’s book villains and worked out the sentences they would have received in a European court.
You can take a look at the piece here.
Long story short, the infographic came to the attention of The Times and they ran the research on page 3 of the Saturday edition. They mentioned the client’s site (and provided a link in the digital edition).
A testament to the power of infographics.
If you want to talk about an infographic for your brand, give us a call.
January 23rd, 2014.
I ran an experiment last year. I had a website with no blog. It had lots of pages on a niche topic, but very few readers. I installed a blog and began posting once per month. In a year, the traffic doubled – (I’ll admit it increased from ‘barely perceptible’ to ‘quite unremarkable’, but you can’t argue with the numbers).
The massive spike around April 2013 was from some experimenting with paid discovery. The second, smaller spike was a particularly controversial blog post.
I think this settles the argument once and for all: A regular content schedule is a sure-fire way to get traffic.
I know what you’re wondering – ‘How does this affect me, the business owner?’
Well, business owner, I’ll tell you.
It means that you should be publishing regular content on your site if you want people to be visiting it. But as a business owner (or marketing manager) you’ll be plenty busy enough with all sorts of other concerns – do you have time for creating a content marketing strategy too?
You need to be producing content – that’s a fact. It’s a thing you can’t deny. I create content for 30 clients – I use the ‘DEAL’ system, from Tim Ferriss (author of The 4-Hour Workweek):
Define, Eliminate, Automate, Liberate.
Define the sort of content you need. I daresay you won’t go far wrong with one blog post per week and one infographic per month.
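If it helps to see the cadence above laid out, here is a minimal sketch of that schedule as code. This is purely illustrative: the start date is arbitrary and the "infographic every four weeks" approximation of one per month is an assumption.

```python
# Sketch of the "one blog post per week, one infographic per month"
# cadence. Dates are made up; a four-week cycle approximates a month.
from datetime import date, timedelta

def content_calendar(start, weeks):
    """Return (date, item) pairs: a post every week, an infographic every 4th week."""
    items = []
    for w in range(weeks):
        day = start + timedelta(weeks=w)
        items.append((day, "blog post"))
        if w % 4 == 0:  # roughly monthly
            items.append((day, "infographic"))
    return items

for day, item in content_calendar(date(2014, 1, 6), 8):
    print(day.isoformat(), item)
```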
You’ll also need to consider sharing and seeding the content as it’s produced. This can be done via the regular social networking channels, but also on targeted interest sites via email outreach.
All of these things take time – hours and hours of time. But only if you do them all yourself…
Remove any unnecessary steps in the programme. Don’t waste your time getting bogged down with trying to design things yourself or write blog posts yourself – there are plenty of people in the world who will do it for you in exchange for money. They are called freelancers and are readily available online.
Think about what you really need to do for the job to work. In fact, I’ll do it for you – you need to come up with content ideas and you need to check it, then post it. The rest can be done for you.
Automation is achieved by setting up a system that handles the tasks for you. In essence, you feed the machine with briefs and it comes back with content. Online freelancing services exist purely to make your life easier, and they’re really great.
My favourite freelancing sites include:
O-Desk is very useful for finding people to do basic tasks – data analysis, basic research, number crunching etc. I use O-Desk for jobs that are too time consuming to handle myself. For example, if I was trying to make an infographic about football transfers (which I am), I’d post the job on O-Desk and find someone more capable and efficient than me to handle the research and analysis while I concentrate on planning the next infographic.
O-Desk also allows you to create teams of people to handle larger ongoing projects. It’s efficient and easy to manage and provides a screentracker so you can make sure your freelancers are staying on task.
Do note, however – O-Desk has a very high number of have-a-go heroes. They aren’t necessarily qualified in a given field, so although they are competent, you can’t expect them to handle more demanding tasks. For basic stuff though, it’s ideal.
Textbroker’s site is fairly basic in functionality, but it focusses solely on copywriting so it’s far more targeted. Prices vary based on the writer’s rating (out of 5). I’ve found some really fantastic writers on Textbroker, but also some absolute stinkers. Usually I have to edit a few things as it’s easier than sending it back for amendments, but it saves a lot of time.
Good copywriters also tend to be good researchers. They’re generally more able to follow a complex brief than their counterparts on O-Desk, so you can offer them more in-depth projects to research.
People Per Hour – covers pretty much every digital-based job, but I use it for designers
PPH is more useful to me than some of my own body parts. I can post a job at 9 in the morning, receive proposals and have the job in the bag before I go to bed that night. People Per Hour has the benefit of knowing where your freelancer is located, so you can target areas that are likely to have more qualified personnel.
For instance, when searching for a designer, Europe has more reliable design schools than other parts of the world, and by choosing someone in Britain I can guarantee we share a timezone and language and operate on the same working hours. It makes the tasks much more manageable.
The site is really fun – you could spend hours looking at the fantastic artwork and designs people come up with. It costs a lot, as it’s targeted solely at design, and membership is by invitation only, so the vetting process is quite thorough.
Hiring works like a traditional jobs list – you post your jobs and people apply.
It is possible to contact the designers for one-off work, but generally they know the value of their work so be prepared to pay for it.
You need to get your content in front of people. Using services like O-Desk for this will be futile, as the workers tend to take the easy option, and language barriers often mean briefs are misinterpreted. People Per Hour is better, as you can find people with proven experience who can provide you with a list of relevant sites to contact with a view to posting your content.
Seeding is an essential part of the content process. Making sure your content appears in the right places and in front of the right people is undoubtedly going to reap its own rewards. By building lists of relevant sites to post to, you can automate this process and make sure every piece of content is placed in front of the influencers, sharers and promoters you need.
If you’ve got a bit of budget, you might also consider paid promotion on social media. ‘Boosting’ a post on Facebook, or StumbleUpon’s paid discovery service guarantee the content will be exposed to more people. However, the content needs to be useful and relevant to the audience to gain more traction. If it’s not engaging, people won’t engage with it (click/share etc.) and you’ll have wasted the promotion budget.
As you practice and refine this process you’ll find yourself free to do other things for your business. You’ll be free to chase new clients and more work, and the best part is, you won’t need to do any more work yourself – the system can handle it!
You’ll notice I didn’t mention anything about idea generation – that’s because I think idea generation is the one thing you shouldn’t outsource. You need to make sure your content is completely suitable for the purpose, and you can have a lot of fun coming up with new ideas.
January 16th, 2014.
What is link reclamation?
Link reclamation is where you look to re-establish links or mentions that were directed towards your site in the past. There are many reasons why previous links may have disappeared, but usually it comes down to technical issues, such as updated pages or a wrong redirect being put in place. You could argue it’s necessary to carry out a link reclamation project every time a website is redesigned and content is migrated.
Put simply, link reclamation is the process of locating, contacting and fixing broken links to yours or your client’s website. It also has the added benefit of being a totally organic process, with virtually no risk attached. You’re only making the most of current mentions of your company.
Link reclamation is the perfect go-to method when starting any link building campaign. It’s simple, quick and will give your campaign a steady footing right from the word go. Places to look for previous links range from charity work, local or national press, sponsors and exhibitions to review sites such as TrustATrader or Trustpilot.
Shall we begin… Exciting!
For this you’ll need:
Moz’s Fresh Web Explorer
One tactic for brands is to look for misspellings. Frequently, webmasters will for whatever reason misspell your domain name when linking to you.
For instance, if you’re a big brand, say Renault, you could look for common misspellings of your brand (Renualt.com) and find where people have linked to the wrong site. From there, it’s simple enough to get in contact with the source of the link and ask that it be corrected, helping both of your users.
John Henry Scherck wrote a fantastic post on building links from brand misspellings; all you need is Excel, Majestic and Aaron Wall’s keyword misspelling tool, and you can scale this to another level.
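If you’d rather not do the misspelling step by hand in Excel, a few lines of code can generate the obvious typo variants for you, ready to paste into a backlink tool. This is a rough sketch (the Renault example from above is just an illustration, not a real audit), covering only adjacent swaps, dropped letters and doubled letters.

```python
# Sketch: generate simple typo variants of a brand name for a
# misspelling link-reclamation audit. Brand name is illustrative only.

def typo_variants(name):
    """Return misspellings: adjacent swaps, dropped and doubled letters."""
    variants = set()
    for i in range(len(name) - 1):
        # adjacent-letter transposition, e.g. "renault" -> "renualt"
        variants.add(name[:i] + name[i + 1] + name[i] + name[i + 2:])
    for i in range(len(name)):
        variants.add(name[:i] + name[i + 1:])        # dropped letter
        variants.add(name[:i] + name[i] + name[i:])  # doubled letter
    variants.discard(name)  # the correct spelling isn't a typo
    return sorted(variants)

for v in typo_variants("renault"):
    print(v + ".com")
```

Feed each variant into Majestic (or any backlink checker) to see whether anyone has linked to the wrong domain.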
Reverse image search
Have any interesting images on your site? What about your logo? Yes? This one is easy: use Google reverse image search. It can be a very effective piece of your link building puzzle. Monitor your images and see who’s used them without crediting you as the source. Other tools, such as TinEye, Creative Commons search and Compfight, can help achieve the same.
You can take this a step further by using your competitor’s images or logos and see what websites are linking to your competitors. A good attitude to take from here would be to try and analyse why they’re using your competitor’s images over yours. It could be that they have a direct interest in your industry and therefore a chance to outreach presents itself.
Fresh Web Explorer – Moz & Google Alerts
Fresh Web Explorer really has to be one of the easiest ways to locate mentions of your brand scattered around the web. Simply enter your URL or brand name and search. You’ll hopefully be rewarded with a list of recent mentions that may have passed under your radar. You can also search for multiple phrases at a time, which is handy.
Similarly, you can use good old-fashioned Google Alerts. You can set this up to track your keywords, brand mentions and even URLs. If someone mentions you, you’ll get an alert sent through to your email. From there, you can decide if you’d like to get a link from the resulting website.
Use webmaster tools
Go to Crawl > Crawl Errors, then click on your URLs to see where they’re linked from.
Simply click on a URL and you should get a pop-up that gives you a more detailed look. From here, click on “Linked from”.
This should give you the complete rundown of who’s linking to you. From here, you can decide whether these links are worth keeping. If they are, and you have another page that is up to date and thematically relevant to your 404 URL, simply put a 301 redirect in place. Then click “Mark as fixed” and let Google get to work.
This is such a simple fix that it would be a crime to leave it out.
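Once you’ve mapped each dead URL to its best live replacement, the 301 rules themselves can be generated rather than typed out. A hedged sketch, assuming an Apache server using `.htaccess` (the paths and domains below are invented for illustration):

```python
# Sketch: turn a {dead URL: replacement URL} map into Apache
# "Redirect 301" lines. Example URLs are hypothetical.
from urllib.parse import urlparse

def htaccess_301_rules(redirects):
    """Build 'Redirect 301 /old-path new-url' lines from a dict."""
    rules = []
    for old, new in sorted(redirects.items()):
        old_path = urlparse(old).path or "/"  # Redirect takes a path, not a full URL
        rules.append(f"Redirect 301 {old_path} {new}")
    return "\n".join(rules)

redirect_map = {
    "http://example.com/old-product": "http://example.com/products/new-product",
}
print(htaccess_301_rules(redirect_map))
```

On Nginx or IIS the same map would feed different syntax, but the principle of generating the rules from one spreadsheet-style mapping holds.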
Moving Links to your Primary domain
Many companies have more than one domain. Perhaps it was that new intern who recommended a new domain or mini-site that you’ve completely forgotten about. It could even be an old product that is no longer available.
Going through all your old web assets can sometimes uncover golden opportunities, sometimes going beyond links. Perhaps you’ll rekindle an old business or promotional partnership that served you well in the past. By resolving the issue with a 301 redirect, you can transfer link equity from the unfavoured domain to the favoured one.
Important note: Don’t redirect an old site to the new if it suffered from a Google penalty. You’ll only be breathing new life into those spurious links that caused you all that bother.
Redirected Pages & Server response errors
Using the Screaming Frog tool, crawl through the depths of your site, as deep as you can possibly go. Export the crawl and pay attention to the response codes that are found.
If you’re seeing server errors pop up, you can run a backlink checker and identify problem areas. Pay attention to 302 redirects and change them to 301s where possible, allowing previous link equity to pass through. You can also use a header checker tool to follow redirect paths. My favourite tool for doing this is Ayima’s redirect tool; I can simply follow the previous redirect path for any problem URLs.
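The redirect-path check is also easy to script yourself. Here is a minimal sketch that follows a chain of hops and flags the 302s that are leaking link equity; the responses are supplied as plain data (in practice a crawl export or an HTTP client would provide them), and all the URLs are invented.

```python
# Sketch: follow a redirect chain from crawl data and flag 302 hops
# that should probably be 301s. responses maps url -> (status, location).

def follow_chain(responses, start, max_hops=10):
    """Return the list of (url, status) hops, stopping at a non-redirect."""
    hops, url = [], start
    for _ in range(max_hops):  # cap hops to avoid redirect loops
        status, location = responses.get(url, (200, None))
        hops.append((url, status))
        if status not in (301, 302) or location is None:
            break
        url = location
    return hops

crawl = {
    "http://example.com/a": (302, "http://example.com/b"),
    "http://example.com/b": (301, "http://example.com/c"),
    "http://example.com/c": (200, None),
}
chain = follow_chain(crawl, "http://example.com/a")
temp_hops = [u for u, s in chain if s == 302]  # candidates to change to 301
print(chain)
print(temp_hops)
```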
Links to tweets
This is a slice of genius from Ross Hudgens at Siege Media. If you have an active Twitter account for your brand, you can make use of your historical data and create an archive of all your tweets and interactions, which can be done from your account settings. You should then receive an email with instructions to download the zip file. This may take a while depending on how active you are.
Place the URLs into a CSV and upload it to Screaming Frog. Once it’s been crawled, you can easily see which web addresses have linked to tweets in your archive. If you’re the source of that content, try getting in touch with the webmaster and ask if they can kindly change the link to point to your site instead of your Twitter handle.
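Pulling the URLs out of the archive for that CSV can itself be scripted. A rough sketch, assuming the archive exposes the tweet body in a column named "text" (column names vary between archive versions, so check yours first):

```python
# Sketch: extract every URL from a Twitter archive CSV so the list
# can be fed to a crawler. The "text" column name is an assumption.
import csv
import io
import re

URL_RE = re.compile(r"https?://\S+")

def urls_from_archive(csv_text):
    """Return all URLs found in the tweet text column."""
    urls = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        urls.extend(URL_RE.findall(row.get("text", "")))
    return urls

sample = 'tweet_id,text\n1,"Read our new post http://example.com/post"\n'
print(urls_from_archive(sample))
```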
This is just a handful of easy ways to reclaim or identify links that you should be making the most of, a great way to get a link building campaign off the ground. I’m always up for learning, so if you know any other cool little tricks, please comment below. Who knows, perhaps I’ll even be kind enough to link to you in the future.