Changes made by governing bodies and organisers to complex systems are almost unavoidably going to have an impact on uninvolved bystanders; such is the nature of any system.
Modern economists make frequent use of the law of unintended consequences to explain how decisions at a governmental level have significant knock-on effects further down the food chain.
Examples of the law in practice include wind farms that actually harm the environment by killing birds; laws used to promote green vehicles which, with the help of an enterprising salesman, resulted in free golf carts for businessmen; and the Australian law making cycle helmets mandatory, which actually increased the risk of death and serious injury to cyclists.
So what does all of this have to do with Google?
In the search engine world, there’s little doubt about who makes the rules, and with its recent Penguin update, Google has left some innocent websites suffering in its campaign for good SEO practices.
Some Early Examples
Now of course there is some history here. There are many early examples of unintended consequences arising from decisions and courses of action that Google have taken:
- Using PageRank to dictate that links had value resulted in the link economy, blog networks, comment and forum spam and a proliferation of low-quality web directories
- Google AdSense for publishers led to an explosion in content scraping, copyright theft and MFA (Made For AdSense) sites
- The introduction of rel=nofollow led to PageRank sculpting and siloing
So Where Does The Penguin Update Fit In?
The Google Penguin update was introduced in April 2012 as a means of identifying and demoting websites that had previously benefitted from aggressive SEO techniques.
According to Matt Cutts (Head of Google’s WebSpam team), the update targeted ‘all those people who have sort of been doing, for lack of a better word, “over optimization” or “overly” doing their SEO, compared to the people who are just making great content and trying to make a fantastic site.’
To put it bluntly, it was designed to demote websites that appeared to be benefitting from undeserved backlinks.
The principles behind Penguin meant it was welcomed by most web users: it would ensure that websites engaging in link spamming and other underhanded black-hat techniques would drop down the rankings. Google estimated that the first update would only have an impact on 3.1% of English search queries and around 3% of searches made in German, Chinese and Arabic.
The Penguin update was largely successful, resulting in the demotion of hundreds of thousands of websites that had been ranking unfairly. Unfortunately it also affected some sites that hadn’t knowingly engaged in shady link-building practices.
For example, the specialist WordPress site WPMU.org was crushed by the update, dropping from 8,580 daily visits to a paltry 1,527 after it was introduced.
Despite the site’s owner, James Farmer, claiming that there had been no keyword stuffing, no link schemes and no quality problems, Matt Cutts came forward with the claim that the site had been penalised due to a few bad links pointing to it.
It was largely felt that, due to the nature of the site (a WordPress resource), there were bound to be links to it crediting the authorship and design of blogs, often of lower quality – keyword-heavy links sitting in footers and blogrolls, often sitewide.
Even so, the damage had been done and it was left for WPMU to rebuild their rankings.
One major unintended consequence of the Google Penguin update was the ‘bad-by-association’ effect on some sites: if one site was penalised, sites associated with it could be affected negatively too. In an online discussion, Michael Martinez of SEO Theory said, ‘what they are seeing is a Cascade Effect where the websites that link to them suddenly lose value. So the real problem lies 1 or 2 tiers back. These are not false positives, although they are collateral damage.’ However we define this problem, it is clear that some websites have experienced a drop in traffic and money-earning potential through no fault of their own.
The Target Changes…
Some leading SEO experts have revealed that more aggressive (and less ethical) SEOs have posed as their rivals and petitioned sites with fake requests for the removal of perfectly good links. Such tactics have been adopted as a means of reducing competitors’ website rankings. Thankfully, as yet, this problem doesn’t seem to be widespread.
When fake emails aren’t enough, there have also been some reports of unscrupulous webmasters building spammy links to rival websites hoping to see them penalised as a result.
There have even been examples of blackmail, with websites threatened with black-hat SEO attacks and the penalties that could follow.
Members of the specialist SEO Forum Traffic Planet revealed how devastating this tactic could be by test-targeting two websites with ScrapeBox blasts. This involved the creation of thousands of anchor-text based backlinks and resulted in a substantial ranking drop for the sites targeted.
The Traffic Planet case study was just one way of outlining the effects of a wide-reaching problem. Danny Sullivan, the Editor in Chief of Search Engine Land, pointed out that, ‘As for not accepting there’s no negative SEO, I’ve repeatedly said that it is possible … perhaps it [is] more viable now because it’s cheaper now. That’s exactly the opposite of refusing to accept that links could be cheaply and trivially pointed at any site. What remains unclear is how serious a threat it is to the vast majority of sites out there.’
The cautionary message here appears to be: as much as Google’s addressing of black-hat SEO may make for a quality content-driven user-experience, it’s by no means flawless. When in doubt, leave it out and play it safe.
September 3rd, 2014.
We’re always telling our clients that following Google’s best practice is the best strategy to ensure longer-term success in the search results.
So it’s no surprise that every small announcement of changes to their algorithm now gets picked up quickly and generates a rush to ‘comply’, regardless of the detail.
In our opinion site speed is really important for the user. Whilst Google will also look at this metric, it is way down its list of important factors affecting page/site rankings. Matt Cutts himself stated that these changes would affect less than 1% of all queries:
You’ll notice that the current implementation mentions that fewer than 1% of search queries will change as a result of incorporating site speed into our ranking. That means that even fewer search results are affected, since the average search query is returning 10 or so search results on each page. So please don’t worry that the effect of this change will be huge. In fact, I believe the official blog post mentioned that “We launched this change a few weeks back after rigorous testing.” The fact that not too many people noticed the change is another reason not to stress out disproportionately over this change.
Site owners seem happy to panic about site speed and security before addressing more fundamental (and infinitely more important) aspects such as page mark-up, site structure and hierarchy, and on-site copy.
It goes without saying that a fast site will improve usability and help conversions, and this is as good a reason for addressing it as any Google announcement – if not better.
Recently Google announced that they will treat sites served over HTTPS better than sites that aren’t using a secure certificate.
As with the site speed metric, this is way down Google’s list of priorities in the ranking algorithm, and we do not consider it a bandwagon to jump on with the aim of improving search results. Google themselves admit that:
“For now it’s only a very lightweight signal—affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content—while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.”
Google certainly has history of implementing changes which website owners feel obliged to comply with, only to backtrack on them later.
So our advice is to implement HTTPS on your site only if you feel it is required for reasons other than search (e.g. customer confidence). We have outlined the possible pros and cons for you below:
Pros:
- Your site is protected by an additional SSL layer
- A minimal algorithmic gain in rankings (implementation is certainly unlikely to result in any visible rankings changes)
- An increase in trust from site visitors, especially important on sites handling more sensitive data or transactions
Cons:
- The initial connection to your site may be slower, especially if you have to implement 301 redirects from non-HTTPS pages
- The cost of an annual SSL certificate (if you don’t already have one)
- In some cases you may lose referral data
- You’ll need to ensure all HTTP pages are correctly 301 redirected to the newer HTTPS versions (see the sketch after this list)
- You’ll need to update your internal link structure if you’re using relative URLs for internal links
- You may find that many products don’t support the use of HTTPS
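If you do switch, it’s worth verifying those redirects yourself rather than assuming they work. Here’s a minimal sketch in TypeScript, assuming Node 18+ for the built-in fetch and using illustrative URLs, that checks whether a sample of HTTP pages 301-redirect to their HTTPS equivalents:

```typescript
// Minimal sketch (Node 18+, built-in fetch): check that a sample of
// http:// pages 301-redirect to their https:// equivalents.
// The URLs below are illustrative placeholders.

const pages = [
  'http://www.example.com/',
  'http://www.example.com/about/',
];

async function checkRedirects(): Promise<void> {
  for (const url of pages) {
    // redirect: 'manual' stops fetch following the redirect, so we can
    // inspect the status code and Location header ourselves.
    const res = await fetch(url, { redirect: 'manual' });
    const target = res.headers.get('location') ?? '(none)';
    const ok = res.status === 301 && target.startsWith('https://');
    console.log(`${ok ? 'OK ' : 'FIX'} ${res.status} ${url} -> ${target}`);
  }
}

checkRedirects();
```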
June 5th, 2014.
Are you currently designing a new website for your business? What are you looking to achieve? Are you confused about how to balance an amazing user experience with higher organic traffic?
Questions like the above are asked all the time and if you’ve been following design trends over the past couple of years, then you should be no stranger to Parallax scrolling.
Let’s be upfront: parallax design is beautiful; it gives voice to brands and delivers an experience that wows most users. However, it would seem that SEO and parallax are still no closer to a perfect romance, and website owners often sacrifice one for the other.
My job is to help get SEO and Parallax into the same bed with a mission to start a new breed of super sexy search savvy slick sites. (I’m a sucker for sibilance).
The parallax design list of compatibility problems
Design – In many cases a designer using parallax will opt for a single-page design, which makes it difficult to optimise a site for a wide variety of search terms. Having a good spread of topical pages not only gives users access to deeper content and knowledge but also helps increase your overall search visibility. Given that, you should plan out your sitemap to allow for supporting pages with deeper content around most of your keywords.
Analytics – Another challenge is in gaining valuable and accurate analytics for your website. How accurately can you measure engagement on a single-page website? One problem is that parallax pages take a long time to load, so users can often get frustrated and click elsewhere.
The second problem is identifying the stronger pieces of content on the page, as the “time spent on page” metric becomes more ambiguous. The only workaround I can see for this would be to use heat-mapping software, which might highlight the pieces of content users were most drawn to.
Finally, what to do about setting conversion goals? You would need a separate page to measure users who were interested in part of the content but didn’t follow through on their intent.
Page Speed – Another sacrifice of a beautiful parallax design is the heavy load on the home page, generally caused by having very high-quality images and videos all loaded in one file. Although some of these issues can be resolved using faster servers, tidy code and CDNs, Google still dislikes slow page speeds, as these represent a poor user experience in their eyes. Use Google’s PageSpeed Insights to check your site doesn’t offend.
Mobile – There still isn’t a perfect solution for using parallax scrolling on mobile devices, so webmasters will have to create separate versions of the site specifically for mobile. A popular example of this is Google’s “How Search Works”.
How to make the two compatible
Yes it’s possible. The two can co-exist in a way that pleases the eyes of both users and Google.
1) A “One page” design with parallax scrolling using jQuery.
This was first written about by Kevin Ellen of iProspect. The solution uses the History API’s pushState function (commonly wired up with jQuery). It allows a parallax scrolling page to be cut into various sections that can be identified by Google in the SERPs. The great thing about this approach is that each section gets its own unique URL and meta data, so one single page can be indexed multiple times for different content. That matters, because parallax designs implemented without any specialist SEO advice sometimes end up with a severe lack of indexed pages, and therefore poor search visibility. (A rough sketch of the technique follows the notes below.)
– Pushstate is a great function for an existing one-page parallax scrolling website that needs to be optimised for search.
– The function is perfect for smaller businesses or a branded mini site, which are perhaps more interested in UX than SEO. This solution offers the best of both worlds on a very small scale, but could never work on a bigger scale, such as an ecommerce site.
– It is a bit of an analytics fail. Scrolling through each of the sections will send signals to your analytics packages that users are bouncing or exiting content very quickly. Again, this might not be a huge issue for small businesses or mini sites that use the function for branding or simply referring traffic.
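For illustration, here’s a minimal TypeScript sketch of the general technique – not Kevin Ellen’s exact implementation. It uses the browser’s native history.pushState together with an IntersectionObserver to give each section its own URL as it scrolls into view; the section ids and data-title attribute are illustrative assumptions, and the server still needs to return real content at each section URL for crawlers:

```typescript
// Minimal sketch: give each parallax section its own URL as it scrolls
// into view. Section ids and data-title attributes are assumptions.

const sections = document.querySelectorAll<HTMLElement>('section[id]');

const observer = new IntersectionObserver(
  (entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const section = entry.target as HTMLElement;
      // Update the address bar and title without reloading the page.
      history.pushState({ section: section.id }, '', `/${section.id}`);
      document.title = section.dataset.title ?? document.title;
    }
  },
  { threshold: 0.5 } // fire once at least half the section is visible
);

sections.forEach((s) => observer.observe(s));

// Support back/forward buttons by scrolling to the stored section.
window.addEventListener('popstate', (event) => {
  const id = (event.state as { section?: string } | null)?.section;
  if (id) document.getElementById(id)?.scrollIntoView({ behavior: 'smooth' });
});
```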
2) Adopt an SEO architecture allowing multipage parallax scrolling.
Put simply, you start with an SEO page-architecture pyramid and then apply the parallax scrolling design effects to each URL. This is perhaps the perfect compromise between UX and SEO, with neither outweighed by the other.
– Beneficial for analytics software, because each URL has its own content. Tracking can be placed across all the pages, making setting up conversion goals and understanding user behaviour much easier.
– It doesn’t exactly follow the parallax scrolling trend and is therefore not as effective at telling a brand’s story.
– The design might seem attractive; however, having more pages also means more maintenance, which could mean higher costs from your design team.
3) Parallax scrolling on homepage and regular SEO architecture.
This technique places parallax scrolling on the homepage and then includes other URLs that are SEO-friendly but do not have parallax scrolling. This is the method adopted by brands such as Spotify. It allows them to have an attractive home page that helps communicate the brand’s voice, while the other, non-parallax pages let users dig deeper into the website’s content if they’d like to find out more about a particular service.
Another idea would be to add a blog to the site. The addition of a blog to your parallax site can add tremendous value when trying to attract visitors. It’s also another way to showcase your industry knowledge and authority.
– Keeps the website light and flexible, making it easier to maintain and design whilst being more affordable than the previous two options.
– Creativity is kept in a box, and the website as a whole falls short of being a super-interactive UX experience.
Can Parallax actually help your SEO?
Can a parallax design actually help your SEO? The answer is ‘yes it can’, but the best way forward is to consult your search agency before going gung-ho on a funky new design. Seeing a mesmerising new design for your business can be exciting, but don’t let it cloud your judgement. A short consultation with your search agency should ensure that you can benefit from beautiful design without risking a huge drop in traffic because you dropped most of your web pages.
As SEO has started to become increasingly integrated into the normal marketing process and more specifically content marketing, using a parallax design is the perfect excuse to delve deep into the world of producing great content. Giving users an intuitive brand experience whilst delivering a compelling story can help attract more backlinks, repeat users and referrals. “No other recent web design technology has done more to impact the way we tell stories online than parallax”.
And if you are connecting more with your audience, then your content is likely being shared more widely, thus increasing your visibility in social media whilst hopefully grabbing the attention of industry influencers. All of which means you can create content for conversion.
March 20th, 2014.
Once again Google has got the backs up of companies investing heavily in its services, though this time it’s not through an algorithm update or a change in the webmaster guidelines. Rather, it’s their comparison feature that has sandbagged the major comparison shopping firms.
If you’re involved in travel, finance or insurance, you need to be aware that Google is interested in controlling these verticals within its own search engine as much as possible. The opportunity for profit is huge, as are the tonnes of valuable data that will be collected.
Google’s comparison feature was released soon after they acquired the comparison site “Beat That Quote” back in 2011. The feature meant that Google itself would appear for generic, competitive, industry-related keywords such as car insurance or mortgages. This is still the case today.
This move was understandable: Google’s desire to keep growing and monopolising the internet means that creations such as this are going to become more and more common. At the end of the day, they’ve reached mass-market penetration in the UK, and the only way to please the shareholders is to diversify into other lucrative industries.
Brand Bidding is Bad, Unless You’re Google
However, it looks like there’s one rule for everyone except Google, who haven’t been following their own rules again. Their position as overlords of the internet has entitled them to take advantage of the very companies that are paying them remarkable sums for Google AdWords advertising and other services.
Scratching your head?
Google’s comparison engine has gone a step further than simply appearing for the generic big industry keywords.
A branded search for any one of the top comparison websites will return this:
They’ve effectively done MoneySuperMarket’s job for them – how thoughtful…
MoneySuperMarket will probably be paying incredible sums of money to raise awareness of their brand name, all of which supports their offline marketing efforts, including extensive above-the-line media adverts. Yet their efforts are being sabotaged by Google’s “Sponsored” comparison engine, which is essentially hijacking users away from the MoneySuperMarket website while trying to push them onto Google’s own engine, featuring a list of rival insurance companies.
In a nutshell Google’s comparison engine seems to be a glorified affiliate site.
You thought Google only favoured the big brands…
So what do the big comparison sites do, and how would you react? It would appear that they just have to accept it.
This isn’t the first time, nor will it be the last, that Google have tried to force users onto their platform over a potential rival’s – sound familiar? Google are being hypocritical of their own guidelines and company mission statement.
We’ve all heard that providing a good user experience and unique, authoritative content are what Google rewards the most, which makes perfect sense. So why, when companies such as MoneySuperMarket provide awesome content, such as this, are they being pushed further down the SERP real estate?
Kevin Gibbons recently wrote a great piece on how to beat Google in a vertical search, making the point that relying on Google is always a risky game: it’s your biggest competitor, and it has your mindshare whenever you want to find something or buy a product.
Kevin goes on to give great examples of how MoneySuperMarket are beating Google hands down, ultimately by using their marketing as an acquisition channel that rewards users for coming back. They’re running newsletters, social media, a blog, apps, SEO and remarketing to such effect that a Google search is becoming more and more irrelevant.
And if all that doesn’t work, well, at least they have Snoop Dogg.
February 10th, 2014.
Are you still waiting to implement a winning strategy in 2014? Is the doom and gloom hokum surrounding Google updates preventing you from making the right decisions?
Let me take you back to the past and into the shoes of a university student who chose to follow the straight arrow path of Marketing.
Amongst the countless acronyms and matrix tables that flooded lecture handouts is the classic “SMART” formula, which exists to guide you towards defining better objectives.
Specific – Define what it is that you want to achieve. Answer those 5 W’s! Who, what, when, why and where.
Measurable – Quantify your objectives, how are you going to back up your results?
Achievable – We all like to overreach at times. When setting objectives make sure that they’re likely to be achieved by your team.
Relevant – Make sure the objectives are relevant to the business and in line with the overall marketing plan.
Time based – Set a date for the objectives to be complete (tricky in SEO).
A smarter SEO would also add two more letters – Ethical and Recorded – but more on those later.
So I suppose you want a SMART example in SEO?
A fictitious Mexican food restaurant business based in the UK … “Guapo – Mexican”
“We want to target food lovers from the UK who enjoy tasting exciting Mexican dishes (specific) to raise awareness of our restaurant (achievable and relevant). We will aim to bring over 30,000 visits to our site (measurable) within 8 months (timely)”.
So, how will the SMART acronym apply to your 2014 strategy?
First you must begin to understand how search will change in 2014. So let’s take a look at the predictions.
2013 saw Google unleash countless updates. If you weren’t scared at any one point, then you’re a liar! We saw more frequent Penguin and Panda updates, Hummingbird, and the (not provided) debacle finally hitting its peak. Enough to send the SEO world into complete disarray…
It’s safe to say that the SERPS changed in a big way last year. We saw steps to include more localised results as well as better integration of the knowledge graph.
A basic search for “Mexican food” returns a mixture of local, knowledge-based and contextually relevant results.
To get the most out of your SMARTER objectives for 2014, I’d suggest doing the following:
Make the most of local
1) Get listed on Google+ Local; claim your profile and add all the bells and whistles (360° photos and videos) to make sure that your profile stands out amongst your competitors.
2) Encourage sentiment and reward customers who review the restaurant on Google, TripAdvisor, Yelp, Toptable and other platforms.
3) A mobile version of your site is a must. The majority of mobile searches are for local services; take advantage of this by making your menu and deals accessible and shareable on mobile devices.
4) Go social… Nothing new there, but certainly a necessary step to taking up more first-page real estate. In the case of the above example, the love of food is universal, so a restaurant is blessed with the number of social media tools at its disposal: Facebook, Twitter, Google+, Instagram and Pinterest can all be used to great effect. You could top this off by adding a blog to your site – an easy way to increase the number of pages, helping you rank for a wider set of keywords.
5) Add separate pages for multiple locations. This helps Google deliver the best result to the searcher; it’s probably a good call to also add your contact details to many pages.
Move away from one type of Analytics
Rand Fishkin predicted in Moz’s 2013 predictions that marketers would need to stop relying on Google Analytics as the sole platform for web marketing. He was right to some degree: other platforms such as Mixpanel, Piwik, Omniture and HubSpot did grow significantly last year.
Google’s (not provided) alienated many web marketers who had put all their eggs in one basket. Being able to measure and report became tough, but that wasn’t the only issue: identifying opportunities for growth also became difficult. The market is becoming more competitive, and the margin for error is less forgiving.
Heavier correlation of G+ in search results
A Moz report in 2013 found a high correlation between search rankings and the number of Google +1s a page had. Cyrus Shepard reported the findings as surprising, although “correlation doesn’t necessarily mean causation”. The post created some controversy, which prompted Matt Cutts to respond to the debate via Hacker News to pour cold water on the findings.
Make the most of Google+ by building relationships with your audience and like-minded businesses in your niche; identify the industry influencers and connect with them. Take advantage of rel=”publisher” and connect your website to your Google+ brand page.
Incorporating Google+ tactics in your strategy will be more important than in previous years. The research done by Moz and Searchmetrics indicates the social network’s significance and correlation with higher rankings. You will also benefit from increased click-through rates, relevant and influential communities, and growing your brand’s authority.
Content Marketing continues to grow but who’s taking over the reins?
Content marketing has been the buzzword of the last couple of years, and it’s taken some time for many businesses to adjust, but to give you an idea of how far it’s come, it is said that up to 92% of marketers are now practising some form of content marketing. But do marketers really know best?
The Content Marketing Institute estimates that marketers will have to up their game if they want to remain relevant. Perhaps not surprisingly, it’s journalists who are leading the way, with great backgrounds in writing, storytelling and information design. They know how to orchestrate content that makes you care, is different to the competition, is new and surprises people. Beyond their storytelling abilities are tech and research skills and the ability to meet strict deadlines.
Marketers who can learn to think like journalists in 2014 will reap the benefits of good content marketing.
Don’t rely on any one tactic
This is nothing new… perhaps it could just be received as conventional wisdom. Relying on any one SEO tactic will result in one of two things:
You’ll get burned
Your gains will be short term and eventually … You’ll get burned
Ace job there…
Unfortunately, there will always be digital marketers who want the best returns with as little investment as possible. Herein lies the problem. The result of this cavalier attitude is low-quality content that is happy to be placed on any site that will accept it.
Those of you who pay attention to the latest SEO news might have read about Google unleashing a fire demon on guest blogging this year. My advice to you is: just up your game and you should be fine. Guest posting isn’t dead; Google just raised the quality bar. Matt Cutts has recently blogged about guest posting and its effectiveness as an SEO tactic. He says, “there are many good reasons to do some guest blogging (exposure, branding, increased reach, community, etc.). Those reasons existed way before Google and they’ll continue into the future.”
The bottom line is, if you love your brand … why risk its demise? Make sure that you comply with the search engine guidelines and stay up to date with best practices. Try to focus on contributing thought leading articles and information that give you exposure, branding and increased reach. When trying to find a blog to post on, ask yourself … Would you be proud to see your brand exposed here? Does this blog capture my audience? Are the blog’s users engaged in its content?
How to be SMARTER in 2014
Specific – Are you taking local and mobile into account? … Your audience probably is.
Measurable – Google Analytics is great, but to stay competitive you’re going to need more data.
Achievable – Can your team do the job? Maybe it’s time to look at hiring journalists for your content marketing needs. Be aware of how the SERPs have changed this past year: the mix of real estate on the first page of Google keeps diversifying, with only 7 positions for some phrase types and 10 for others. Local listings, the knowledge graph and semantic markup such as reviews and ratings also mean there is so much more to play for.
Relevant – Are your tactics still relevant to your business plan? Do local SEO, social media’s integration in search and an improved level of guest posting apply to your overall strategy and brand message?
Time Based – Setting a period in which to see results will always be tricky. However, you can set time periods for work to be completed. Reflect on the content marketing strategy: more and more journalist-style marketers are going into content marketing, not just because they know how to tell a story but because they also know how to meet challenging deadlines.
Ethical – Make sure you’re meeting Google’s guidelines. Relying on any one tactic will get you burned; remember that your brand is at stake.
Recorded – Record the processes that you’re implementing throughout the strategy. Are the tactics working? Are they future proof? Are they following the plan?
February 6th, 2014.
We’ve been making infographics as a linkbuilding method for our clients.
If you don’t know why, see here.
Last week we launched a new piece for our friends at Love Reading. We’d researched the crimes committed by the most popular children’s book villains and worked out the sentences they would have received in a European court.
You can take a look at the piece here.
Long story short, the infographic came to the attention of The Times and they ran the research on page 3 of the Saturday edition. They mentioned the client’s site (and provided a link in the digital edition).
A testament to the power of infographics.
If you want to talk about an infographic for your brand, give us a call.
January 23rd, 2014.
I ran an experiment last year. I had a website with no blog. It had lots of pages on a niche topic, but very few readers. I installed a blog and began posting once per month. In a year, the traffic doubled – (I’ll admit it increased from ‘barely perceptible’ to ‘quite unremarkable’, but you can’t argue with the numbers).
The massive spike around April 2013 was from some experimenting with paid discovery. The second, smaller spike was a particularly controversial blog post.
I think this settles the argument once and for all: A regular content schedule is a sure-fire way to get traffic.
I know what you’re wondering – ‘How does this affect me, the business owner?’
Well, business owner, I’ll tell you.
It means that you should be publishing regular content on your site if you want people to be visiting it. But as a business owner (or marketing manager) you’ll be plenty busy enough with all sorts of other concerns – do you have time for creating a content marketing strategy too?
You need to be producing content – that’s a fact. It’s a thing you can’t deny. I create content for 30 clients, and I use the ‘DEAL’ system from Tim Ferriss (author of The 4-Hour Workweek):
Define, Eliminate, Automate, Liberate.
Define the sort of content you need. I daresay you won’t go far wrong with one blog post per week and one infographic per month.
You’ll also need to consider sharing and seeding the content as it’s produced. This can be done via the regular social networking channels, but also on targeted interest sites via email outreach.
All of these things take time – hours and hours of time. But only if you do them all yourself…
Remove any unnecessary steps in the programme. Don’t waste your time getting bogged down with trying to design things yourself or write blog posts yourself – there are plenty of people in the world who will do it for you in exchange for money. They are called freelancers and are readily available online.
Think about what you really need to do for the job to work. In fact, I’ll do it for you – you need to come up with content ideas and you need to check it, then post it. The rest can be done for you.
Automation is achieved by setting up a system that handles the tasks for you. In essence, you feed the machine with briefs and it comes back with content. Online freelancing services exist purely to make your life easier, and they’re really great.
My favourite freelancing sites include:
oDesk is very useful for finding people to do basic tasks – data analysis, basic research, number crunching etc. I use oDesk for jobs that are too time-consuming to handle myself. For example, if I was trying to make an infographic about football transfers (which I am), I’d post the job on oDesk and find someone more capable and efficient than me to handle the research and analysis while I concentrate on planning the next infographic.
oDesk also allows you to create teams of people to handle larger ongoing projects. It’s efficient and easy to manage, and provides a screen tracker so you can make sure your freelancers are staying on task.
Do note, however – oDesk has a very high number of have-a-go heroes. They aren’t necessarily qualified in a given field, so although they are competent, you can’t expect them to handle more demanding tasks. For basic stuff, though, it’s ideal.
Textbroker’s site is fairly basic in functionality, but it focusses solely on copywriting so it’s far more targeted. Prices vary based on the writer’s rating (out of 5). I’ve found some really fantastic writers on Textbroker, but also some absolute stinkers. Usually I have to edit a few things as it’s easier than sending it back for amendments, but it saves a lot of time.
Good copywriters also tend to be good researchers. They’re generally more able to follow a complex brief than their counterparts on oDesk, so you can offer them more in-depth projects to research.
People Per Hour – covers pretty much every digital-based job, but I use it for designers
PPH is more useful to me than some of my own body parts. I can post a job at 9 in the morning, receive proposals and have the job in the bag before I go to bed that night. People Per Hour also has the benefit of showing where your freelancer is located, so you can target areas that are likely to have more qualified personnel.
For instance, when searching for a designer: Europe has more reliable design schools than other parts of the world, and by choosing someone in Britain I can guarantee we’re in the same timezone, speak the same language and operate on the same working hours. It makes the tasks much more manageable.
The site is really fun – you could spend hours looking at the fantastic artwork and designs people come up with. It costs a lot, as it’s targeted solely at design, and membership is by invitation only, so the vetting process is quite thorough.
Hiring works like a traditional jobs list – you post your jobs and people apply.
It is possible to contact the designers for one-off work, but generally they know the value of their work so be prepared to pay for it.
You need to get your content in front of people. Using services like oDesk for this will be futile, as the workers tend to take the easy option, and language barriers often mean briefs are misinterpreted. People Per Hour is better, as you can find people with proven experience who can provide you with a list of relevant sites to contact with a view to posting your content.
Seeding is an essential part of the content process. Making sure your content appears in the right places and in front of the right people is undoubtedly going to reap its own rewards. By building lists of relevant sites to post to, you can automate this process and make sure every piece of content is placed in front of the influencers, sharers and promoters you need.
If you’ve got a bit of budget, you might also consider paid promotion on social media. ‘Boosting’ a post on Facebook or using StumbleUpon’s Paid Discovery service guarantees the content will be exposed to more people. However, the content needs to be useful and relevant to the audience to gain traction: if it’s not engaging, people won’t click or share it, and you’ll have wasted the promotion budget.
As you practice and refine this process you’ll find yourself free to do other things for your business. You’ll be free to chase new clients and more work, and the best part is, you won’t need to do any more work yourself – the system can handle it!
You’ll notice I didn’t mention anything about idea generation – that’s because I think idea generation is the one thing you shouldn’t outsource. You need to make sure your content is completely suitable for the purpose, and you can have a lot of fun coming up with new ideas.
January 16th, 2014.
What is link reclamation?
Link reclamation is where you look to re-establish links or mentions that were directed towards your site in the past. There are many reasons why previous links may have disappeared, but usually it comes down to technical causes, such as updated pages or an incorrect redirect being put in place. You could argue it’s necessary to carry out a link reclamation project every time a website is redesigned and content is migrated.
Put simply, link reclamation is the process of locating, contacting and fixing broken links to your or your client’s website. It also has the added benefit of being a totally organic process with virtually no risk attached: you’re only making the most of existing mentions of your company.
Link reclamation is the perfect go-to method when starting any link building campaign. It’s simple, quick and will give your campaign a steady footing right from the word go. Examples of where to look for previous links range from charity work, local or national press, sponsors and exhibitions to review sites such as Trustatrader or Trustpilot.
Shall we begin… Exciting!
For this you’ll need:
Moz’s Fresh Web Explorer
One tactic for brands is to look for misspellings. Webmasters frequently make errors and, for whatever reason, will misspell your domain name.
For instance, if you’re a big brand – say, Renault – you could look for common misspellings of your domain (renualt.com) and find where people have linked to the wrong site. From there, it’s simple enough to get in contact with the source of the link and ask that it be corrected, helping both “our” users.
John Henry Scherck wrote a fantastic post on building links from brand misspellings; all you need is Excel, Majestic and Aaron Wall’s keyword misspelling tool, and you can scale this to another level. A rough sketch of generating such variants yourself follows below.
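To illustrate the idea – a hypothetical sketch in TypeScript, not Scherck’s method or Aaron Wall’s tool – a few lines can generate the common single-edit misspellings of a brand name, ready to check against your backlink data:

```typescript
// Hypothetical sketch: generate common single-edit misspellings of a
// brand domain (omissions, doubled letters, transpositions) to check
// for links pointing at the wrong spelling.

function misspellings(name: string): string[] {
  const variants = new Set<string>();
  for (let i = 0; i < name.length; i++) {
    // Omission: drop one character, e.g. "renalt"
    variants.add(name.slice(0, i) + name.slice(i + 1));
    // Doubling: repeat one character, e.g. "rennault"
    variants.add(name.slice(0, i) + name[i] + name.slice(i));
    // Transposition: swap adjacent characters, e.g. "renualt"
    if (i < name.length - 1) {
      variants.add(
        name.slice(0, i) + name[i + 1] + name[i] + name.slice(i + 2)
      );
    }
  }
  variants.delete(name); // keep only actual misspellings
  return [...variants].map((v) => `${v}.com`);
}

console.log(misspellings('renault'));
// e.g. [ 'enault.com', ..., 'renualt.com', 'rennault.com', ... ]
```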
Reverse image search
Have any interesting images on your site? What about your logo? YES! This one is easy: use Google reverse image search. It can be a very effective piece of your link building puzzle. Monitor your images and see who’s used them without crediting you as the source. Other tools can achieve the same, such as TinEye, Creative Commons search and Compfight.
You can take this a step further by running your competitors’ images or logos through the same search to see which websites are linking to them. A good approach from here is to analyse why those sites are using your competitor’s images rather than yours. It could be that they have a direct interest in your industry, in which case a chance to outreach presents itself.
Fresh Web Explorer – Moz & Google Alerts
Fresh Web Explorer really has to be one of the easiest ways to locate mentions of your brand scattered around the web. Simply enter your URL or brand name and search; you’ll hopefully be rewarded with a list of recent mentions that had passed under your radar. You can also search for multiple phrases at a time, which is handy.
Similarly, you can use good old-fashioned Google Alerts. You can set this up to track your keywords, brand mentions and even URLs. If someone mentions you, you’ll get an alert sent through to your email. From there, you can decide if you’d like to get a link from the resulting website.
Use Webmaster Tools
Go to Crawl > Crawl Errors and click on your URLs to see where they’re linked from.
Click on a URL and you should get a pop-up that gives you a more detailed look. From here, click on “Linked from”.
This should give you the complete rundown of who’s linking to you. From here, you can decide whether these links are worth keeping. If they are, and you have another page that is up to date and thematically relevant to the 404 URL, simply put a 301 redirect in place. Then click “Mark as fixed” and let Google get to work.
This is such a simple fix that it would be a crime to leave it out.
Moving Links to your Primary domain
Many companies have more than one domain. Perhaps it was that new intern who recommended a new domain or mini site that you’ve completely forgotten about. It could even be an old product that is no longer available.
Going through all your old web assets can sometimes uncover golden opportunities, sometimes going beyond links: perhaps you’ll rekindle an old business or promotional partnership that served you well in the past. By putting a 301 redirect in place, you can transfer link equity from the unfavoured domain to the favoured one.
Important note: Don’t redirect an old site to the new if it suffered from a Google penalty. You’ll only be breathing new life into those spurious links that caused you all that bother.
Redirected Pages & Server response errors
Using Screaming Frog, crawl through the depths of your site, as deep as you can possibly go. Export the crawl and pay attention to the response codes being returned.
If you’re seeing server errors pop up, you can run a backlink checker and identify problem areas. Pay attention to 302 redirects, changing them to 301s where possible to allow previous link equity to pass through. You can also use a header checker tool to follow redirect paths – my favourite for this is Ayima’s Redirect Path, which lets me follow the previous redirect path for any problem URL. A minimal script for tracing a redirect chain yourself is sketched below.
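For illustration, here’s a minimal TypeScript sketch that traces a redirect chain by hand – assuming Node 18+ for the built-in fetch, with an illustrative starting URL – making it easy to spot 302s that should be 301s:

```typescript
// Minimal sketch (Node 18+): follow a URL's redirect chain hop by hop
// and print each status code, so 302s that should be 301s stand out.

async function traceRedirects(url: string, maxHops = 10): Promise<void> {
  let current = url;
  for (let hop = 0; hop < maxHops; hop++) {
    // redirect: 'manual' stops fetch from following redirects itself.
    const res = await fetch(current, { redirect: 'manual' });
    console.log(`${res.status}  ${current}`);
    const location = res.headers.get('location');
    if (!location || res.status < 300 || res.status >= 400) return;
    // Resolve relative Location headers against the current URL.
    current = new URL(location, current).toString();
  }
  console.warn('Stopped: too many redirects');
}

traceRedirects('http://example.com/old-page'); // illustrative URL
```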
Links to tweets
This is a slice of genius from Ross Hudgens at Siege Media. If you have an active Twitter account for your brand, you can make use of your historical data and create an archive of all your tweets and interactions, which can be done from your account settings. You should then receive an email with instructions to download the zip file. This may take a while depending on how active you are.
Place the tweet URLs into a CSV (see the sketch below) and upload it to Screaming Frog. Once it’s been crawled, you can easily see which web addresses have linked to tweets in your archive. If you’re the source of that content, try getting in touch with the webmaster and asking if they’ll kindly change the link to point at your site instead of your Twitter handle.
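As a rough illustration, the TypeScript sketch below (Node 18+) rebuilds tweet status URLs from the archive’s tweets.csv. The “tweet_id” column name, file layout and handle are all assumptions – Twitter has changed the archive format over time – and the naive comma split would need a proper CSV parser if fields contain commas:

```typescript
// Hypothetical sketch: pull tweet ids out of the archive's tweets.csv
// and rebuild the status URLs, ready to feed to a crawler.

import { readFileSync, writeFileSync } from 'node:fs';

const handle = 'yourbrand'; // illustrative Twitter handle
const rows = readFileSync('tweets.csv', 'utf8').trim().split('\n');

// Find the id column from the header row (an assumed column name).
const header = rows[0].split(',').map((h) => h.replace(/"/g, ''));
const idCol = header.indexOf('tweet_id');

const urls = rows
  .slice(1)
  .map((row) => row.split(',')[idCol]?.replace(/"/g, ''))
  .filter(Boolean)
  .map((id) => `https://twitter.com/${handle}/status/${id}`);

writeFileSync('tweet-urls.txt', urls.join('\n'));
console.log(`Wrote ${urls.length} tweet URLs`);
```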
This is just a handful of easy ways to reclaim or identify links that you should be making the most of, a great way to get a link building campaign off the ground. I’m always up for learning, so if you know any other cool little tricks, please comment below. Who knows, perhaps I’ll even be kind enough to link to you in the future.
December 18th, 2013.
Majestic SEO, the web’s biggest open link map, recently updated its Search Explorer tool to version “Alpha v0.3”. Search Explorer lets marketers use a specialised search engine that ranks pages based on how influential they are on the web graph. The approach is no-nonsense and provides realistic search results, excluding ads, authorship and the influence of temporal ranking algorithms.
So what should we expect?
Majestic says we shouldn’t expect the tool to rival the major search engines, but it turns out they don’t have to: they believe the time is right for a subscription-based engine with no advertising that offers complete transparency. This makes it a great tool for digital marketers because, for the first time, we can see more accurately what factors are influencing high rankings in our niche.
The new update brings two features that should have the mouths of SEOs watering: live rank factors and a new link prospecting methodology.
Live Rank Factors
Labelled as “transparency at your fingertips” by Dixon Jones (marketing director at Majestic), the live rank feature allows users to search by keyword to see what factors are lifting the rankings of their or their competitors’ sites in the SERPs. From this, SEOs can gain a better understanding of what it might take to rank for specific sets of keywords or topics. “For the first time, you can run a search query and see exactly why one search result appears above another.”
The live rank search results tab allows users to analyse the corresponding data on a far more granular level than anything before. When performing a search, your results are scored on variables such as InTitle, InAnchor and InURL. Majestic’s trusty Flow Metrics are also involved, as well as referring domains and total external backlinks. Combined, these metrics give us a great understanding of how Majestic is interpreting the search data.
Getting your hands dirty
From the live rank factors page, you can dig deeper to get a better grasp of the data being presented. Go to the “Ranking Factors” tab and you can switch from “Data” to “Chart”, which makes the information easier to digest. From here you can see that if your site is dominating Majestic’s Search Explorer but neither of the big search engines, there may well be other factors at play, such as personalisation, ads, authorship and so on.
All of this offers great insight to us as SEOs, because it suggests when enough high-authority links have been built and the on-site optimisation of our pages is in good order. If you’re under pressure from clients or your boss to “BUILD MORE LINKS”, be sure to show them the data. Make them aware that although you’re ranking high in Majestic’s Search Explorer, building more links might not be the way forward, and you may end up tripping the big search engines’ spam filters.
Majestic has commented on its plans for the live rank reports going forward, saying it is looking to increase the variables in the algorithm so that they align more closely with those of Google and Bing (well, the ones perceived as common knowledge in the SEO field).
Link Prospecting Methodology
The link prospecting methodology makes use of AQS (advanced query syntax) – our good old friend “site:”. Using this command will return sites that are more than likely to be authorities in their given fields or topics. You can go further by using the command to bring up blogs on all kinds of platforms:
“Keyword site:blogger.com site:wordpress.com site:blogspot.com site:tumblr.com site:squarespace.com”
The “bucket list” feature allows you to save a list of competitors’ URLs and run a search against only those URLs. This is an amazing way of benchmarking against your closest competitors.
I love this new feature as it can be used to go beyond link building. I can use this tool for research by cherry-picking the types of results I want in my bucket list, such as trusted and accurate news sites. For example, if I want to research a specific celebrity, I can exclude any news source that offers little substance (red tops and glossy magazines). Majestic also commented that this tool can be used by PR professionals who want to work on reputation management.
To take a full tour of the tool you can view Majestic’s “how to” webinar below. It’s packed full of great instructions and tips for getting the most out of the tool.
I’m excited to see how it’s going to develop over the coming months as they start to add more and more data. I have but one recommendation for going forward…
The issue I found when using the tool for the first time is one that could quite often pop up for many SEOs: searching and building for branded terms.
Searching for “Sony” will return results from Sony TLDs and international subfolders such as .com, .net, .co.uk and .jp. However, the same search in Google comes with geo-targeting and returns results such as .co.uk, local, news, Wikipedia and Amazon, which would be a truer version of what my target customer is seeing.
The only way I can think of to resolve this would be to include the Google results in my bucket list, but that can also be limiting, because I’d constantly be changing half my list due to the number of variables in the results on any given day, such as news.
I would personally like to see a Geo toggle implemented within the search explorer to give better flexibility in the results. Other than that, I personally love what this new tool has to offer.