September 3rd, 2014.
We’re always telling our clients that following Google’s best practice is the best strategy to ensure longer-term success in the search results.
So it’s no surprise that every small announcement of a change to their algorithm is quickly picked up and generates a rush to ‘comply’, regardless of the detail.
In our opinion site speed is really important for the user. Whilst Google will also look at this metric, it is way down its list of important factors affecting page/site rankings. Matt Cutts himself even stated that these changes would affect less than 1% of all queries:
You’ll notice that the current implementation mentions that fewer than 1% of search queries will change as a result of incorporating site speed into our ranking. That means that even fewer search results are affected, since the average search query is returning 10 or so search results on each page. So please don’t worry that the effect of this change will be huge. In fact, I believe the official blog post mentioned that “We launched this change a few weeks back after rigorous testing.” The fact that not too many people noticed the change is another reason not to stress out disproportionately over this change.
Site owners seem happy to panic about site speed and security before addressing more fundamental (and infinitely more important) aspects such as page mark-up, site structure and hierarchy, and on-site copy.
It goes without saying that a fast site will improve usability and help conversions, and this is as good a reason as any, or better, for addressing it than a Google announcement.
Recently Google announced that they will treat sites served over HTTPS better than sites that aren’t using a secure certificate.
As with site speed metrics, this is way down on Google’s list of priorities in the ranking algorithm, and we do not consider it a particular bandwagon to jump on with the aim of improving search results. Google themselves admit that:
“For now it’s only a very lightweight signal—affecting fewer than 1% of global queries, and carrying less weight than other signals such as high-quality content—while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.”
Google certainly has a history of implementing changes which website owners feel obliged to comply with, only to backtrack on them later.
So our advice is to implement HTTPS on your site only if you feel it is required for reasons other than just search (e.g. customer confidence). We have outlined the possible pros and cons for you below:
Pros:
- Your site is protected by an additional SSL layer
- A minimal algorithmic gain in rankings, though implementation is unlikely to result in any visible ranking changes
- An increase in trust from site visitors, especially important on sites handling more sensitive data or transactions

Cons:
- The initial connection to your site may be slower, especially if you have to implement 301 redirects from non-HTTPS pages
- The cost of an annual SSL certificate (if you don’t already have one)
- In some cases you may lose referral data
- You’ll need to ensure all HTTP pages are correctly 301 redirected to the new HTTPS versions
- You’ll need to update your internal link structure if you’re using absolute URLs for internal links
- You may find that some third-party products don’t support the use of HTTPS
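If you do make the switch, the redirect mapping is simple to reason about: every old http:// URL should 301-redirect to the identical page over https://. A minimal Python sketch of that mapping (the function name is ours, for illustration only):

```python
from urllib.parse import urlsplit, urlunsplit

def https_equivalent(url: str) -> str:
    """Return the HTTPS version of an HTTP URL, preserving path and query.

    This is the mapping your 301 redirects need to implement: every old
    http:// address should point permanently at the same page over https://.
    """
    parts = urlsplit(url)
    if parts.scheme != "http":
        return url  # already https (or something else): leave untouched
    return urlunsplit(("https",) + tuple(parts)[1:])

print(https_equivalent("http://www.example.com/products?page=2"))
# -> https://www.example.com/products?page=2
```

The actual redirect would then be configured in your web server, returning a 301 status with this rewritten URL in the Location header.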
Summer 2014 will see one of the biggest domain modifications in the UK ever. The roll-out of .uk domains will allow companies who currently use ‘.co.uk’ or ‘org.uk’ domains to also register for ‘.uk’ e.g. datadial.co.uk will be able to register datadial.uk.
Nominet will offer the current holders of a .co.uk or .org.uk domain up to 5 years to register the .uk equivalent before it is released to the general public.
In the event that one company holds .co.uk and another holds .org.uk, the shorter domain will be offered to the holders of the .co.uk domain.
If a domain is not currently registered to .co.uk or .org.uk, the domain will be available on a first-come, first-served basis on launch day.
So in essence, if you have a .co.uk or a .org.uk, you will be able to simplify it to .uk next year.
Imagine a wheelbarrow in an open field that you drag along every day, filling it with this and that. Each thing you add to it has some significance and some use.
Now imagine you never empty the wheelbarrow. Each day, not only do the things you found the week before now lie at the bottom covered by the newest additions, but the device also becomes increasingly heavy to pull until eventually, it becomes almost impossible.
Now think of the wheelbarrow as your website, and think of its contents as the factors affecting its speed – Let’s explore these factors…
- Unnecessary whitespace in code (this only adds to file size and download time)
- Missing tags (causing internal errors and bugs in the site)
- Bulky HTML (such as using presentational markup where CSS would work better, e.g. styling text with font tags rather than a stylesheet)
- Background colour being the same as the text colour (making all text unreadable)
- Hyperlinks that fail (devaluing your site in terms of credibility, and possibly increasing bounce rates)
- Missing images
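On the whitespace point, even a crude clean-up shows how much dead weight blank lines and indentation add. A rough Python sketch (real minifiers and gzip compression do far more than this):

```python
import re

def minify_whitespace(html: str) -> str:
    """Collapse runs of whitespace in an HTML string -- a crude minifier.

    Real minifiers (and gzip compression) are far more thorough, but even
    this shows the dead weight that blank lines and indentation add.
    """
    html = re.sub(r">\s+<", "><", html)             # whitespace between tags
    return re.sub(r"[ \t]{2,}", " ", html).strip()  # runs of spaces/tabs

page = """
<html>
    <body>
        <p>Hello   world</p>
    </body>
</html>
"""
small = minify_whitespace(page)
print(len(page), "->", len(small))  # far fewer bytes to send
```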
An overload of HTTP requests:
Whenever your web browser fetches a file from a web server, for example when it loads a picture, it does this using HTTP, which stands for “HyperText Transfer Protocol”.
HTTP is the mechanism by which your computer requests a particular file. One example is a request for ‘home.html’ (the homepage of a particular website). The web server then sends a response to the computer that says something like “Here’s the file you asked for”, followed by the actual file itself.
Understandably, if your server is receiving a very high volume of requests for a range of different things, such as pictures, graphics, photographs, music players and video rendering, it can take its toll and end up really slowing your website down.
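To make that concrete, here is roughly what one of those requests looks like on the wire. This minimal Python sketch (the function name is our own, for illustration) builds the raw text of a single HTTP/1.1 GET request; remember the browser sends one of these for every image, script and stylesheet on the page:

```python
def build_get_request(host: str, path: str) -> str:
    """Build the raw text of a minimal HTTP/1.1 GET request.

    The browser sends one request like this per file it fetches,
    which is why request counts add up so quickly on heavy pages.
    """
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

print(build_get_request("www.example.com", "/home.html"))
```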
Too many cookies:
HTTP Cookies are used mainly for personalization and authentication purposes. A series of saved information is exchanged between the web server and the browser in order to remember things about how you are using the internet. For example if you are shopping online and exit the website returning at a later date, a cookie will enable the site to remember what you had in your shopping cart so you don’t have to spend time finding the same items again.
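As a rough illustration using Python’s standard library, here is how a server might set such a cookie and how the value comes back on a later visit. The cookie name and value are hypothetical; a real shop would typically store only a session ID and keep the basket itself in a database:

```python
from http.cookies import SimpleCookie

# Server side: issue a cookie so the shop can recognise the visitor later
# (a real shop stores the basket against this ID in a database).
cookie = SimpleCookie()
cookie["session_id"] = "abc123"
cookie["session_id"]["path"] = "/"
cookie["session_id"]["max-age"] = 60 * 60 * 24 * 30  # remember for 30 days
print(cookie["session_id"].OutputString())

# Browser side: the cookie comes back on every later request, so the
# site can restore the visitor's basket without them searching again.
returned = SimpleCookie()
returned.load("session_id=abc123")
print(returned["session_id"].value)
```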
Web hosting is the business of providing storage space and access for websites. Bad web hosting happens when that storage space is overloaded with too many websites; yours is added to the list and so runs slow. Other issues caused by a bad web host include:
- Search engines being unable to crawl your site, resulting in a fall in search rankings
- Your website being “down” (not working, returning errors to visitors)
- Not being able to contact your web host to fix the issue (poor support tends to go hand in hand with poor hosting)
Excess of external media:
Embedded YouTube videos, in fact anything embedded from another website, can potentially slow yours down. When you embed something from another site, you are relying on that site’s web server, that site’s speed, and that site’s ability to ensure the embedded item is working properly there, so that it works properly on your site. Often, even when it works just fine, it can add an extra few seconds to a page loading: a few seconds a potential customer may be unwilling to wait!
Spam is so much more than just a bunch of annoying emails. It slows down the Internet and it increases consumer fees.
The internet is a network where spamming affects everyone that uses it. Pushing spam around the internet relies on a process: it begins with global networks that pass the spam along to its destination, and ends with the message being received by the recipient.
Simultaneously, time, money and resources are used trying to catch spammers and prevent them from infiltrating mail servers. This results in higher costs to the consumer, because providers are forced to add more security to their servers and hire more staff to manage and prevent the problem.
Be sure to spam proof all web forms by adding “captchas” or similar.
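A CAPTCHA is one option; a lighter-weight complement is a “honeypot” field: a hidden input that humans never see (and so leave blank) but which spam bots tend to fill in. A minimal Python sketch, with a hypothetical field name:

```python
def looks_like_spam(form_data: dict) -> bool:
    """Honeypot check: the hidden 'website' field (hypothetical name)
    is invisible to humans but dutifully filled in by most spam bots.
    A CAPTCHA is stronger; this is a cheap first line of defence."""
    return bool(form_data.get("website", "").strip())

human = {"name": "Alice", "message": "Hello", "website": ""}
bot = {"name": "x", "message": "buy now", "website": "http://spam.example"}
print(looks_like_spam(human), looks_like_spam(bot))  # False True
```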
A ‘favicon’ is a small image (as shown above) that sits in the root of your server. It’s definitely needed, because even if you don’t care about favicons, the browser still requests one. If there isn’t one, the server responds with a 404 (not found) error, and every extra response, whether a 404 error or a 301 redirect, adds time to the processing of a site.
A missing favicon therefore interferes with the loading sequence: since the favicon is requested early in the load, one of the first things downloaded will be an error rather than useful content.
Too many advertisements:
Any time a site uses advertisements, you are adding to the processes the site goes through in order to function correctly. Programs like Google AdSense and Microsoft adCenter are external, and reputable; however, it is logical to practise the same rule as with external media: everything in moderation. Besides, sites with too many ads look un“site”ly!
If any of these apply to you, take active steps to protect your website against sloth! Speed be with you!
April 9th, 2010.
Typically my knowledge of domains, DNS settings and A-records is happily restricted to one phrase: “Sergio, can you setup a new site please?”
But recently my personal website had issues with the DNS service I was using. I needed to move it, including changing my entire hosting and email. It’s headache material, but it’s one of those things that website owners have to do sooner or later.
If you are also daunted by all of this jargon, this may help:
The small print: this is a highly simplified diagram of how all of this works, intended to only provide an introduction. There are many more connections in between the ones illustrated. This also only shows POP3 email accounts (the most common business setup), but there are other options such as webmail and IMAP.
Like many of Microsoft’s recent software products (Visual Studio .NET, the .NET Framework or C# 3.0), SQL Server 2008 has been enhanced over previous versions by fixing bugs and adding new functionality built on existing features. This is a welcome strategy that helps ease the transition for developers and database administrators.
In this blog post I will briefly touch on some of the database administrator functions and focus on the developer functionality, additions and upgrades.
SQL Server 2008 policy management has been updated and is now called the ‘Declarative Management Framework’. It is now possible to configure multiple database servers so that a standard configuration can be applied and maintained across multiple servers and databases.
Multiple Server Integration makes it possible to execute queries against multiple servers by placing them within special groups. Results can be returned as one result set per server or merged together as a single set of results.
Security (Transparent Data Encryption)
A cool new feature of SQL Server 2008 is the improved flexibility of data encryption. Data encryption is now a property of the database instead of the application code. This makes the database administrator’s and developer’s lives easier, as encryption no longer requires changes at the application level.
Database administrators can now specify how much CPU/RAM each user is allowed to use. This will help to eliminate situations where a user’s mistake could potentially bring down a whole server: by imposing these limitations, users are restricted to a predefined amount of CPU/RAM usage.
Developers have also been treated to some appealing updates in the latest version of SQL Server. A number of new features have been added to make a developer’s life easier and increase productivity.
LINQ to SQL
Most developers are pretty familiar with writing T-SQL queries to retrieve data for application objects. Most are also aware of the distinct syntactical differences between VB/C# and T-SQL. SQL Server 2008 supports the LINQ to SQL provider, which makes it possible to write LINQ queries in C# or VB that are translated into T-SQL for you. This lets developers use one common, object-centric language across both the application domain and the database domain. Developers are able to use the same LINQ syntax against database tables, application collections, XML and DataSets.
DATETIME Data Type
DATETIME has now been split into separate DATE and TIME data types so that each can be used independently. These new data types also help performance by eliminating the need to perform extra operations just to extract the date or time portion of a combined DATETIME value.
GEOGRAPHY and GEOMETRY
These two data types have been added to better represent location-specific data. They eliminate the need to break geography and geometry data down into formats that fit other standard data types.
The SQL language itself has had some small additions, in the shape of inline variable declarations and C-like maths syntax:

-- SQL Server 2005 and earlier: declare first, then assign
DECLARE @myVar int
SET @myVar = 5

-- SQL Server 2008: declare and initialise in one statement
DECLARE @myVar int = 5

-- compound assignment operators are also now supported
SET @i += 5
Icing on the cake
SQL Server 2008 has been blessed with IntelliSense. All I can say is “about time too”. Gone are the days when you would have to do a Google search to remind yourself of certain T-SQL keywords, statements or syntax. Now SQL Server will give you a full list of the statements and keywords available for a particular variable’s data type, as well as the column names for a particular table.
SQL Server 2008 offers some interesting new features that will keep developers and administrators happy. These new features integrate well with Visual Studio and the .NET Framework 3.5, making SQL Server a nice finishing touch to a well-rounded application development environment.
With Sainsbury’s website going down this week and Amazon’s the week before, it is worth taking a moment or two to consider what would happen to your business if your website was out of action.
For most e-commerce clients this could have catastrophic consequences: not only would sales evaporate, but you would also lose access to all your sales and customer data.
However, hosting, and the quality thereof, is often totally ignored in client briefs. It is definitely not a priority and is only ever occasionally paid lip service. This may have been OK a few years ago, when websites were just an experiment and an addition to an existing business rather than the core of a business.
It is difficult, though, to convince people to invest in proper corporate hosting, as there is a perception that hosting should be practically free. It’s true there are some companies offering hosting for £10 per year or less. Honestly, what do you think these companies would provide in terms of backup or a reliable service level agreement? Not much, I think.
So what should you consider when hosting your website? There are many things to consider; here are just three.
First, where is your website being hosted, and who manages and owns the servers? Most websites in the UK are hosted in Telehouse in Docklands, where there are massive, generally well managed data centres. But have you ever seen inside a data centre? Have you ever asked about their air conditioning systems, their own backup power supply, their connectivity to the web? Then what about the servers: who owns them? Who is responsible for updating them with critical patches? When is this done? What happens if they crash? What is the rebuild time? How many other sites are there on the same server as yours? How secure is the access to the servers? If you are not asking these questions, then you are not taking your website presence seriously.
Second, there are backups. How often is your site backed up, and where are the backups kept? If your service provider’s data centre is blown up (a very realistic proposition, especially if you house your website in Telehouse in Docklands), will the backups go up with it? If they are kept offsite, how often are they taken offsite?
Third, what about redundancy? If your server crashes, is there a mirror server which will automatically take over? What if your website is overloaded with visitors: can your server handle the traffic? Is there a load balancing mechanism that will automatically divert users to an alternative server?
All these issues need to be addressed when considering hosting, and website owners need to change their mindset from seeing hosting as an essentially free service to one that is valued and invested in appropriately, according to business requirements and risk assessment.
Microsoft have finally revealed Surface Computing, a technology where users interact with the desktop completely by touch.
One of the most common questions I get asked by clients is: why do I get spam or email viruses that appear to originate from inside our organisation? Spammers and viruses are becoming ever more resourceful in trying to lure us into opening their emails. One of the simplest ways of getting you to open an email is spoofing the address of a user you trust. There are several ways they can get hold of users’ email addresses; the question is how you stop spammers and viruses from faking them.
Today’s anti-spam systems are composed of several layers for detecting spam. One of the methods for detecting fake or spoofed emails is inbound authentication and identity verification, technically known as the Sender ID Framework (SIDF).
How Sender ID Works
- The sender sends an e-mail message.
- The recipient’s inbound e-mail server receives the message.
- The inbound e-mail server checks which domain claims to have sent the message and checks the DNS for the SPF record of that domain. The inbound server then determines if the IP address of the sending e-mail server matches the IP addresses that are published in the SPF record. E-mail messages that fail may be deleted, blocked, or sent to the Junk e-mail folder.
- As a recommended option, the Sender ID result can be combined with reputation data about the IP/domain holder. This reputation data enhances delivery decisions for all e-mail, including messages sent from both legitimate senders and spammers which may pass the Sender ID check.
- When combined with the receiving network’s anti-spam and anti-phishing technologies, the e-mail may be delivered to the Inbox, the Junk or Quarantine folders, or may be blocked and deleted.
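The core of the third step, checking the sending server’s IP against the domain’s published SPF record, can be sketched in Python as follows. This is a deliberate simplification that handles only ip4 and all mechanisms; real SPF evaluation (RFC 7208) also covers a, mx, include and redirect:

```python
import ipaddress

def check_spf_ip4(spf_record: str, sender_ip: str) -> str:
    """Check a sender IP against the ip4 mechanisms of a simplified SPF record.

    Returns "pass" if the IP matches an ip4 mechanism, otherwise the result
    implied by the record's 'all' qualifier ("fail" for -all, "softfail"
    for ~all, "neutral" otherwise). Real SPF evaluation (RFC 7208) also
    handles a, mx, include and redirect; this sketch covers ip4/all only.
    """
    ip = ipaddress.ip_address(sender_ip)
    qualifier_results = {"-": "fail", "~": "softfail", "?": "neutral", "+": "pass"}
    all_result = "neutral"
    for term in spf_record.split():
        if term.startswith("ip4:"):
            if ip in ipaddress.ip_network(term[4:], strict=False):
                return "pass"  # sending server is in a published range
        elif term.lstrip("+-~?").lower() == "all":
            qual = term[0] if term[0] in "+-~?" else "+"
            all_result = qualifier_results[qual]
    return all_result

record = "v=spf1 ip4:192.0.2.0/24 -all"
print(check_spf_ip4(record, "192.0.2.55"))   # a legitimate server in the range
print(check_spf_ip4(record, "203.0.113.9"))  # a spoofing host outside it
```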
The question is: why are fake emails still getting through?
Many small businesses do not know about, or still have not implemented, this extra layer of security. Until a majority of businesses implement SPF on their domains, we will continue to receive fake emails. We could opt to block all emails from domains that have not implemented it, but this solution is risky, as businesses could lose important emails from potential clients.