Web Technology « Datadial Blog


Alex

July 19th, 2008.

ASP.Net Vs PHP

Firstly, let me say that I do not claim to be an expert in either PHP or ASP.Net. Although I use ASP.Net daily, I am fairly new to it. I have decided to do a little investigation into the age-old debate about the difference between PHP and ASP.Net. This is only a short post, but hopefully it will give readers a better understanding of how the two technologies differ. I will try to keep it as unbiased as possible.

The reason I have chosen to investigate the difference between the two is that non-programmers constantly ask questions like: “What’s the difference between PHP and ASP.Net?”, “Why can’t PHP applications talk to ASP.Net applications?” or “That could have been done in PHP so much quicker, for free”. There are many deciding factors when choosing a web application framework. I will attempt to outline the differences using a For and Against bullet-point format for both.

For PHP

  • Free
  • Open source
  • Easier to learn due to its basic scripting-language structure and built-in functionality
  • PHP5 now offers many object-oriented development concepts
  • Has many free IDEs available that are very impressive and well supported (e.g. Eclipse)
  • Runs on the Apache web server, which is open source
  • Runs on IIS 6.0 and IIS 7.0 thanks to Microsoft’s implementation of the FastCGI open standard
  • Has multi-platform support
  • Marginally faster due to the overheads imposed by .Net’s Common Language Runtime, which compiles the intermediate language produced from .Net’s many source languages
  • Has a huge support base as it is open source

 

Against PHP

  • Although it claims to be free, developers using free 3rd-party add-ons often run into issues when building commercially available applications (i.e. ownership of code / intellectual-property rights)
  • Most PHP IDEs require lots of add-ons in order to match the functionality of Visual Studio
  • No built-in support for AJAX; requires add-ons

 

For ASP.Net

  • Can be developed using the stunning Visual Studio .Net IDE, which offers a vast array of features that make coding easier and development more productive
  • Developers can download a free scaled-down version of Visual Studio that offers an impressive array of features; this is aimed at students and hobbyists
  • Runs on IIS (Internet Information Services)
  • Applications can be written using many programming languages (e.g. VB.Net, C#, J#, C++, COBOL)
  • The .Net framework (the engine that ASP.Net runs on) has more sophisticated error-handling capabilities than PHP
  • Allows better separation of design and application logic through code-behind pages and user controls
  • Has built-in support for AJAX as of .Net Framework 3.5

 

Against ASP.Net

  • Requires Microsoft licences
  • Requires a basic knowledge of object-oriented concepts, which can sometimes deter newbie developers
  • Single platform: will only run on Microsoft web servers

These bullet points emphasise some of the main differences, advantages and disadvantages of the two technologies.

Quotes from other developers for ASP.Net and PHP

For ASP.Net
ASP.Net is Strongly Typed, Object Oriented, Sandboxed, Multi-Syntax, Component Centric, Event Driven, forms oriented, pre-compiled experience.

PHP is a loosely typed, objects optional, fixed syntax, component-less, runtime interpreted, structured programming model.

Joe Stagner

For PHP
In the end, PHP is less expensive, faster, more secure, and able to be deployed from a Linux server that is also less expensive, faster, and more secure than their Windows based counterparts.

Anon

Conclusion
There are many factors that may sway your decision about which web application framework to choose. This decision should be based on the factors above, the kind of career path you want to follow and detailed research. In reality, though, the decision usually comes down to which framework you are exposed to first, as many developers get comfortable with one language’s syntax and features.

My advice would be to use both if you can, as each has its own merits and has earned its place in today’s web application development industry.

Alex

July 19th, 2008.

The Future of UK Broadband

As the web progresses and continues to deliver more elaborate and rich media content, it is inevitable that the speeds at which this information is delivered must increase, or at least stay relative to what is considered acceptable. This means that the broadband speeds offered by internet service providers (ISPs) need to keep pace with the progression of the Web. If we are to realise the true potential of the World Wide Web, first we need to build an infrastructure that can support it.

The main issue in achieving faster broadband is that the UK currently relies on an outdated telephone system to deliver broadband internet. There is a danger that the future growth and use of next-generation web applications will be stunted by our outdated method of transmitting data. Sites like YouTube, MySpace, BBC iPlayer, 4 on Demand etc. would never have been possible using a dial-up 56k modem. This is a clear indication of how better connection speeds can improve the level of web applications that can be delivered. Current broadband speeds are sufficient for today’s use, but we must look to the future if we are to realise the potential of the Web.

The majority of UK ISPs are still using copper wire to deliver their services, as opposed to faster fibre connections. It is estimated that fibre connections can increase broadband speeds by as much as 20 times.

BT has plans to install super-fast fibre connections via its Openreach project, which will hopefully replace the old copper phone network in the future. The cost will be huge, but the aim is to finance this by renting lines to rival ISPs such as TalkTalk, Tiscali, Carphone Warehouse and Sky on a wholesale basis. This will allow next-generation broadband packages and services to be delivered to consumers at a competitive price.

Leaders
At present the UK is miles behind countries like Japan and South Korea, which have some of the fastest broadband speeds in the world, allowing them to watch broadcast-quality television over the internet.

It is estimated that 90% of South Korea’s population are using broadband, with an average connection speed of 43Mbps. In Japan the average advertised connection speed is an incredible 90Mbps, made possible by fibre-optic networks.

Current state of play in the UK

UK broadband prices are certainly dropping, and speeds have definitely improved since the days when 512Kbps was something to be proud of! Today’s norms are as much as 16x faster than they were a few years ago. The downside, though, is that many of us in the UK don’t actually receive the broadband that we are sold. It is very much a postcode lottery, as people who live in more rural areas often receive a poorer service. This is due to the direct correlation between distance from the exchange and the broadband speed achieved.

If we are to keep up with Japan, South Korea and EU countries like France and Germany, our main focus should be on upgrading the way we transmit data. The fact that we are trying to squeeze every ounce of speed out of a network that was designed to transmit voice calls is a stark reflection of where we are and where we need to be in the future. Our current telephone network lacks the capacity to deliver the kind of high-speed broadband we require to realise the potential of UK Internet services.

Conclusion
We risk being left behind if we do not take the necessary steps to upgrade our data transmission infrastructure. This upgrade will allow businesses to develop new web-related technologies to serve us in the future.

The future is bright for the Internet, as new web applications are developed every day that would never have been possible 5 years ago. The Internet will continue to evolve in the years to come, but its growth must not be stunted by something as simple as poor data transmission speeds.

Alex

July 16th, 2008.

Internet Explorer 8

Internet Explorer 8 (IE8) is currently in its beta testing phase and will be the next version of Microsoft’s Internet Explorer web browser.

IE browsers have been renowned for being bug-ridden due to their failure to follow web standards. As well as introducing some new additions to the browser, IE8 also tackles past compatibility issues by attempting to make IE8 standards compliant. The downside is that this may break existing web pages and applications designed to run on previous versions (IE6 & IE7).

To combat this issue, Microsoft has designed the new browser with a facility that allows IE8 to be switched between three different modes: Quirks, Strict and Standards. These modes are activated either by the inclusion of specific tags (e.g. <meta http-equiv="X-UA-Compatible" content="IE=7" />) within a web page or via user settings within the browser itself (the latter requiring a restart). Standards mode will be the default, making IE8 use a more standardised DOM like Firefox and Opera.

The ability to switch modes is very important, as IE8 must stay compatible with older web pages, especially offline versions such as those found on instructional installation DVDs and CDs. Pages like these cannot be updated to accommodate the new changes, so this facility is essential.

The addition of the browser version switching facility has been met with some controversy, as some have argued that it hinders the progression of web standards. By giving people a choice, developers may continue to target older browser versions instead of finally adopting a universal standard. Some have also stated that this is an example of “monolithic behaviour due to Microsoft’s dominating position in the web browser and operating system market.” – Håkon Wium Lie, Chief Technology Officer of Opera Software.

Web Slices
IE8 offers a brand new and interesting feature called Web Slices, which allows users to bookmark a specific section of a page (e.g. the London weather section of the BBC web site). Users can then view that specific snippet of information in isolation as a widget or popup. In the future, web sites will be able to predefine specific content as available Web Slices so that users can simply add them to the browser toolbar and access them on demand. Each time a Web Slice’s content is updated, the user is given an unobtrusive indicator to let them know.

Activities
Activities allow developers to attach specific functionality to information on a page. For example, with additional browser add-ons users will be able to hover over an address field and IE will open a popup layer that links directly to Google Maps; or, by hovering over a keyword for an item of clothing, IE may open up an eBay popup with a list of search results. Current IE8 beta add-ons include Translate, Send, Map, Find, Define and Blog.

Developer Tools
Fans of Firefox’s Firebug will be happy to hear that IE8 will be equipped with a similar development tool that allows them to inspect a page’s HTML, CSS and JavaScript in a visual debugging environment.

Conclusion
We must remember, though, that IE8 beta 1 is aimed at developers, as it still contains many bugs. It has a long way to go before a general-user beta version is available, but it is heading in the right direction. The slight downside is that, as excited as many developers are about the new additions and updates, a lot of these updates are simply bug fixes for issues that weren’t addressed in IE6 and 7. Some of these issues go as deep as the core layout engine, Trident, which was developed 10 years ago. IE8 will use Trident version 6 which, believe it or not, is the first version to pass the Acid2 test (except for the white stripes). The decision to make Standards mode (i.e. standards compliance) the default is also welcome, even if some pages viewed in IE8 will initially break.

Microsoft has a huge task in improving its support for web standards without breaking existing web sites, and we all know that standards compliance and backward compatibility do not go hand in hand with Internet Explorer.

Alex

July 16th, 2008.

Microsoft Silverlight

Microsoft Silverlight is a cross-browser implementation of the .Net Framework that delivers interactive applications via the web. It does so by unifying the capabilities of the web server, the web browser and the desktop.

Silverlight improves the potential for developers and web designers to create rich applications that aren’t limited by the constraints of modern web browsers.

Silverlight runs on all major browsers including Internet Explorer, Firefox and Safari and also has the ability to adapt its video quality depending on what device it runs on e.g. desktop browser, mobile device, or 720p HDTV video mode.

Silverlight applications can be created by a graphic designer or a web developer using either:

  • Microsoft Expression Blend – for layout and graphic design
  • Visual Studio .Net – for coding

 

There are currently two versions of Silverlight, 1.0 and 2.0 beta. The most noticeable difference between the two versions is Silverlight 2.0’s support for the .Net Framework.

Silverlight includes Windows Presentation Foundation (WPF), which is new to .Net 3.0 and is designed to enable rich client features by extending browser-based user interfaces beyond what is possible with HTML alone. It also provides a declarative mark-up language known as XAML (Extensible Application Mark-up Language; pronounced “zammel”), as well as adding extensions to JavaScript so that client UI elements can be manipulated programmatically using event handlers.

Silverlight 2.0 is designed to integrate seamlessly with existing JavaScript and ASP .NET AJAX code, and goes one step further by making it possible to create applications using VB .NET and C# thanks to its ability to access the .NET Framework’s programming model.

To run Silverlight applications all you need is a modern browser and the Silverlight plug-in, which can be downloaded and installed in minutes.

Silverlight’s XAML syntax is very similar to HTML, as it allows you to create rich web-based UIs in an HTML-like syntax. Using Microsoft Expression Blend, designers can create engaging graphics, animation and media. Expression Blend can generate XAML so that (via Visual Studio .Net) programmers and designers can collaborate and work on the same files.

XAML
XAML is to Silverlight what HTML is to web pages. It is text based and can be incorporated directly into a web page via the Silverlight runtime. It is used to define objects and their properties and focuses on defining UIs. XAML is firewall-friendly, unlike technologies such as Java applets, ActiveX or Flash, which all send binary content to the browser and can pose security risks. Its text-based nature also makes XAML easier to update: its rivals have to be recompiled and redeployed after every change. Each time a Silverlight application is updated, a new XAML file is generated that will be automatically downloaded the next time a client request is made. This eliminates the need for re-installation or redeployment and prevents the user experience from being disrupted.

Silverlight has a long way to go before it can compete with Flash’s popularity, especially as it is a Microsoft-only product. It has a huge amount of potential, as it is designed to work with the .Net framework, which is a robust and proven foundation. Only time will tell how popular it will become and whether users and developers will jump on the Silverlight express!

Alex

July 16th, 2008.

Web 2.0

What is Web 2.0? This is a question that many people (even computer professionals) struggle to answer. Some consider it to be a slogan. Others simply see it as flashy AJAX-enabled web sites with curved corners, modal pop-ups and drop-shadows.

Web 2.0 can be considered as applications and services that are built around the internet instead of expecting the internet to suit or adapt to the application.

The version number (2.0) suggests an improved World Wide Web (i.e. blogs, podcasts, RSS feeds etc.) that provides a more interactive experience than standard read-only websites. The main goal is to bridge the gap between users and providers. In many cases with Web 2.0, users become the providers, as they are given the ability to upload content as well as download it. These sites become more popular and informative the more users add content, which is a stark contrast to old-school Web 1.0 sites that limited users to viewing only.

Richer Web Applications
Web applications that incorporate technologies such as Flash, AJAX, Java, Silverlight and Curl have enhanced the user experience by creating improved browser-based applications. These technologies make it possible to update specific sections of user content without the need to refresh the whole page. They also tend to make more use of the client computer and browser, reducing the need for page postbacks and decreasing server workload. This increases the responsiveness of Web 2.0 applications and improves the user experience, making it possible to create a richer, more responsive UI that better mimics modern desktop applications.
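As a rough sketch of how such a partial update works (the element id and URL below are hypothetical, not from any particular site), an AJAX request fetches a fragment of HTML and swaps it into one section of the page:

```javascript
// Minimal sketch of an AJAX partial page update. The element id
// ('news') and endpoint ('/latest-news.html') are hypothetical.

// Pure helper: replace the content of one element only, leaving the
// rest of the page untouched.
function applyUpdate(el, html) {
  el.innerHTML = html;
  return el;
}

// Fetch new content asynchronously and update just the target section,
// with no full-page refresh or server-side postback.
function refreshSection(id, url) {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      applyUpdate(document.getElementById(id), xhr.responseText);
    }
  };
  xhr.open('GET', url, true); // true = asynchronous: the UI stays responsive
  xhr.send(null);
}
```

In a page this would be wired to a timer or a user action, e.g. refreshSection('news', '/latest-news.html'), so only that one section re-renders.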

We must remember that many of the new concepts that have been made popular by Web 2.0 have not replaced old protocols. They have simply added a layer of abstraction to them.

Web 2.0 should be thought of as bridging the gap between users and web content. It is about understanding how and why people use the web and providing the right services to better serve their needs. The needs of the user must outweigh the visions of programmers, marketing directors or information architects. Web 2.0 is about doing things on the web that cannot be achieved in any other medium, not reinventing the wheel, shoehorning old concepts into a web application and calling it Web 2.0.

Alex

July 16th, 2008.

Unobtrusive JavaScript

Many developers do not consider how their web site might function if a user has JavaScript disabled in their web browser. The Unobtrusive JavaScript (UOJS) methodology encourages developers to build web pages that do not rely on JavaScript to deliver core content.

The most common way to implement event-based JavaScript is to embed event handlers directly into HTML tags (i.e. onclick, onmouseover, onload etc.), or to generate dynamic mark-up using document.write. Unfortunately these techniques aren’t always implemented appropriately and sometimes go against the UOJS methodology.

It is important to remember that a web page should still be functional without any scripting, and caution should be taken to avoid the overuse of functions and dynamic content generation. The key is to separate web content into appropriate layers (i.e. structure – HTML, presentation – CSS and behaviour – JavaScript) so that each layer complements the layer that precedes it.

Wherever possible, each layer should be separated into its own file and hooked into the page via IDs and class attributes. Dynamic page content should be inserted after a page has fully loaded, so that if JavaScript is disabled users aren’t left with a partially functioning page. UOJS stresses that the behavioural layer (JavaScript) should act as an enhancement to a page rather than a dependency.
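A minimal sketch of this hooking-in idea (the element id and handler below are hypothetical): the markup carries only an id, the link still works via its href when JavaScript is off, and the behaviour is layered on from a separate script file rather than an inline onclick attribute:

```javascript
// Sketch of unobtrusive event binding. The HTML contains no onclick
// attribute, only an id hook, e.g.:
//   <a id="more-link" href="more.html">More</a>

// Attach a handler if the element exists; without JavaScript the
// link still navigates normally via its href.
function enhanceLink(link, handler) {
  if (!link) return false;   // element absent: the plain page still works
  link.onclick = handler;    // behaviour layered on top of the structure
  return true;
}

// Hook in after the page has loaded (browser only).
if (typeof window !== 'undefined') {
  window.onload = function () {
    enhanceLink(document.getElementById('more-link'), function () {
      // enhanced behaviour goes here, e.g. loading the content in-page
      return false; // suppress the default navigation only when JS ran
    });
  };
}
```

Because the script only enhances elements it can actually find, the page degrades gracefully when scripting is unavailable.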

Loading JavaScript Files
There are two ways to load JavaScript into an HTML document: one is to add it within the head tags, and the other is to add the script just before the closing body tag. The first method can slow down the page-loading process, and functions bound to the page’s load event fire only after the browser has rendered the page content anyway. This means the extra time spent downloading the external JavaScript files up front is wasted, as the functions are called after the page content has loaded. If a web page relies on a particular JavaScript function to dynamically render or position content, this can cause page elements to display incorrectly or jump when the desired function is eventually fired.

A better solution is to place JavaScript at the bottom of the web page, so that by the time a function is fired the DOM is fully loaded and ready to be manipulated, since the script loads after the HTML. This method decreases the time it takes to load the page and forces the developer to build a page that doesn’t initially rely on JavaScript.

JavaScript Disabled Browsers
In circumstances where we have to generate dynamic HTML, a useful method is to add HTML placeholders. These placeholders take the place of the pre-rendered dynamic content (using CSS to set ‘visibility:hidden’ in order to preserve the element’s dimensions); once the page has fully loaded, we can generate the appropriate dynamic content and unhide the placeholders via the JavaScript that loads at the end of the page.

You may be thinking that this contradicts the purpose of writing UOJS, as users with JavaScript disabled will be left with parts of the page content missing. The point of this technique is that, by leaving the rendering of dynamic content to the end of the page, users with JavaScript turned off will still be able to view the page without any errors. Missing content can always be replaced by an informative message indicating why the content is absent. This technique is highly recommended for web sites that are heavily reliant on JavaScript.
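The placeholder technique described above might be sketched like this (the element id and injected content are hypothetical):

```javascript
// Sketch of the hidden-placeholder technique. The page ships a
// placeholder sized like the final content, hidden with CSS, e.g.:
//   <div id="news-slot" style="visibility:hidden"></div>
// After the page has loaded, the script fills it and unhides it.

function revealPlaceholder(el, html) {
  if (!el) return null;               // JS disabled or element absent
  el.innerHTML = html;                // inject the dynamic content
  el.style.visibility = 'visible';    // unhide; dimensions were preserved
  return el;
}

// Run from the script loaded at the end of the page (browser only).
if (typeof window !== 'undefined') {
  window.onload = function () {
    revealPlaceholder(document.getElementById('news-slot'),
                      '<p>Dynamically generated content</p>');
  };
}
```

Because the placeholder already reserves its final dimensions, the surrounding layout does not jump when the content arrives.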

Summary
Although this blog post only briefly touches on the subject of Unobtrusive JavaScript, hopefully it has whetted your appetite to investigate the subject further. There are many guidelines and codes of conduct for applying UOJS, so readers are strongly advised to explore this topic in more detail (especially articles written by Jeremy Keith). It is not always possible to implement all the concepts of UOJS, as in some cases it may even break an application. But a good working knowledge of this methodology, especially at the early stages of development, may improve the scalability, portability and efficiency of your future web applications.

Alex

July 14th, 2008.

Enrich your web content with an E-Mag

Have you ever seen one of those clever online magazines where you can “turn” or “flick” the pages with your mouse?

Here’s an Example

If you ever fancied getting your web content displayed in one of these Flash viewers (you don’t even need a magazine; a product catalogue will do, for example), ZMAGS provide an affordable way of creating these “online magazines” via the upload of various PDF files that represent the pages of the e-mag.

The service costs £19 for 1 magazine, £79 for 10 and £149 for 25.  More info here.

Many companies have used this service, including our client Pocket London, as shown in the example above. Other clients include TNT Magazine, IKEA and Volkswagen.

For Pocket London, we built a system around the Zmag IFrames that enables the client to “brand” each e-mag separately to their specifications. The client can change the header, footer or an intro page around the IFrame through a bespoke content management system, providing limitless possibilities for re-branding their online magazine for any number of clients.

Here are a few examples of the finished product:

Example 1

Example 2

Example 3

Alex

July 10th, 2008.

Global Web Stats – Part 2

Last year, I looked at the current trends in browser and operating system market share, as well as the screen resolutions people are using and the most popular internet user locations by country.

I have been keeping constant track of changes in this data over the past year or so and have been able to study trends and changes in different usages.

My source for the data is W3Counter’s Global Web Stats, which compiles these usage statistics every month by studying the last 25,000,000 visits to approximately 12,000 websites.  There are other sources for browser activity, for example W3Schools’ Browser Statistics (not accurate, because its statistics are based on visitors to a single website).  By sticking to one source (even if it is not 100% accurate), it is easier to accurately discover just how quickly things change over any given time period.

Looking at browser usage statistics, Firefox’s share of the browser market has significantly increased.  According to the latest statistics, the total market share for the Mozilla/Firefox browser family is 29.62%, compared to IE’s 61.43%.  Over the past year, Firefox has narrowed the gap by almost 10%.  These statistics take an average across many different countries, although recent reports suggest that in some countries Firefox’s share is significantly higher – for instance in Poland, where it is 45%.  In the UK, however, it is thought to be less than the global average at only 20%.

Internet Explorer 6 has shown a steady decrease in usage while Internet Explorer 7 increases, although version 6 still remains the most popular.  With Internet Explorer 8 on the horizon, it will be interesting to see how many people ditch IE6 immediately for version 8 and how many make the switch from 7 to 8.  IE8 has been billed as a far more standards-compliant browser (an area in which previous versions have been severely lacking), which could potentially see Internet Explorer recover some ground against Firefox’s increasing market share.

The new Firefox 3 has just been released to rave reviews all round and is already increasing in popularity.  Although the stats for June show only a 1% market share, this is sure to jump right up over the coming few months as more people upgrade to version 3.

Safari still “enjoys” a rather low share of the market (just 2%), despite now being available on PCs, and Opera is showing a small but steady increase in popularity, though still only a 1% market share.  The AOL browser seems to be dropping off the radar completely as more users are encouraged to switch to far more sophisticated browsers such as Firefox.

The following two graphs illustrate these shifts over the past year:

Recent operating system activity shows only slight changes in the market share between Windows and Mac operating systems.  Apple have increased their share to roughly 5%, while Microsoft Windows versions account for 91%.  The most interesting statistic is the slow uptake of Windows Vista: now, a year and a half after its original release, its market share is just 8%, while computers running XP still account for 78% of the market.  With the possible release of a new Windows operating system less than 5 years away, it has been suggested that many users will stick with the largely stable XP until then, after the many grumblings and problems that Vista has had.

The following is a basic illustration of the change in operating system market share at the lower end of the market (not including XP) over the past year:

An analysis of the usage in various countries gives little indication of any change since my last report.  The developed nations of the USA and UK are seeing a decreasing share (although I suspect still an increase in overall usage) and making way for faster-developing countries such as Germany, China, the Netherlands and Turkey.

The statistics on screen resolution confirm we are in the middle of what I like to call a “Widescreen Revolution”.  Screen resolutions such as 1280×800, 1440×900 and 1680×1050 have all shown a significant increase over the past year, while the “old reliable” 1024×768, 1280×1024 and 800×600 resolutions are on a steady decline.  In our office we waved a rather unemotional farewell to the last of our old CRT monitors, which made a beeline for the scrap heap.  I’m sure there are thousands of other offices across the world also seeing the last of those big, chunky, heavy monitors.  In most cases they are being replaced by not just one widescreen flat monitor but, in many cases, two or even three.  Yes, we are also seeing the “Multiple Monitor Revolution”!

The following graph illustrates this behaviour:

There will be another update on Global Web Stats in 2009.

Kolen

June 25th, 2008.

Design tools

Recently there has been a lot of talk about what software designers should use, or whether they should just skip the entire process and go straight to HTML & CSS.  The debate continues, but I want to echo some of the responses made and emphasise that design software in any form is just a tool, and which tool you use hugely depends on what you are trying to achieve.

We have a huge variety of clients and design needs here at Datadial.  Over the last month I have used Photoshop, Visio and good ol’ fashioned pen and paper, and in one case I did go straight to HTML/CSS.  What method do I like to use?  It all depends on the client’s needs!  The project I went straight to HTML with had a very minimalistic layout that is hugely dependent on typography.  The ones I use pen & paper on require a level of creativity and speed that you cannot get from going straight to a computer.  Photoshop is very flexible and creative, likewise Illustrator.

I don’t think the tool really matters either.  As long as the end result is what the client is after and works for their audience, then whether or not you use Photoshop doesn’t come into the equation.  The tool is not going to make a successful website – that comes from understanding who is using the site and designing for them.

Kolen

June 23rd, 2008.

Valuable resources for web designers

My bookmarks toolbar in Firefox is full of great websites at my fingertips to keep up to date with what is happening in the web design industry.  I have recently moved to London from New Zealand and I accidentally left my great list of resources behind – whoops! I really should get into online bookmarking!

So I’ve had to rebuild my bookmark collection and thought it would be nice to share some of them with you here (in no particular order…)

  • Web Designer Wall – perhaps my favourite website of all, the mixture of news, tutorials and resources is just spot-on!  And the site looks so pretty :-)
    Spoon Graphics and Design Reviver are in a similar vein with that magic combination of content.
  • Best Web Gallery is a great spot for research and inspiration.  I really like how this gallery doesn’t discriminate by the technology used on the featured sites – there are Flash sites mixed in with web-standards sites – but they are all chosen because of their outstanding design.
  • A List Apart – the authority and cutting edge on web standards developments.  Published techniques like Faux Absolute Positioning are just invaluable, and the illustrations are just so gorgeous…
  • Which brings me to illustration websites.  Unfortunately most of the great illustration sites I used to have bookmarked were ones I’d stumbled upon by accident, but Kevin Cornell’s site is a superb website that inspires me before I even read a word!
  • Signal vs. Noise is a must-read for anyone in the web world.  Likewise for other blogs by industry-leading guys such as Douglas Bowman, Jeff Croft, Mark Boulton and John Hicks.  I still miss Andy Clarke’s old blog, but keep an eye out for the occasional post at Stuff & Nonsense :-)
  • Design Float is a great website that collates other blog posts about web design.
  • Other design resources like ColourLovers (perfect for fleshing out colour schemes), Photoshop brushes, free fonts and more tutorials.
  • Viget Inspire is a great blog by the web designers at Viget, addressing many issues regarding web design.
  • And last but by no means least is the Web Standards Group email list.  Over the last year or so I’ve found that the list itself can be rather tedious at times, but then once a week Russ Weakley puts out the Links for Light Reading which is a summary of what’s going on around the blogosphere and in the web design/standards world.

Alex

August 29th, 2007.

Global Web Stats

Just how popular is Vista today? How many people have switched from IE6 to IE7? Just what share of the browser market does Firefox have?

For answers to these questions and more it is worth regularly checking W3 Counter’s Global Stats which can be found here:

http://www.w3counter.com/globalstats.php

These stats are generated by tracking the last 32 million unique visits to 5,500 websites that represent a broad cross-section of internet traffic. From each unique visit, the web browser, operating system, country of origin and screen resolution can all be determined, producing a reasonably good picture of what general web traffic looks like today.

Looking at the latest stats for 20/08/07, we can see that the average Joe web surfer is a person in the USA running Windows XP and Internet Explorer 6 with a screen resolution of 1024×768 – this combination accounts for 6% of web users. I myself am based in the UK, running Windows XP, Internet Explorer 7 (as my primary browser) and a screen resolution of 1280×1024. Only one in a thousand computers runs the same combination as I do.
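Figures like that 6% come from simple counting: group each unique visit by its combination of attributes and divide by the total number of visits. A minimal sketch in Python, using a tiny set of made-up visit records (not W3 Counter’s actual data or method):

```python
from collections import Counter

# Hypothetical visit records: (country, OS, browser, resolution)
visits = [
    ("US", "Windows XP", "IE6", "1024x768"),
    ("US", "Windows XP", "IE6", "1024x768"),
    ("UK", "Windows XP", "IE7", "1280x1024"),
    ("DE", "Windows XP", "Firefox 2.0", "1024x768"),
]

def shares(records):
    """Return each attribute combination's share of all visits, in percent."""
    counts = Counter(records)
    total = len(records)
    return {combo: 100 * n / total for combo, n in counts.items()}

print(shares(visits))
```

The same counting works for any single attribute (just browser, say) by keying the `Counter` on that field alone.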

But let’s look at the really interesting stats.

Starting with web browsers, we can see that Microsoft’s Internet Explorer still dominates the market with an approximate 66% share. This does not take into account the stats for IE5, IE4 and earlier, which I am sure some systems in less developed countries are still running. However, the real story here is that IE’s lead is slipping due to Firefox’s continuing popularity – Firefox versions 1.0, 1.5 and 2.0 now account for approximately 25% of all web browsers. Safari still has an approximate 2–3% share, although it is unclear whether Safari or Firefox dominates among Mac users.

Operating system stats show that it’s been a slow start for Microsoft’s brand new operating system, Windows Vista: just 3.33% of the OS market in August, but recent trends show that share increasing. In May 2007 it was 2.13%, so it has gained 1.2 percentage points over the past three months. XP still dominates, but I expect Vista’s share to grow more rapidly over the next few years as more people buy new PCs or upgrade. Mac OS X users still account for just under 4% of all users, suggesting that the marketing managers at Apple need to do more than their latest “Mac vs PC” advertising campaign to encourage more people to make the big switch.
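It is worth being precise about what that Vista increase means: going from 2.13% to 3.33% is a rise of 1.2 percentage points, but relative to Vista’s May share it is growth of roughly 56%. A quick check of the arithmetic, using the figures quoted above:

```python
may_share = 2.13   # Vista's OS market share in May 2007 (%)
aug_share = 3.33   # Vista's OS market share in August 2007 (%)

# Absolute change, in percentage points
point_increase = aug_share - may_share

# Relative growth, as a percentage of the May share
relative_growth = 100 * point_increase / may_share

print(f"{point_increase:.2f} points, {relative_growth:.1f}% relative growth")
```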

The stats for country of origin offer no real surprises, with the USA leading, Germany and the UK in 2nd and 3rd place respectively, and a surprising Latvia in 4th. The Chinese account for just over 2% of all users who visited the web pages in question, although recent media reports suggest China’s real share of internet users is far higher (almost as large as the US’s). It is well known that the Chinese government blocks many internet sites, and it may also be that the sites in the sample are more westernised, hosted in the US, or written in languages that Chinese web users are generally unlikely to understand – one can only speculate.

Screen resolution statistics are largely trivial, but the most common is 1024×768, which I am sure will remain dominant for many years to come. Many IT professionals, gamers and experienced computer users generally opt for higher resolutions such as 1280×1024, or widescreen resolutions such as 1280×800 or 1440×900. If any of these statistics should be taken into consideration, it’s that 8.42% of web users are still on a low resolution of 800×600. This figure is falling slowly – in May 2007 it was 9%. While many of the systems running 800×600 may be in less developed countries, it is still important for web designers to design websites with this low resolution in mind. An example of how not to do it – http://www.hrodc.com/.

Because of the lack of legacy data, it is difficult to put some of these stats into context over the past few years, but rest assured I will be keeping track of them over time and a review of the stats will come in the shape of another blog article towards the end of the year.
