As the web progresses and continues to deliver richer, more elaborate media content, it is inevitable that the speeds at which this information is delivered must increase, or at least keep pace with what is considered acceptable. This means that the broadband speeds offered by internet service providers (ISPs) need to keep up with the progression of the Web. If we are to realise the true potential of the World Wide Web, we first need to build an infrastructure that can support it.
The main obstacle to faster broadband is that the UK currently relies on an outdated telephone system to deliver broadband internet. There is a danger that the future growth and use of next-generation web applications will be stunted by this outdated method of transmitting data. Sites like YouTube, MySpace, BBC iPlayer and 4 on Demand would never have been possible using a dial-up 56k modem, which is a clear indication of how better connection speeds improve the level of web application that can be delivered. Current broadband speeds are sufficient for today's use, but we must look to the future if we are to realise the potential of the Web.
The majority of UK ISPs are still using copper wire to deliver their services, as opposed to fast fibre connections. It is estimated that fibre alternatives can increase broadband speeds by as much as 20 times.
BT plans to install super-fast fibre connections via its Openreach project, which will hopefully replace its old copper phone network in the future. The cost will be huge, but BT aims to finance it by renting lines to rival ISPs such as TalkTalk, Tiscali, Carphone Warehouse and Sky on a wholesale basis. This should allow next-generation broadband packages and services to be delivered to consumers at a competitive price.
At present the UK is miles behind countries like Japan and South Korea, which have some of the fastest broadband speeds in the world, allowing them to watch broadcast-quality television over the internet.
It is estimated that 90% of South Korea's population uses broadband, with an average connection speed of 43Mbps. In Japan the average advertised connection speed is an incredible 90Mbps, made possible via fibre-optic networks.
Current state of play in the UK
UK broadband prices are certainly dropping, and speeds have definitely improved since the days when 512Kbps was something to be proud of! Today's norms are as much as 16x faster than they were a few years ago. The downside, though, is that many of us in the UK don't actually receive the broadband we are sold. It is very much a postcode lottery: people who live in more rural areas often receive a poorer service, due to the direct correlation between distance from the exchange and the broadband speed achieved.
If we are to keep up with Japan, Korea and EU countries like France and Germany, our main focus should be on upgrading the way we transmit data. The fact that we are trying to squeeze every ounce of speed out of a network that was designed to carry voice calls is a stark reflection of where we are and where we need to be. Our current telephone network lacks the capacity to deliver the kind of high-speed broadband we need to realise the potential of UK internet services.
We risk being left behind if we do not take the necessary steps to upgrade our data transmission infrastructure. That upgrade will allow businesses to develop new web-related technologies to serve us in the future.
The future is bright for the Internet: new web applications are developed every day that would never have been possible 5 years ago. The Internet will continue to evolve in the years to come, but its growth must not be stunted by something as simple as poor data transmission speeds.
Internet Explorer 8 (IE8) is currently in its beta testing phase and will be the next version of Microsoft's Internet Explorer web browser.
IE browsers have been notorious for being bug-ridden and failing to follow web standards. As well as introducing some new additions, IE8 also tackles past compatibility issues by attempting to make the browser standards compliant. The downside is that this may break existing web pages and applications designed to run in previous browsers (IE6 and IE7).
To combat this issue, Microsoft has designed the new browser with a facility that allows IE8 to be switched between three different modes: Quirks, Strict and Standards. These modes are activated either by the inclusion of a specific tag within a web page (e.g. <meta http-equiv="X-UA-Compatible" content="IE=7" />) or via a user setting within the browser itself (the latter requiring a restart). Standards mode will be the default, making IE8 use a more standardised DOM, like Firefox and Opera.
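As a minimal sketch, a page that needs to keep rendering as it did in IE7 would include the switch in its head, something like this (the title and surrounding markup are purely illustrative):

```html
<head>
  <!-- Ask IE8 to render this page with its IE7 behaviour -->
  <meta http-equiv="X-UA-Compatible" content="IE=7" />
  <title>Legacy page</title>
</head>
```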
The ability to switch modes is very important, as IE8 must stay compatible with older web pages, especially offline versions such as those found on instructional installation DVDs and CDs. Pages like these cannot be updated to accommodate the new changes, so this facility is essential.
The addition of the browser version switching facility has been met with some controversy, as some have argued that it hinders the progression of web standards. By giving people a choice, developers may continue to target older browser versions instead of finally adopting a universal standard. Some have also stated that this is an example of "monolithic behaviour due to Microsoft's dominating position in the web browser and operating system market" (Håkon Wium Lie, chief technology officer of Opera Software).
IE8 offers a brand new and interesting feature called Web Slices, which allows users to bookmark a specific section of a page (e.g. the London weather section of the BBC website) and then view that snippet of information in isolation as a widget or popup. Websites will be able to mark specific content as available as a Web Slice so that users can simply add it to the browser toolbar and access it on demand. Each time a Web Slice's content is updated, the user is given an unintrusive indicator to let them know.
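Under the bonnet, a Web Slice is just annotated markup in the page. The sketch below follows the hAtom-style class names the IE8 beta looks for (hslice, entry-title, entry-content); the id and weather content here are made up:

```html
<!-- A hypothetical "London weather" Web Slice -->
<div class="hslice" id="london-weather">
  <h2 class="entry-title">London Weather</h2>
  <div class="entry-content">
    Today: sunny intervals, 18°C. Tomorrow: light showers, 15°C.
  </div>
</div>
```

Once markup like this is on a page, IE8 can offer the block as a subscribable slice from the Favorites bar.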
Activities allow developers to attach specific functionality to information on a page. For example, with additional browser add-ons, users will be able to hover over an address and have IE open a popup layer that links directly to Google Maps, or hover over a keyword for an item of clothing and have IE open an eBay popup with a list of search results. Current IE8 beta add-ons include Translate, Send, Map, Find, Define and Blog.
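An Activity is described to the browser by a small XML file that tells IE what to do with the user's selection. The following is a rough sketch based on the OpenService description format the beta uses, with a made-up mapping service as the target:

```xml
<?xml version="1.0" encoding="utf-8"?>
<openServiceDescription
    xmlns="http://www.microsoft.com/schemas/openservicedescription/1.0">
  <homepageUrl>http://maps.example.com</homepageUrl>
  <display>
    <name>Map with Example Maps</name>
  </display>
  <activity category="map">
    <!-- Fires when the user selects text (e.g. an address) on any page -->
    <activityAction context="selection">
      <execute method="get" action="http://maps.example.com/search">
        <parameter name="q" value="{selection}" />
      </execute>
    </activityAction>
  </activity>
</openServiceDescription>
```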
We must remember, though, that IE8 beta 1 is aimed at developers, as it still contains many bugs. It has a long way to go before a general-user beta is available, but it is heading in the right direction. The slight downside is that, as excited as many developers are about the new additions and updates, a lot of them are simply bug fixes for issues that weren't addressed in IE6 and 7. Some of these issues go as deep as the core layout engine, Trident, which was developed 10 years ago. IE8 will use Trident version 6 which, believe it or not, is the first version to pass the Acid2 test (except for the white stripes). The decision to make Standards mode the default (i.e. standards compliance) is also welcome, even if some pages viewed in IE8 will initially break.
Microsoft has the huge task of improving its support for web standards without breaking existing websites, and we all know that standards compliance and backward compatibility have never gone hand in hand with Internet Explorer.
Microsoft Silverlight is a cross-browser implementation of the .Net Framework that delivers interactive applications via the web. It does so by unifying the capabilities of the web server, the web browser and the desktop.
Silverlight improves the potential for developers and web designers to create rich applications that aren't limited by the constraints of modern web browsers.
Silverlight runs on all major browsers, including Internet Explorer, Firefox and Safari, and can also adapt its video quality to the device it runs on, e.g. a desktop browser, a mobile device, or 720p HDTV video mode.
Silverlight applications can be created by a graphic designer or a web developer using either:
- Microsoft Expression Blend – for layout and graphic design
- Visual Studio .Net – for coding
There are currently two versions of Silverlight, 1.0 and 2.0 beta. The most noticeable difference between the two versions is Silverlight 2.0's support for the .Net Framework.
To run Silverlight applications all you need is a modern browser and the Silverlight plug-in, which can be downloaded and installed in minutes.
Silverlight's XAML syntax is very similar to HTML, in that it allows you to create rich web-based UIs in an HTML-like markup. Using Microsoft Expression Blend, designers can create engaging graphics, animation and media. Blend can generate XAML so that programmers (via Visual Studio .Net) and designers can collaborate and work on the same files.
XAML is to Silverlight what HTML is to web pages. It is text based, can be incorporated directly into a web page via the Silverlight runtime, and is used to define objects and their properties, with a focus on defining UIs. XAML is firewall friendly, unlike technologies such as Java applets, ActiveX or Flash, which all send binary content to the browser and can pose security risks. Its text-based nature also makes it easier to update: those rivals have to be recompiled and redeployed after every change, whereas each time a Silverlight application is updated a new XAML file is generated and automatically downloaded the next time a client request is made. This eliminates the need for reinstallation or redeployment and prevents the user experience from being disrupted.
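To give a flavour of the syntax, a minimal hand-written Silverlight 1.0 scene might look like the sketch below (the sizes, colours and text are arbitrary):

```xml
<!-- A minimal XAML scene: a canvas containing one piece of styled text -->
<Canvas xmlns="http://schemas.microsoft.com/client/2007"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Width="300" Height="100" Background="White">
  <TextBlock Canvas.Left="20" Canvas.Top="30"
             FontSize="24" Foreground="SteelBlue"
             Text="Hello from Silverlight" />
</Canvas>
```

Like HTML, it is just nested elements and attributes, which is what makes it so easy to generate, diff and redeploy.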
Silverlight has a long way to go before it can compete with Flash's popularity, especially as it is a Microsoft-only product. It has a huge amount of potential, as it is designed to work with the .Net Framework, which is a robust and proven foundation. Only time will tell how popular it will become and whether users and developers will jump on the Silverlight express!
What is Web 2.0? This is a question that many people (even computing professionals) struggle to answer. Some consider it a slogan. Others simply see it as flashy AJAX-enabled websites with curved corners, modal pop-ups and drop shadows.
Web 2.0 can be considered as applications and services that are built around the internet instead of expecting the internet to suit or adapt to the application.
The version number (2.0) suggests an improved World Wide Web (i.e. blogs, podcasts, RSS feeds etc.) that provides a more interactive experience than standard read-only websites. The main goal is to bridge the gap between users and providers. In many cases with Web 2.0, users become the providers, as they are given the ability to upload content as well as download it. These sites become more popular and informative the more users add content, in stark contrast to old-school Web 1.0 sites that limited users to viewing only.
Richer Web Applications
Web applications that incorporate technologies such as Flash, AJAX, Java, Silverlight and Curl have enhanced the user experience by creating improved browser-based applications. These technologies make it possible to update specific sections of user content without refreshing the whole page. They also tend to make more use of the client computer and browser, reducing the need for full-page postbacks and decreasing server workload. This increases the responsiveness of Web 2.0 applications and improves the user experience, making it possible to create a richer, more responsive UI that better mimics modern desktop applications.
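As a rough illustration of the partial-update idea, here is the kind of plain JavaScript these applications build on; the URL and element id are hypothetical:

```html
<script type="text/javascript">
// Fetch an HTML fragment and refresh one panel without reloading the page.
// "/latest-news" and "newsPanel" are illustrative names only.
function refreshNews() {
    var xhr = window.XMLHttpRequest
        ? new XMLHttpRequest()                     // standards-based browsers
        : new ActiveXObject("Microsoft.XMLHTTP");  // older IE fallback
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById("newsPanel").innerHTML = xhr.responseText;
        }
    };
    xhr.open("GET", "/latest-news", true); // true = asynchronous
    xhr.send(null);
}
</script>
```

Libraries and frameworks wrap this pattern up in friendlier APIs, but the principle is the same: only the panel changes, not the page.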
We must remember that many of the new concepts that have been made popular by Web 2.0 have not replaced old protocols. They have simply added a layer of abstraction to them.
Web 2.0 should be thought of as bridging the gap between users and web content. It is about understanding how and why people use the web, and providing the right services to better serve their needs. The needs of the user must outweigh the visions of programmers, marketing directors or information architects. Web 2.0 is about doing things on the web that cannot be achieved in any other medium, not reinventing the wheel by shoehorning old concepts into a web application and calling it Web 2.0.
Have you ever seen one of those clever online magazines where you can “turn” or “flick” the pages with your mouse?
If you have ever fancied getting your web content displayed in one of these Flash viewers (you don't even have to have a magazine; a product catalogue will do, for example), Zmags provides an affordable way of creating these "online magazines" via the upload of PDF files that represent the pages of the e-mag.
The service costs £19 for 1 magazine, £79 for 10 and £149 for 25. More info here.
For Pocket London, we built a system around the Zmags IFrames that enables the client to "brand" each e-mag separately to their specifications. The client can change the header, footer or an intro page around the IFrame through a bespoke content management system (sketched below), providing limitless possibilities for re-branding their online magazine for any number of clients.
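In outline, the template is just CMS-managed markup wrapped around the viewer; the IFrame src and element ids below are placeholders rather than Zmags' actual URLs:

```html
<!-- Hypothetical page template: CMS-controlled branding around the e-mag -->
<div id="client-header"><!-- header markup injected by the CMS --></div>
<iframe src="http://viewer.example.com/publication/MAG-ID"
        width="800" height="600" frameborder="0"></iframe>
<div id="client-footer"><!-- footer markup injected by the CMS --></div>
```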
Here are a few examples of the finished product:
Last year, I looked at the current trends in terms of browser and operating system market shares, as well as what screen resolutions people are using and the most popular internet user locations in terms of country.
I have been keeping constant track of changes in this data over the past year or so and have been able to study trends and changes in usage.
My source for the data is W3Counter's Global Web Stats, which compiles these usage statistics every month by studying the last 25,000,000 visits to approximately 12,000 websites. There are other sources for browser activity, for example W3Schools' Browser Statistics (not accurate, because the statistics are based on visitors to a single website). By sticking to one source (even if it is not 100% accurate), it is easier to discover just how quickly things change over any given time period.
Looking at browser usage statistics, Firefox's share of the browser market has significantly increased. According to the latest figures, the total market share for the Mozilla/Firefox browser family is 29.62%, compared to IE's 61.43%. Over the past year, Firefox has narrowed the gap by almost 10%. These statistics take an average across many different countries, although recent reports suggest that in some countries Firefox's share is significantly higher – for instance in Poland, where it is 45%. In the UK, however, it is thought to be below the global average at only 20%.
Internet Explorer 6 has shown a steady decrease in usage, with Internet Explorer 7 increasing, although version 6 still remains the most popular. With the advent of Internet Explorer 8 on the horizon, it will be interesting to see how many people ditch IE6 immediately for version 8 and how many make the switch from 7 to 8. IE8 has been billed as a far more standards-compliant browser, something previous versions severely lacked, which could potentially see some recovery for Internet Explorer against Firefox's increasing market share.
The new Firefox 3 has just been released to rave reviews all round and is already increasing in popularity. Although the stats for June only show a 1% market share, this is sure to jump right up over the coming months as more people upgrade to version 3.
Safari still "enjoys" a rather low share of the market (just 2%), despite now being available on PCs, and Opera is showing a small but steady increase in popularity, though still only at a 1% market share. The AOL browser seems to be dropping off the radar completely as more users are encouraged to switch to far more sophisticated browsers such as Firefox.
The following two graphs illustrate these shifts over the past year:
Recent operating system activity shows only slight changes in the market share between Windows and Mac operating systems. Apple has increased its share to roughly 5%, while Microsoft Windows versions account for 91%. The most interesting statistic is the slow uptake of Windows Vista: a year and a half after its original release, its market share is just 8%, while computers running XP still account for 78% of the market. With the possible release of a new Windows operating system less than 5 years away, it has been suggested that many users will stick with the largely stable XP until then, after the many grumblings and problems that Vista has had.
The following is a basic illustration of the change in operating system market share at the lower end of the market (not including XP) over the past year:
An analysis of usage in various countries gives little indication of any change since my last report. The developed nations of the USA and UK are seeing a decreasing share (although I suspect still an increase in overall usage in these countries) and making way for faster-developing countries such as Germany, China, the Netherlands and Turkey.
The statistics on screen resolution confirm we are in the middle of what I like to call a "Widescreen Revolution". Resolutions such as 1280×800, 1440×900 and 1680×1050 have all shown a significant increase over the past year, while the "old reliable" 1024×768, 1280×1024 and 800×600 resolutions are in steady decline. In our office we waved a rather unemotional farewell to the last of our old CRT monitors as it made a beeline for the scrap heap. I'm sure there are thousands of other offices across the world also seeing the last of those big, chunky, heavy monitors. In most cases they are being replaced by not just one widescreen flat monitor but two or even three. Yes, we are also seeing the "Multiple Monitor Revolution"!
The following graph illustrates this behaviour:
There will be another update on Global Web Stats in 2009.
Microsoft has released for download the Hyper-V component of Windows Server 2008, which allows IT users to run different virtual machines on one physical machine without the need to run a full host OS.
With server virtualization you can consolidate the workloads of underutilized machines onto a smaller number of servers, reducing energy and hardware costs and maximizing the performance of all your servers.
Other benefits of virtualization include the ability to take snapshots of a running machine, which can then easily be reverted to a previous state, improving the overall recoverability of your virtual machines; the ability to test upgrades and service packs prior to upgrading production machines, without the need to purchase new hardware; and the ability to migrate virtual machines from one physical host to another with minimal downtime.
Alan Stevens from ZDNet has written an excellent article on the topic of Hyper-V.
More and more companies are outsourcing work abroad or even have their workforce located around the globe, along with many roaming users who want access to their full system as if they were in the office. IT professionals are facing a huge challenge in maintaining clients' desktops and backing up all this dispersed data. For example, suppose you are a software company that has just been awarded a new project and you need 10 more developers; these developers need to run Visual Studio, WebSphere, full MSSQL and other office applications.
We all know how hard it is to source developers based in one location, and recruiting developers from around the world would make the task much easier, except for the fact that you would need to purchase, install and maintain all of their desktops. The cost of buying the equipment would be high, given that once the project finished it would be difficult to return the machines to base.
Other solutions would be setting up a Citrix Presentation Server infrastructure, which is far too expensive, or the VMware VDI approach, which uses too many system resources in terms of memory, storage and CPU.
I recently came across software that would certainly fit the bill for the above scenario.
Using Virtuozzo's virtual desktops, you install Windows 2003 and deploy virtual desktops that users connect to over IP using Remote Desktop; you can change the look and feel of the OS to behave just like a desktop.
The footprint of each virtual desktop is tiny in comparison to VMware, as only one host OS is installed. Furthermore, the full resources of the server are shared between all the virtual desktops: if you installed Virtuozzo on a quad-core server with 16 GB of RAM, each user could use up to 16 GB of RAM, depending on how much the other virtual desktops were using, so you could end up with a virtual desktop that is much faster than your local machine.
Management of the virtual desktops is simpler, as you only have one host to manage, and deploying different software to each virtual desktop is as simple as ticking the package you want each user to have.
This is still a new concept and not widely used, but the same was true of server virtualization just 5 years ago: not many companies used the technology, and now it is one of the most talked-about topics.
There are still several drawbacks that need to be resolved for this to hit the mainstream and become more widely used.
- Firstly, thin clients are still too expensive (most users would be using their personal machine anyway)
- The graphics for multimedia users leave a lot to be desired
So for any business that finds itself in a similar situation, this would be worth considering, as the initial investment is very low compared to the other options.
This is the latest news in Virtuozzo's efforts to make VDI a reality.
Recently there has been a lot of talk about what software designers should use, or whether they should just skip the entire process and go straight to HTML & CSS. The debate continues, but I want to echo some of the responses made and emphasise that design software in any form is just a tool, and which tool you use hugely depends on what you are trying to achieve.
We have a huge variety of clients and design needs here at Datadial. Over the last month I have used Photoshop, Visio and good ol' fashioned pen and paper, and in one case I did go straight to HTML/CSS. What method do I like to use? It all depends on the client's needs! The project where I went straight to HTML had a very minimalistic layout that is hugely dependent on typography. The ones I use pen and paper on require a level of creativity and speed that you cannot get from going straight to a computer. Photoshop is very flexible and creative; likewise Illustrator.
I don't think the tool really matters either. As long as the end result is what the client is after and works for their audience, then whether or not you use Photoshop doesn't come into the equation. The tool is not going to make a successful website – that comes from understanding who is using the site and designing for them.
My bookmarks toolbar in Firefox is full of great websites at my fingertips to keep up to date with what is happening in the web design industry. I have recently moved to London from New Zealand, and I accidentally left my great list of resources behind – whoops! I really should get into online bookmarking!
So I've had to rebuild my bookmark collection and thought it would be nice to share some of them with you here (in no particular order…).
- Web Designer Wall – perhaps my favourite website of all; the mixture of news, tutorials and resources is just spot-on! And the site looks so pretty.
Spoon Graphics and Design Reviver are in a similar vein with that magic combination of content.
- Best Web Gallery is a great spot for research and inspiration. I really like how this gallery doesn't discriminate based on the technology used on the featured site – there are Flash sites mixed in with web-standards sites – but they are all chosen for their outstanding design.
- A List Apart – the authority and cutting edge on web standards developments. Publishing techniques like Faux Absolute Positioning is just invaluable, and the illustrations are just so gorgeous…
- Which brings me to illustration websites. Unfortunately most of the great illustration sites I used to have bookmarked were ones I'd stumbled upon by accident, but Kevin Cornell's site is a superb one that inspires me before I even read a word!
- Signal vs. Noise is a must-read for anyone in the web world. Likewise other blogs by industry-leading guys such as Douglas Bowman, Jeff Croft, Mark Boulton and John Hicks. I still miss Andy Clarke's old blog, but keep an eye out for the occasional post at Stuff & Nonsense.
- Design Float is a great website that collates other blog posts about web design.
- Other design resources like ColourLovers (perfect for fleshing out colour schemes), Photoshop brushes, free fonts and more tutorials.
- Viget Inspire is a great blog by the web designers at Viget, addressing many issues regarding web design.
- And last but by no means least is the Web Standards Group email list. Over the last year or so I've found that the list itself can be rather tedious at times, but once a week Russ Weakley puts out the Links for Light Reading, a summary of what's going on around the blogosphere and in the web design/standards world.
With Sainsbury's website going down this week and Amazon's the week before, it is worth taking a moment or two to consider what would happen to your business if your website were out of action.
For most e-commerce clients this could have catastrophic consequences: not only would sales evaporate, but you would also lose access to all your sales and customer data.
However, hosting, and the quality thereof, is often totally ignored in client briefs. It is definitely not a priority and is only ever occasionally paid lip service. This may have been OK a few years ago, when websites were just an experiment and an addition to an existing business rather than core to a business.
Even so, it is difficult to convince people to invest in proper corporate hosting, as there is a perception that hosting should be practically free. It's true there are some companies offering hosting for £10 per year or less. Honestly, what do you think these companies would provide in terms of backup, reliability or a service level agreement? Not much, I think.
So what should you consider when hosting your website? There are many things to consider; here are just three.
First, where is your website being hosted, and who manages and owns the servers? Most websites in the UK are hosted in Telehouse in Docklands, where there are massive, generally well-managed data centres. But have you ever seen inside a data centre? Have you ever asked about the air conditioning systems, the backup power supply, the connectivity to the web? Then what about the servers: who owns them? Who is responsible for updating them with critical patches, and when is this done? What happens if they crash? What is the rebuild time? How many other sites are there on the same server as yours? How secure is access to the servers? If you are not asking these questions, then you are not taking your web presence seriously.
Second, there are backups. How often is your site backed up, and where are the backups kept? If your service provider's data centre is blown up (a very realistic proposition, especially if you house your website in Telehouse in Docklands), will the backups go up with it? If they are kept offsite, how often are they taken offsite?
Third, what about redundancy? If your server crashes, is there a mirror server that will automatically take over? What if your website is overloaded with visitors: can your server handle the traffic? Is there a load balancing mechanism that will automatically divert users to an alternative server?
All these issues need to be addressed when considering hosting. Website owners need to change their mindset from seeing hosting as an essentially free service to one that is valued and invested in appropriately, according to business requirements and risk assessment.