HTML 4.5 Anyone? | January 24, 2007

HTML was created for document mark-up, and it still does a pretty good job. However as websites become more application focused, it becomes harder to use HTML to model complex user interfaces. We have the basics in the shape of form widgets, but there are a number of key UI elements missing. You only have to look at the average desktop application to see the wide variety of widgets on display, from push-in buttons to vertical and horizontal sliders. You can “hack” some of these elements using CSS and JavaScript, but it always seems like a less than ideal solution.

One of the benefits of application development with Flex 2.0 is its wide range of interface elements, or components. The developers have thought about the needs of modern web applications, and created a really handy toolkit of widgets. The widgets are essentially a combination of markup (MXML), styling (CSS) and behaviour (ActionScript), so are not dissimilar to the widgets we have currently. Developers can also create their own widgets, which is both a blessing and a curse. It’s a blessing because if you really need a particular widget you can create it yourself. It’s a curse because it could also lead to a proliferation of non-standard widgets which could ultimately lead to a reduced user experience.

The feeling that the web is outpacing (X)HTML and CSS development has been growing on me for some time. While commercial organisations are quick to react to client feature requests (not always a good thing), it seems that the W3C is getting bogged down in internal politics and the desire to please a diverse group of stakeholders.

As a reaction to this, a group of web developers created their own “standards” working group called the WHATWG. The result has been an HTML-derived draft specification called Web Applications 1.0. The draft spec includes some very interesting suggestions, along with some more controversial ones.

One recommendation is predefined class names which just smells wrong to me. Class names have always been a way of authors adding their own meaning to a document, and to that end are very powerful. You see them being used very successfully with microformats, which is a good thing. However it is their looseness and flexibility that really gives them their strength, so predefining them seems wrong. We had this discussion in the office and Jeremy rightly pointed out that if you need to mark something up as a copyright message, you should create a copyright element instead.

Looking through their spec, they have recommended a number of new GUI elements and attributes that could prove very helpful.

In part as a reaction to the WHATWG, and in part due to calls from the browser vendors, the W3C have set up a new HTML working group to look at developing HTML further. I’m interested to see what comes of this, although I do worry that 2010 may be a little late.

I’m also interested to see the crossover between the new HTML working group and the web forms working group. Web forms are basically an extension of the existing form controls, adding some interesting features like input validation, required fields and auto completion. Here are some of the additions that look interesting to me, although I’m sure there are more.

This progress is good, but I think there are a lot more useful elements you could add to both lists. Here are some of the elements I’d like to see in an application focused mark-up language.

This is just a start, but I’d be interested to hear what UI elements you think are missing.

Comments (27)

The Great British Booze-up | January 23, 2007

UPDATE: The Venue has now changed to the Lava Lounge, 405 E 7th St, so please update your records!

Clearleft, Boagworld and @media 2007 are pleased to announce the first annual “Great British Booze-up” at SXSWi this year.

With so many of our fellow countrymen heading over to Austin, we thought it only fitting to throw a party. We’ve hired out a traditional <cough/> British pub and will be delighting you with an evening of drinking and merriment. This is your opportunity to meet some of your favourite British designers and marvel at their funny accents and eccentric ways.

We’ve put some money behind the bar, and thanks to the beneficial exchange rate, this should last for a while at least. We’re also organising food, so you won’t be limited to warm beer and pork scratchings. In fact, it should be the perfect destination for all you hungry and thirsty conference goers. However you’ll want to get there early as numbers are limited.

The event kicks off at 7:30pm in the Lava Lounge on Seventh Street. Last orders will be at 10:30pm, just in time for a final drink before heading round the corner to SXNW.

To set a reminder, why not head over to the upcoming page for the event. Hope to see you there.

Comments (10)

Small Office Backup with Rsync | January 23, 2007

When setting up our new office, I wanted to ensure everything was backed up correctly. I asked around for backup solutions, but the options were overwhelming. As we were a new company, I didn’t want to spend a huge amount of money on complicated software or hardware solutions, so in the end we went with something that’s already built into the operating system–Rsync.

This small but powerful command line tool forms the basis for a lot of Mac backup solutions, which are essentially GUI front ends. Rsync is much loved by techies, but I’m no Unix wizard so it took a while to get things set up. This quick tutorial outlines how I’ve got backups working at Clearleft. This is by no means a definitive guide, and I’m sure there are much better ways of doing it. So if you’ve got any better ideas, please let me know.

The first step was to find something to back up to. We thought about network attached storage (NAS), but in the end went for the simple option of a Mac Mini connected to a removable hard drive. We have two such drives and rotate them weekly to ensure we have an offsite back-up.

What we’re going to do is set the Mac Mini up so it connects to each machine on the network at a set time of day, and then run an Rsync back-up. To connect to each machine you first need to give them a distinct IP address on your network.

Go into your network preferences, select the TCP/IP menu option and in the “Configure IPv4” dropdown, select “Using DHCP with manual address”. I’m not sure what the best IP numbering convention is, but we have all our desktops starting from, so other devices like routers, printers or laptops can grab the first 10 slots automatically if they want.

OS X network preferences

Once each machine has an IP address, you need to make sure the Mac Mini can connect to it over SSH. To do this, go into the sharing preferences and check the “Remote Login” option.

OS X sharing preferences

Now, let’s create the backup command on the Mac Mini. I’ve created a new folder on the Mini called back-up where I’m keeping all my configuration files. Create a new text file in this folder and give it a sensible name like andybak.command.

First you need to set all the required flags for the rsync command. I’m not going to go into them all, but if you’re interested you can type man rsync for the full list.

rsync -a -v -r -S -x -z --delete -e

The next thing you need to do is connect to the machine and folder you wish to back up, using ssh

ssh andy@

Now specify the target location of your backup. In our case it’s a mounted volume called “LaCie Disk”

/Volumes/LaCie\ Disk

Lastly we don’t want to back up everything, so I’m going to create an exclusions text file. Add a pointer to this text file next.

--exclude-from /Users/clearleft/backup/andy_excludes.txt

Save this file and create a new file for your excludes called andy_excludes.txt. In this file list all the folders you wish to exclude. I’ve got a lot of music on my machine so I’m going to exclude the music folder. If you have lots of movies or pictures, you may want to exclude those folders as well.
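
For example, the excludes file might look something like this (the folder names are illustrative; rsync matches each pattern against the path relative to the folder being backed up):

```text
Music
Movies
Pictures
.Trash
```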


Save the text file.
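
Putting the pieces together, the finished command file might look something like this (the host address and source path below are placeholders, as every network will differ):

```shell
#!/bin/sh
# Hypothetical andybak.command: the address and paths are examples only.
rsync -a -v -r -S -x -z --delete -e ssh \
  --exclude-from /Users/clearleft/backup/andy_excludes.txt \
  andy@192.168.1.11:/Users/andy/ "/Volumes/LaCie Disk/"
```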

Now we can run the command and see if it works. If you want to be extra cautious there is a flag you can add to your command file that will run a simulation instead of the real thing. As this will be the first time you’ve run this command, the initial backup may take a while. To run the command, simply double click the file and it should launch and run in the terminal window.

The first thing this command will do is try to connect to the computer you’re backing up using SSH. Because this is the first time you’ve connected, it will ask you if you’re sure of the authenticity of the host. Type “yes” to proceed. You’ll next be asked for the password of the host machine. Type it in now and the backup will start running. Go make a cup of tea as it may take a few minutes.

Once the back-up is complete, check that a new folder has been added to the backup drive and that all the selected files have been backed up.

Now you obviously don’t want to enter the password each time you run a backup, so you need to set up a public and private key on the backup machine, and then copy the public key over to the host machine. This is where things get a little tricky as there are numerous ways of doing this, some more secure than others. Luckily I did this ages ago, so I’m not even going to attempt to explain how this is done. If you’re interested, do a search on ssh or public key authentication on OS X.

On the Mac Mini, locate your public key. In our case the file was called and it was in a folder called .ssh. Using secure copy (scp), copy this key to the authorized_keys file in the .ssh folder on the machine you’re wanting to connect to. If the file or folder doesn’t exist, you will need to create it.

scp /Users/clearleft/.ssh/ andy@

You’ll be asked for the password of the machine you’re connecting to. Once you’ve entered it, the files will copy over, and you’ll never be asked for a password again. To check the public key is working, run the backup command again and it should run without asking for a password.

We’re almost there. Just one last step in order to make the backups really useful: we need to automate their execution. To do this, you need to decide on a time for each backup to run. We run ours in the evening when everybody is out of the office, to avoid the inevitable network slowdown. First, go into the energy saver preferences for the machine you’re backing up, click the “Schedule” button and set the machine to wake up 5 minutes before you plan to run the backup.

OS X energy saver preferences

Then go back to the Mac Mini and edit your crontab file.

sudo pico /private/etc/crontab

Set the time you want the command to run in minutes and hours, and leave the day, month etc. starred out, so your backup runs every day. Under the command heading, add the path to your command along with an optional path to a log file.

/Users/clearleft/backup/andybak.command >> /Users/clearleft/backup/backup.log
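
Note that entries in /private/etc/crontab also take a user field between the schedule and the command. A complete line for a daily 7:35pm backup might look like this (the time and user name are illustrative):

```text
#minute hour  day  month  weekday  user       command
35      19    *    *      *        clearleft  /Users/clearleft/backup/andybak.command >> /Users/clearleft/backup/backup.log
```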

Do this for every machine on your network, and every night you’ll have trouble-free, automated backups.

Comments (10)

Conference Tastic | January 18, 2007

In the space of a few days, three new conference websites have launched. First up is @media 2007. If you’ve been to @media in London the last few years, you’ll know what a fabulous conference this is. Well, Patrick has been locked away in his underground dungeon for the last couple of months planning world domination. There are now three, not one, @media events in 2007. One in London, one in San Francisco and one in Hong Kong. So if you’ve always wanted to attend but couldn’t justify the transatlantic flight, now you can.

I have the honour of speaking at the Hong Kong event, while it looks like fellow Clearleftie, Jeremy Keith is going for the hat-trick. Hong Kong is a fantastic place and @media is a fantastic event, so I can’t wait.

On the subject of world domination, the Carsons keep churning out new events faster than you can say, “blimey, not another web design conference!”. The latest event is entitled The Future of Web Design and looks set to be a winner. I for one will be at the front of the queue for tickets.

Lastly, the new kid on the block is Scotland’s very own Highland Fling. Focussing on the concept of web standards and progressive enhancement, this new event aims to move the focus out of London and the South East. So if you’re a web developer working in Scotland or the North of England, I hope you’ll lend your support to this worthy event.

With so many web design events already scheduled this year, it won’t be long before we see a “Future of Web Conferences” event somewhere in the world.

On the subject of conferences, SXSW is fast approaching so I hope everybody has their flights and accommodation sorted. With so many Brits flying to Austin this year, we’ve decided to organise a bit of a treat. Clearleft, Boagworld and @media are organising a party and you’re all invited. It will be your opportunity to hang with the BritPack and enjoy a good old fashioned knees-up. More details soon.

Comments (13)

Heuristics for Modern Web Application Development | January 17, 2007

Heuristic evaluation is a technique that involves analysing the usability of a website against a set of general usability precepts. One or more “experts” will analyse the target site, often following a series of pre-defined scenarios. Whenever they encounter an issue that breaks one of the precepts or “heuristics”, they will note the issue and sometimes the severity.

Heuristic evaluation is usually done either to augment usability testing, or where usability testing is impractical or cost prohibitive. Heuristic evaluation is considered slightly more objective than a simple “expert review” as the results are based upon generally agreed guidelines rather than personal opinion.

There are a number of different usability heuristics around, but the most popular ones on the web are Jakob Nielsen’s 10 usability heuristics and Bruce “Tog” Tognazzini’s basic principles for interface design.

As part of my consultancy work at Clearleft I run a lot of expert reviews and heuristic evaluations. While planning a recent evaluation, I started to feel that the existing heuristics didn’t accurately describe the requirements of a modern web application. In particular I felt that Mr Nielsen’s heuristics were somewhat convoluted, contained a lot of overlap and varied widely in terms of scope and specificity.

Since Mr Nielsen first created his heuristics back in 1990, the web has changed a lot. Many of the underlying principles remain the same, but their relative weight has shifted. So using these heuristics as a starting point, I set out to create a set of web application heuristics that better reflected the current landscape.

Usability heuristics are by their nature subjective, so I don’t claim what follows is a definitive list. However I have tried hard to cover as many common usability issues as possible. There is still a lot of overlap, but I think this is because one problem can be the result of multiple causes.

Anyway, this is just a first draft so I’m really keen to hear your opinions.

Design for User Expectations

Design the system around the users, their goals and expectations. Choose features and functions the audience will find useful and use the appropriate level of complexity for their experience and abilities. Make processes work in the way users expect, and mirror real-world processes where applicable. Ensure the interface always upholds its promise and never tricks or misleads the user. E.g.


Design for Clarity

Make the system as clear, concise and meaningful as possible for the intended audience. Use meaningful icons, symbols and imagery. Use the natural language of the user and optimise for skim reading. E.g.

Minimize Unnecessary Complexity and Cognitive Load

Make the system as simple as possible for users to accomplish their tasks, but no simpler. Do not overload the user with too many unnecessary choices, and make sure those choices are prioritised. E.g.

Efficiency and Task Completion

Design for user productivity, not the system’s. Optimise the system for the most common tasks. Provide experienced users with advanced features that speed up task completion. Use the most common defaults and honour user preferences and previous selections. However, allow them to be easily overridden when necessary. E.g.

Provide Users with Context

Interfaces should provide users with a sense of context in time and space. The system should let users know where they are, where they have come from, what they can do and where they can go next. Processes should inform users of the progress they have made and the remaining duration. E.g.

Consistency and Standards

Labels, processes and interface elements should be used consistently throughout the system. The system should use common web conventions unless a new convention provides a significantly improved user experience. E.g.

Prevent Errors

The system should help prevent errors wherever possible. This can be done by limiting incorrect choices, accepting alternative input formats, and providing guidance and inline validation where applicable. E.g.

Help Users Notice, Understand and Recover from Errors

Errors should be obvious and easy to recover from. Error messages should be clear, concise and easy to notice. They should succinctly explain what has happened and suggest possible solutions. E.g.

Promote a Pleasurable and Positive User Experience

The user’s interaction with the system should be positive and, where possible, enhance their quality of life. The user should be treated with respect and their preferences and wishes honoured. The design should be aesthetically pleasing and promote a pleasurable and rewarding experience. E.g.

Comments (7)

User Error is Our Problem, Not Theirs! | January 17, 2007

I witnessed something happen on a web developer mailing list the other day which I’m not proud of, but which is all too common in our industry. A group of experienced users rounded on a group of less experienced users for making a simple error, and then proceeded to put them down in public for their “stupidity and laziness” in not learning the system.

Sadly this is an all too common event when technically astute developers come into contact with user error. Rather than blaming the system they created, these developers are all too keen to blame the users for the error. This reaction is somewhat understandable, as the developers know the system inside out and understand how it works. There is a good chance they even created part of the system and imbued it with their own biases. Because of this domain knowledge they just don’t understand how other users can’t get what they find so easy.

The problem is, most people don’t want to master the system, they simply want to get their tasks done in the simplest way possible. Users don’t sit and ponder all the possible options before making a choice. Decisions are made in a split second and are usually based on the first best guess. Apart from being looked down upon by developers, this approach has a low cost of failure and makes for a perfect coping strategy.

It is all too easy to blame users for their mistakes, and doing so has become a bit of an in-joke in developer circles. I’m sure we’ve all heard the story about the user calling tech support because his computer wouldn’t turn on, only to realise later that he’s in the middle of a blackout. However, while funny, these stories help to create an us-and-them attitude of superiority. The truth is, we are all that “dumb” user at some stage in our lives and shouldn’t write off other people’s experiences just because of our own personal biases.

In most cases, it’s not the fault of the user for making an error. It’s the fault of the system for allowing the error to be made. Ultimate responsibility for the error lies with the developers of the system, the very people who are so quick to scoff at their users’ stupidity.

So I implore you, “don’t be that guy”. See every user error as a gift: an opportunity to exercise your problem solving skills and make the system smarter. After all, the goal of technology should be to empower people, not to make them feel stupid and inferior.

Comments (17)

End of Year Review: 2006 | January 11, 2007

End of year reviews are all the rage in the blogosphere, and I’ve been meaning to do one since before Christmas. However if 2006 was typified by anything, it was a severe lack of free time. This has yet to abate, which is why I’m late to the party with my review of the year. 2006 has left a deep impression on me for a variety of reasons. In fact I’d say that it was probably the most tumultuous year I’ve had in a long time.

The year started off well with the publication of my first book. I’ve been completely overwhelmed by the response to CSS Mastery, and would like to thank everybody who went out and bought a copy. Considering most tech books sell less than a thousand copies, the book has surpassed my wildest expectations, selling over 15,000 copies and reaching its 7th print run. It’s ended up being my publisher’s best selling book this year, as well as appearing in Amazon’s Top 10 Editors’ Picks and Top 10 Customers’ Favourites of 2006. However the highlight had to be briefly beating Harry Potter in the Amazon sales rankings, if only for a couple of days.

I spoke at SXSW again in 2006, and had a super time. The British invasion was even bigger than the previous year, and we ended up leaving our mark on Austin both figuratively and literally. There were some amazing parties, and the free food and drink was flowing even more copiously than before. In fact I think I only spent $50 the entire time I was there! It was great to meet up with friends from the previous year, as well as meeting lots of new people. SXSW 2007 is only a couple of months away now, and I can’t wait.

After the conference I headed down to Mexico with my then girlfriend for a well deserved spot of R&R. Highlights included the ruins at Tulum, diving the caverns of Dos Ojos, and enjoying the bath-warm private pool at our hotel. Sheer heaven.

Over the last year Clearleft has been going from strength to strength. In June we celebrated our first birthday and office move with a party at our favourite cocktail bar. Six months later and we’d moved again, this time to swish new offices in Brighton’s trendy North Laine. In the past year we’ve had the honour of working with some amazing clients on some truly fantastic projects. As well as project work, we’ve also been running a lot of training courses, along with organising our second web development conference.

d.Construct 2006 was a resounding success, with tickets selling out in just 36 hours. We had some great speakers this year, and if you haven’t listened to them already, the podcasts are available online. Everybody who came seemed to have a good time, and you can see what they all said via the excellent backnetwork. We’ve already started planning for next year’s event and I’m really excited about how it’s starting to shape up.

Talking of conferences, I had the pleasure of speaking at a load of great events last year including @media2006, BarCampLondon, Webmaster Jam and Refresh06. Refresh was particularly fun as I also got to go on a behind the scenes tour of the Kennedy Space Centre thanks to one of Paul’s listeners from Boagworld. There are even more great events planned for next year, including @media2007 in Hong Kong which I’m really looking forward to.

During the summer, my girlfriend and I finally bought a flat in the Kemptown area of Brighton. We’d been going out for 7 years and slowly watched house prices rise around us. I’d always been very hesitant about buying a flat together, but it was something Mel had wanted to do for some time and I eventually capitulated. It was a lovely little flat, and we put our energy into sorting the place out. On the surface things seemed OK, but that turned out not to be the case.

Over dinner one evening, Mel told me that she didn’t think things were working any more. I was initially quite shocked and tried to salvage the relationship. However during the next couple of months it became evident that we’d been treading water the last few years and had drifted apart. We were still close friends, but the relationship was over and I decided to move out. This was a very hard time for me, so I would like to thank my friends and colleagues for their support and understanding during this time.

Thankfully things are starting to look up and I’m hopeful that 2007 will be a good year. I’ve found a nice little flat on my own and am currently readjusting to single life. I’ve recently met somebody new, and am enjoying the first phases of a new relationship. However I don’t want to jinx things by saying any more at this stage. Work is going well and there is even talk of a new book. So who knows what 2007 will hold?

Comments (9)

The Power of Info-graphics | January 4, 2007

There has been an interesting story circulating in the press today about food labelling. The government are trying to encourage food manufacturers to label food in such a way that shoppers can clearly tell which of a number of similar products are healthiest just by glancing at them.

The Food Standards Agency realised that the current labelling system—while very good by international standards—is still quite complicated. If you want to choose between two products for health reasons, you need to spend a considerable amount of time looking at the two labels, and even then it is difficult to tell which is better unless you know exactly how much salt, fat or sugar you are supposed to eat each day.

Old style, information heavy food label

Two rival labelling systems have emerged. One system is called the traffic light system, and studies have shown that it provides shoppers with a clear indication of which product is the least healthy. It works through a colour coding system: green is healthy, amber is medium and red is unhealthy. Four main metrics are communicated: the amount of fat, saturates, sugar and salt an item contains. So by quickly glancing at a product you can tell if it is “unhealthy” by the amount of red and orange displayed on the info-graphic.

Traffic light label on a bag of chips (fries)

Traffic light label on a pizza box

Traffic light label using the alternative pie chart format

The problem is, it turns out that when faced with the traffic lights, shoppers naturally (and some would say instinctively) avoid the products containing a lot of red traffic lights. This has obviously upset many manufacturers, who prefer a less emotive system called the GDA system.

This system shows how much of an adult’s guideline daily amount (GDA) of calories, sugar, fat, saturates and salt the product contains. The info-graphics the manufacturers prefer don’t include the traffic light colours, making them much less emotive. They argue that the info-graphics provide more information to the shopper and lead to an informed decision.

The new GDA label

GDA label from a box of Nestlé cereal

Supporters of the traffic light system say that the GDA system is flawed because many people don’t have the time, ability or inclination to do mathematical calculations while shopping. This is an interesting argument from a usability, user-centered design and accessibility standpoint, and is actually supported through user testing. They argue that when you are in a hurry, the traffic light system gives the shopper the information they desire at a glance, and is therefore superior.

However supporters of the GDA system counter with the argument that some products like cheese, which are naturally high in fat and would therefore always have a red label, can still be eaten as part of a healthy diet as long as more than the GDA isn’t consumed.

I find it very interesting that a story about info-graphic design has been all over the TV and newspapers today. I also think it is very interesting how the two different camps are reacting to the two different types of info-graphic. To throw salt (sugar, fat and saturates) onto the wound, one option would be to combine both techniques. It would be very simple to add colour to the GDA info-graphic, but desaturate it slightly to make it less emotive. That way you would still be able to see which elements were high, medium or low at a glance while hopefully placating the manufacturers. I was planning to knock up an example but I’ve just got a new laptop and Adobe are forcing me to phone them up again to prove that I own my copy of Photoshop. This is starting to get tedious.

Comments (19)