Digital Education is Broken | January 31, 2016

Ever since I started blogging in the early noughties, the emails have come in. At first in dribs and drabs, one every few months. By the end of the decade, though, they were arriving at a rate of one or two a week. Emails from disgruntled students who had spent up to £9k a year on tuition fees, and even more on living expenses, only to find themselves languishing on a course that was woefully out of date.

Their emails were filled with tales of lecturers from engineering, graphic design or HCI departments, co-opted to teach courses they didn’t understand because, well, it’s all just computers really? Tales of 19-year-olds effectively becoming teaching assistants on the courses they were paying for, because they knew more than their lecturers. Students walking out halfway through their courses, because they were learning more from their evening gigs than they ever could at school.

It was in this context that Clearleft started our general internship programme way back in 2008: to provide the growing ranks of self-taught designers and developers with the knowledge and experience they needed to succeed in the workplace.

Now don’t get me wrong. I’m not one of those libertarian Silicon Valley types who believe the role of education is to churn out dutiful employees. Far from it. Instead I want my tax-funded education system to produce well-rounded members of society; individuals who are interested in following their passions and who have been taught the tools to learn and think. Sadly, digitally focussed courses, in the UK at least, are failing to meet even these most basic standards.

As I walk the halls of the end-of-year degree shows, I’m amazed and saddened in equal measure. The work coming out of digitally focussed courses with “User Experience”, “Interaction Design” and “HCI” in their titles is shockingly poor. The best courses represent the fetishes of their course directors; more art than design in most instances. The worst courses have the whiff of Kai’s Power Tools about them.

You’d be forgiven for thinking the institutions themselves were broken, were it not for the amazing digital work coming from other courses like Product Design, Motion Design, Graphic Design and even Architecture; work that shows a deep understanding of creative problem solving and an appreciation of the medium. So why are digital courses so bad?

I sit down for lunch with a lecturer friend of mine. He bemoans the state of digital design education as he attempts to scratch a living on the higher education equivalent of a zero-hours contract, working far more hours than he is paid for. Fighting for quality inside an organisation that doesn’t really care; that has too many other stakeholders born in a different era to worry about this “digital thing”.

The students are keen to learn, but how much can you really teach in six hours of lectures a week, delivered by somebody who has never designed a commercial website in their life, or at least not in the last six years? Is it any wonder that the graduates of a 10-week General Assembly course leave with more face time (and a better portfolio) than those on an 18-month Masters?

And so we continue to do what we can. Answering emails from disgruntled students, speaking on courses, offering student tickets, hosting CodeBar events, and running our internships.

And my lecturer friends do what they can. Running the best course possible within a broken system; hoping (and fearing) that digital transformation will eventually disrupt their sector, as it has other sectors before it.

However, there’s only so much any one individual can do on their own, which is why I’m pleased there are events like The Interaction Design Education Summit. I hope that through events like this (and others) we can put pressure on the institutions, improve the quality of courses, and help bring digital education out of the dark ages, in order to give students the learning experience they truly deserve.

Comments (2)

Why do agency account managers exist? | January 26, 2016

This morning Alison Austin asked the question…

It’s a valid question and one I’ve often pondered myself. As a company we’ve always been resistant to hiring dedicated account managers, having seen the worst excesses of our industry. I remember chatting to an account manager from a large digital agency during a BBC supplier evening a few years back. She bragged about how she only got the job because she went to the same private school as one of their major clients and had a lifetime membership to The Ivy. It seemed her job largely involved getting clients drunk.

I suppose this is what account management was like back in the days of Mad Men. You would win a big “account” made up of multiple smaller projects, then do everything you could to keep the client sweet. This is somewhat understandable when I remember another conversation I had a few years back, with the marketing manager of a large fizzy drink brand. He explained that their agency selection process involved picking 3 agencies from the NMA top 100 list each year, hiring one, and firing one of their incumbents. In this environment of fear, is it any wonder that agencies would do everything in their power to curry favour?

Fortunately I’ve only experienced this attitude once in my professional life. It was in the early days of our company and we’d just had a really positive meeting with a prospective client, so we invited them to lunch. From the booze fest that followed, it was clear these folks were used to being entertained, as they explained how they judged their agencies on the quality of the restaurants they got taken to.

In some ways I could understand the attitude. I got the sense that they weren’t especially well paid (or indeed respected) by their company, so agency entertaining was one of the few perks of the job. However I look back on the episode thinking that if we had to win work based on our ability to entertain clients rather than our ability to deliver, we would already have failed as an agency.

While this attitude may still exist in some corners of our industry, it’s not one I recognise anymore. I like to believe that the majority of projects in the digital sector are awarded based on skill, experience, quality and price. So if the Mad Men age is over, what do modern account managers do?

For very large accounts spanning multiple projects, the account manager acts as a constant presence on the project, ensuring the needs of the client are met. They’ll have a good understanding of the big-picture challenges the client is facing, and be able to share those insights with the individual teams. They will also be there to help solve problems and smooth over any bumps in the road; essentially acting as the client champion within the organisation.

From the agency’s perspective, they are also there as a consultant, helping to develop the client as a longer-term prospect. This means working with the client to find new opportunities to solve their problems, possibly in areas the client didn’t know the agency had experience in.

In smaller agencies, this role is often performed by the founder, project managers and project leads. In larger companies it’s centralised amongst a small number of account executives. It’s an important role, but not without its challenges.

Speaking with friends at agencies with a strong account management ethic, common gripes often come up. The main one is less experienced account managers promising clients new features with little understanding of what’s entailed. This is especially problematic on fixed-price, fixed-scope projects where margins are tight.

I tend to hear more concerns about account management from clients, who often feel that account managers are either too overtly sales-driven (constantly trying to get them to spend more money) or acting as blockers between them and the people working on their projects.

Too often, these problems are caused by a misalignment between the client’s needs and the way account managers are being judged and remunerated. Either that, or it’s a reflection of poor agency practices and an attempt to keep clients at arm’s length, possibly to hide an ever-changing team of junior practitioners and freelancers.

As such, while I understand the benefits of larger agencies hiring a small number of very experienced account managers with a solid understanding of the industry, a large number of junior account managers always feels like a bit of a warning sign to me. However, as somebody who has never really experienced account management first-hand (good or bad), I’d love to know what you think.

Comments (0)

Can the balance between divergent/convergent thinking explain mid-career peaks? | January 25, 2016

Divergent/convergent thinking is a fundamental part of the design process, and something most experienced practitioners are familiar with. Essentially the design process is broken down into two phases: a phase where you open up the problem space and explore as many different directions as possible, and a phase where you start analysing all the possible solutions you’ve come up with, in order to settle on the perfect answer.

It’s easiest to see this approach play out in the world of branding: the designer filling their notebook with pages and pages of graphic experiments, before selecting a handful that meet the brief in different and interesting ways. Rest assured that all good designers work this way, from physical product designers cycling through dozens of concept drawings, through to interface designers exploring countless different UI variations.

If you’ve been involved in a well-executed brainstorming session, you’ll understand the benefits of this approach: it allows you to explore a large number of ideas without the dampening effect of analysis.

You may have also experienced a badly run “brainstorming” session where ideas are debated and discarded as soon as they are created. This approach not only slows the process down, severely reducing the volume of ideas that are generated, it also discounts potentially novel ideas before they’ve had a chance to breathe.

This process always reminds me of classic crime dramas where the detectives post all of the clues up on a wall in search of patterns. The mediocre detective will jump to the most obvious conclusion first, spending the rest of their time trying to prove their hunch right (and often arresting the wrong person in the process). Meanwhile our hero spends their time assembling clues, exploring the problem space, and analysing all the possible angles, before coming to the less obvious, but ultimately correct, conclusion.

So as a designer, how do you decide how much time to spend exploring the problem space and generating ideas, versus homing in on the end solution? And what are the risks involved in spending too much or too little time on either activity?

In my experience, novice designers tend to jump to the convergent phase far too quickly. This is partly because they’ve been mis-sold the idea that design is driven by that elusive spark of creativity, rather than a deeper process of problem solving. Creative ideas are viewed as rare and precious things in need of immediate nurture.

Early in your career, all your ideas seem fresh and novel, so you’re eager to get stuck into the execution, especially as your craft skills are more developed than your ideation skills. Essentially you end up running from an area you don’t feel comfortable with, to one you better understand. I’ve seen plenty of novice designers abandon potentially interesting ideas in favour of more fully fleshed-out but obvious ones. These ideas may not seem obvious to the designer in question, but more experienced designers will have seen the same tropes time and again.

Good design educators work hard to prevent their students from jumping to the most obvious conclusion, running exercises like “100 designs in a day”. As the name suggests, the students are encouraged to come up with 100 versions of a common design problem, like designing a new chair. The first fifty or sixty designs are usually easy to come by and are typically discarded for being too obvious; variations of designs they’ve seen many times before. It’s the next twenty or thirty designs that get really interesting, where the designer has to really think about the problem and come up with something truly novel.

The “100 designs in a day” exercise is a type of “design game” that acts as a “forcing function”; essentially a way of forcing you to think divergently. The best designers will tend to have an arsenal of similar activities in their toolbox to draw upon when needed.

I’m always nervous when I come across designers who appear to be driven by “creativity” rather than process. Eventually this unbounded creativity will dry up, and they’ll be reduced to aping the styles of other designers, unable to explain their designs other than “it felt right”. Instead, like my old maths teacher, I like to see the workings out; to understand how the designer got to the current solution, and make sure they could replicate the process again and again.

If novice designers spend too little time exploring the possibility space, experienced designers often spend too long, trying to explore every nook and cranny and gathering every piece of evidence possible before starting down the route to a solution. This is evidenced by the classic Einstein quote many senior designers love to reiterate: “If I had only one hour to save the world, I would spend fifty-five minutes defining the problem, and only five minutes finding the solution.”

While it’s true that any nontrivial problem requires a good amount of divergent thinking, spending too much time exploring the problem can form a mental trap akin to analysis paralysis, making it difficult to come up with a solution that solves all the problems you’ve uncovered. This is one of the reasons why large organisations often benefit from enlisting the help of external consultants who can bring a fresh perspective unencumbered by years of exploration and analysis. But these external agents may only have a 6-month grace period before they get indoctrinated into the organisation and start getting similarly overwhelmed.

Architect Eliel Saarinen put it best when he famously said “Always design a thing by considering it in its next larger context - a chair in a room, a room in a house, a house in an environment, an environment in a city plan.” Novice designers regularly jump straight to the chair, ignoring the room it’s in, while very senior designers get so obsessed with the room, the house and the city plan that they ignore the impending seating needs. The logic often seems to be “how can I possibly design a chair, when the city infrastructure to deliver the chair is broken!”

From my experience working with students, interns and junior designers, novices often spend less than twenty percent of their time on divergent activities, and end up obsessing over the convergent process. This works for relatively simple projects, but fails for anything remotely complicated. By contrast, many senior designers will spend up to eighty percent of their effort on divergent thinking, leaving their production team to do most of the converging. Although the ultimate figure depends on the problem you’re solving, in general I think the balance needs to be closer to 60/40 in favour of divergent thinking.

If the idea that designers start their careers focussed on convergent thinking and become more divergent over time holds true, it may help explain why many designers seem to reach a creative peak around 8 years into their careers. At this point they have got out of the habit of rushing to the most obvious solution, and are spending a good deal of time understanding the problem and exploring a variety of leads. Yet they still retain enough focus on delivery to reserve time for convergence, thereby avoiding the divergence trap.

Comments (0)

Design like a Michelin Star Chef | January 19, 2016

The England of my youth was a desert for good food. The difference between a “good” restaurant and an average one lay mostly in the surroundings; that and the use of slightly more expensive ingredients. But white cotton tablecloths and snooty service weren’t enough to hide the mediocre food that lay therein. That’s why I used to relish my regular trips overseas, to eat at restaurants where the owners actually cared about what they were producing.

Jump forward 20 years and the landscape has changed dramatically. England is awash with top-end restaurants and Michelin Stars abound. Quality cooking now permeates popular culture, thanks to shows like MasterChef. This attitude has trickled down to neighbourhood bistros, mixing locally-sourced produce with the skill of the chef. As a result we’ve developed the vernacular and know when something doesn’t make the grade; we’ve basically become a nation of food critics.

We still have average restaurants, but they are few and far between. Instead, a rising tide has raised all boats. Even pubs, and more recently the humble pizza restaurant and burger joint, have gone gastro. The UK really is in the midst of a food revolution. So much so that I now look forward to returning from overseas trips, because of the food.

In this environment, it’s no wonder that a recent show on Netflix charting some of the best restaurants in the world was an immediate hit amongst my colleagues. The level of passion and craftsmanship the chefs demonstrated was amazing. These chefs sweated over every detail, from the provenance of the produce, to the service experience. Experimentation was key, and you could tell that every dish they produced looked and tasted fantastic, elevating cooking to an art form.

This focus on quality struck a chord with me as a designer. It’s an attitude that’s been baked into Clearleft from the outset: hiring people who really care about the details and want to go the extra mile, not just for our clients or their users, but for the field itself. Like great chefs, designers find it difficult to explain the extra effort that goes into an amazing composition. It’s actually fairly easy to knock up something palatable if you have the tools to hand. However it takes a huge amount of effort to craft something noteworthy.

Where quality is concerned, whether it’s with food or design, it usually takes 20% of the effort to deliver 80% of the quality, and a further 80% of effort to deliver the last 20% of quality. I call that the effort-to-quality curve, and most people stop where the differential is highest; the point where each additional unit of effort still buys the most quality. But it’s the last 20% that elevates a dish from average to amazing.
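
To make the shape of that curve a little more concrete, here’s a toy sketch in TypeScript. The exponent and the numbers it prints are purely illustrative assumptions on my part, not measurements; the point is simply that the marginal gain per chunk of effort collapses after the first stretch, which is exactly where most people stop.

```typescript
// A toy model of the effort-to-quality curve: quality climbs quickly at
// first, then flattens out. The exponent is an illustrative assumption,
// not a measured value.
const quality = (effort: number): number => Math.pow(effort, 0.2);

for (let i = 1; i <= 5; i++) {
  const effort = i / 5;
  const gain = quality(effort) - quality((i - 1) / 5);
  console.log(
    `effort ${effort.toFixed(1)} -> quality ${quality(effort).toFixed(2)} (marginal gain ${gain.toFixed(2)})`
  );
}
// The first 20% of effort delivers roughly three quarters of the quality;
// every chunk of effort after that buys less and less.
```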

Sadly the current design climate reminds me of 90s cooking. The big studios, like the big chain restaurants, are more interested in delivering a consistent experience than a quality one. So they put processes in place that ensure a minimum level of quality, but do nothing to foster true creativity. Many agencies and individuals come off looking like fast food joints, using frameworks and templates to speed up production and deliver a slew of me-too products lacking in love or a sense of craft.

By comparison, when I look around our studio, and others like ours, I see the similarities with a kitchen full of expert chefs; each one with their own areas of expertise, but brought together through a passion for good design and quality code.

However, in a world dominated by fast food and even faster design, it’s often difficult to explain the difference to customers; why a meal by a Michelin Star chef is worth more than one from a chain restaurant. It’s difficult because, unlike in the restaurant world, most customers haven’t seen the effort required to deliver quality; haven’t sampled enough dishes to tell bad from good.

The only way to combat this is for designers to make their effort visible as well as their output; to educate customers on the importance of ingredients and technique; and to design like a Michelin Chef.

Comments (7)

In defence of the hamburger menu | January 13, 2016

It’s interesting seeing how quickly hamburger menus have turned from handy UI element to social pariah. Rarely a day goes by without some young designer pronouncing hamburger menus the biggest UI crime since Clippy. They cite a raft of arguments why hamburger menus are bad, from the theoretical (it’s mystery meat navigation that users don’t recognise) to the anecdotal (three of my five usability subjects didn’t know what it was when I asked), to the statistical (60 percent of the users on my site don’t interact with the hamburger menu).

All these arguments hold water, and in normal circumstances I’d agree. After all, it’s not the most immediately obvious icon, and the last thing any designer wants to do is cause undue stress or confusion. However, I think there’s an innate Britishness about me that feels the need to stick up for the underdog and protect something that feels like it’s been getting an unnecessary kicking.

Ignoring its longer history for a second, the hamburger menu is part of an emergent design language that resulted from the rise of responsive design. It solves a difficult problem (how to represent a potentially large number of menu items on a small screen) in a relatively neat and tidy way.
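
For what it’s worth, the mechanics behind the icon are tiny. The sketch below is a hypothetical version in TypeScript, with made-up ids and a made-up class name, just to show the shape of the pattern: on small screens the full menu is hidden, and the three-bar button simply toggles it open and closed.

```typescript
// A minimal hamburger toggle. The ids and the "is-open" class are
// illustrative; the CSS decides what open and closed actually look like.
const toggle = document.querySelector<HTMLButtonElement>('#menu-toggle');
const nav = document.querySelector<HTMLElement>('#site-nav');

if (toggle && nav) {
  toggle.addEventListener('click', () => {
    const isOpen = nav.classList.toggle('is-open');
    // Keep assistive technology informed of the menu state.
    toggle.setAttribute('aria-expanded', String(isOpen));
  });
}
```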

Agreed, the icon doesn’t clearly explain what it does, but then neither does the pause button on a typical media player. One of the main reasons we’re able to use that symbol unlabelled is that it worked its way into our cultural repertoire thanks to continued repetition on tape decks and VCRs.

Had Twitter existed in the 80s, I’m sure a group of well-meaning designers would have tried to shoot down the humble pause button, and its cousins “stop” and “record”, with similar arguments. However I think they would have done so from an oversimplified understanding of what usability is.

If you go back to the early definitions of usability, they state that a usable interface is one that is learnable, efficient and memorable, produces few errors, and is satisfying.

I’d argue that the pause button on a VCR is learnable (once you’ve pressed it once you know what it does), memorable (the icon is simple and easy to recall) and produces low error rates (if you accidentally press it you can easily recover with little negative effect). It’s also relatively efficient (it’s one press, after all) and the action on an old-style mechanical VCR was a tiny bit satisfying. So as a result of these qualities, the pause button became part of the global iconographic lexicon.

I believe the hamburger menu shares many of these characteristics, and has the same opportunity to become a globally recognised icon through consistent exposure. However this will only be possible if we stop showing off to our friends by “hamburger shaming”, and embrace the plucky icon for what it is, warts and all.

Comments (13)