2016-12-30T16:53:41+00:00

Over the last 12 years of attending, speaking at and organising conferences, I’ve seen a lot of talks. Probably upwards of a thousand. I’ve seen talks that have inspired me, talks that have challenged me, and talks that left me welling up. During that time I’ve seen themes start to emerge; topics our industry finds fascinating and loves to revisit time and time again. Many of these topics I’ve used myself, and were I ever to write a “101 things I learnt at architecture school” style book for the interaction design industry, these tropes would feature heavily.

After spending two days binge-watching talks in an attempt to find the last couple of speakers for a conference I’m organising, I was amazed at how regularly these tropes appeared. I was also surprised at how certain traits and behaviours kept repeating themselves across speakers. So I thought I’d jot them down, on the off chance people found them useful; either as things you hadn’t heard of before and wanted to explore further, or topics and behaviours you wanted to avoid in a search for originality.

Top Talk Tropes

One of the earliest tropes I can remember is “paving the cow paths”; the idea of designing for observed behaviour rather than imposing strict architectures of control. This concept beautifully illustrates the fields of user-centred design and lean startup. It’s also one of the pervading philosophies behind the web; that there is intelligence in the system, and it will find its way around any blockage. In the retail world, “cow paths” are also known as “desire lines”, and are used to maximise product exposure. In past talks I’ve used this example to explain why milk is always placed at the back of the store, and how casinos in Vegas are designed.

If desire lines can be seen as a highly optimised user journey, the “peak-end rule” is a similar short-cut our brains make for judging how we experience said journey.
Research from the field of hedonic psychology has shown that we tend to judge an experience based on two things; the intensity of the peak condition—positive or negative—and the end state. This is one reason why the most memorable customer experiences are often the result of something bad happening. We remember the intensity of the bad experience, plus the happiness caused by a positive outcome, and the differential between the two frames our perspective. That’s not to say that we should deliberately try to manufacture negative experiences. However it does suggest that people will judge an experience more favourably if it has peaks and troughs of emotion but ultimately ends well, than one that was consistently good, but not noteworthy.

A related cognitive bias is “inattentional blindness”, as illustrated perfectly by the classic basketball video. Viewers are asked to count the number of times the basketball changes hands. So fixated are they on this task, a good proportion of viewers fail to spot the 800-pound gorilla in the room, both literally and figuratively. This goes to show that even when we think something obvious is happening with our designs, many of the people using our products literally don’t notice the things we’ve carefully designed.

(Video: https://www.youtube.com/embed/0grANlx7y2E)

One tool to help craft these peak experiences is “The Kano Model”. This model classifies features into three different types: basic needs, performance pay-offs, and delights. I usually describe the Kano model in talks by using the analogy of a hotel. A hotel room just wouldn’t function without a bed, a door, access to a bathroom, electricity and a few other must-have items. These are your MVP feature set. However in order to compete in a crowded market, you can add performance pay-off features like a bigger TV or faster broadband.
Over time, these nice-to-have features eventually become basic needs, which is why most MVPs aren’t very minimal. It’s the third type of feature in the Kano model that interests interaction[...]
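As an aside, the peak-end rule described above can be sketched as a toy scoring model. This is an illustrative approximation only (the function, the intensity scale and the averaging rule are all my own simplifications, not a validated psychological formula):

```python
def peak_end_score(moments):
    """Rough retrospective judgment: average the most intense moment
    and the final one. moments are signed intensities, e.g. -10..10."""
    peak = max(moments, key=abs)   # the most intense moment, good or bad
    end = moments[-1]              # how the journey finished
    return (peak + end) / 2

# A journey with a bad dip that ends brilliantly is remembered more
# fondly than a consistently fine but unremarkable one.
bumpy = [2, -8, 3, 9]   # problem occurred, then a great resolution
flat = [4, 4, 4, 4]     # consistently good, nothing noteworthy
print(peak_end_score(bumpy))  # 9.0
print(peak_end_score(flat))   # 4.0
```

Note how the model ignores the average entirely: the bumpy journey scores higher despite containing the worst single moment.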
For the longest time I’ve maintained that Service Design was a specific discipline, distinct from UX Design. It’s true that they have a lot in common, like the way both fields approach problems through a user-centred lens. They also use many of the same tools, such as design games and personas. Even some of their distinctive tools, like the service delivery blueprint, have similarities with our own user journey maps. But if you’d spent any time with a credible Service Design agency five or ten years ago, you’d easily spot the differences.
User Experience agencies typically came from a digital background, and were filled with information architects, interaction designers and usability specialists. We primarily focussed on creating products and services with a digital interface, along with the service ecosystem that supported them.
By comparison, European Service Design agencies did a lot of work on the delivery of public services—presumably because the public sector is so strong in the UK—while their US counterparts looked more towards the in-store experience. It wasn’t unusual to find a Service Design consultancy staffed with industrial designers, set-designers and commercial architects.
The two disciplines clearly shared the same ancestry, but somewhere along the evolutionary tree, they took a slightly different branch. Look at a typical UX agency and their portfolio will be full of publishing websites, mobile apps, and startups, while Service Design consultancies are more likely to show phone systems, airline check-in procedures, and better ways to deliver healthcare.
On the surface these outputs may look very different. However, as digital technology increasingly provides the platform on which these services are built, the differences are slowly being stripped away.
Just think about it. What is Uber if not a cleverly crafted service that matches car owners looking for a bit of extra cash to people looking for a ride? The interface may be digital, but make no mistake that this is a carefully choreographed piece of Service Design with lots of stuff happening behind the scenes.
Now look at the airline check-in experience. On the surface it may look like a traditional Service Design project. Dig a bit deeper however, and you’ll see that almost all of the people turning up to the desk bought their tickets online, checked-in via the airline website or mobile app, and either printed out their own boarding passes, or have them stored on their mobile phones. Surely this is a classic User Experience project?
As digital has become one of the primary ways of delivering a service experience, Service Design agencies have needed to become more digital, while digital agencies have needed to become more service oriented, to the point that it’s getting harder to differentiate the two. So much so that, at its highest level, User Experience Design has become indistinguishable from Service Design.
I’ve resisted this sentiment for a while, not least because I think the distinction still provides value to clients. However this value is rapidly diminishing as the industry continues to misunderstand and misrepresent what UX Designers do, incorrectly applying the UX label to interaction designers and digital generalists.
The Government Digital Service took the decision to adopt the term Service Designer, because they understand the language of government is one of delivering services rather than products or experiences. I suspect many traditional companies feel the same way.
I expect to see more and more high-end consultancies adopt the language of digital service design, as opposed to product design or experience design, to better explain what they do and separate themselves from the herd. What existing Service Design agencies will think of this trend is anybody’s guess. Will they embrace this new influx of digital service designers, or push back? Whatever happens, I have a feeling that this change is inevitable.
2016-08-24T12:23:57+00:00

We’ve all been there. You spent months gathering business requirements, working out complex user journeys, crafting precision interface elements and testing them on a representative sample of users, only to see a final product that bears little resemblance to the desired experience.

Maybe you should have been more forceful and insisted on an agile approach, despite your belief that the organisation wasn’t ready? Perhaps you should have done a better job with your pattern portfolios, ensuring that the developers used your modular code library rather than creating five different variations of a carousel. Or maybe you even should’ve sat next to the development team every day, making sure what you designed actually came to pass.

Instead you’re left with a jumble of UI elements, with all the subtlety stripped out. Couldn’t they see that you worked for days getting the transitions just right, only for them to drop in a default animation library? And where on earth did that extra check-out step come from? I bet marketing threw that in at the last minute. You knew integration was going to be hard and compromises would need to be made, but we’re supposed to be making the users’ lives easier here, not the tech team’s.

Of course, there are loads of good reasons why the site is this way. Different teams with varying levels of skill working on different parts of the project, a bunch of last-minute changes shortening the development cycle, and a whole host of technical challenges. Still, why couldn’t the development team come and ask for your advice on their UI changes? You don’t mess with their code, so why do they have to change your designs around? Especially when the business impact could be huge! You’re only round the corner and would have been happy to help if they had just asked.
While the above story may be fictional, it’s a sentiment I hear from all corners of the design world, whether in-house or agency side. A carefully crafted experience ruined by a heavy-handed development team.

This experience reminds me of a news story I saw on a US local news channel several years ago. A county fair was running an endurance competition where the last person remaining with their hand on a pickup truck won the prize. I often think that design is like a massive game of “touch the truck”, with the development team always walking away with the keys at the end of the contest. Like the last word in an argument, the final person to come in contact with the site holds all the power and can dictate how it works or what it looks like. Especially if they claim that the particular target experience isn’t “technically possible”, which is often shorthand for “really difficult”, “I can’t be bothered doing it that way” or “I think there’s a better way of doing it so am going to pull the dev card”.

Now I know I’m being unfairly harsh about developers here and I don’t mean to be. There are some amazingly talented technologists out there who really care about usability and want to do the best for the user. However, it often feels as though there’s an asymmetric level of respect between disciplines, due to a belief that design is easy and therefore something everybody can have an opinion on, while development is hard and only for the specially initiated. So while designers are encouraged (sometimes expected) to involve everybody in the design process, they often aren’t afforded the same luxury.

To be honest, I don’t blame them. After all, I know just enough development to be dangerous, so you’d be an idiot if you wanted my opinion on database structure and code performance (other than I largely think performance is a good thing).
Then again I do know enough to tell when the developers are fudging things and it’s always fun to come back to them with a working prototype of something they said [...]
2016-08-23T16:54:10+00:00

Agile has been the dominant development methodology in our industry for some time now. While some teams are just getting to grips with Agile, others have extended it to the point that it’s no longer recognisable as Agile. In fact, many of the most progressive design and development teams are Agile only in name. What they are actually practising is something new, different, and innately more interesting. Something I’ve been calling Post-Agile thinking. But what exactly is Post-Agile, and how did it come about?

The age of Waterfall

Agile emerged from the world of corporate IT. In this world it was common for teams of business analysts to spend months gathering requirements. These requirements would be thrown into the PRINCE2 project management process, from which a detailed specification—and Gantt chart—would eventually emerge. The development team would come up with a budget to deliver the required spec, and once they had been negotiated down by the client, work would start. Systems analysts and technical architects would spend months modelling the data structure of the system. The more enlightened companies would hire Information Architects—and later UX Designers—to understand user needs and create hundreds of wireframes describing the user interface.

Humans are inherently bad at estimating future states and have the tendency to assume the best outcome—this is known as optimism bias. As projects grow in size, they also grow in surface area and visibility, gathering more and more input from the organisation. As time marches on, the market changes, team members come and go, and new requirements get uncovered. Scope creep inevitably sets in.

To manage scope creep, digital teams required every change in scope to come in the form of a formal change request. Each change would be separately estimated, and budgets would dramatically increase. This is the reason you still hear of government IT projects going over budget by hundreds of millions of dollars.
The Waterfall process, as it became known, makes this almost inevitable. Ultimately the traditional IT approach put too much responsibility in the hands of planners and middle managers, who were often removed from the day-to-day needs of the project.

The age of Agile

In response to the failures of traditional IT projects, a radical new development philosophy called Agile began to emerge. This new approach favoured just-in-time planning, conversations over documentation, and running code; effectively trying to counter all the things that went wrong with the typical IT project. The core tenets of this new philosophy were captured in the Agile Manifesto, a document which has largely stood the test of time.

As happens with most philosophies, people started to develop processes, practices and rituals to help explain how the tenets should be implemented in different situations. Different groups interpreted the manifesto differently, and specific schools started to emerge. The most common Agile methodology we see on the web today is Scrum, although Kanban is another popular approach.

Rather than spending effort on huge scope documents which invariably change, Agile proponents will typically create a prioritised backlog of tasks. The project is then broken down into smaller chunks of activity which pull tasks from the backlog. These smaller chunks are easier to estimate and allow for much more flexibility. This opens up the possibility of regular re-prioritisation in the face of a changing market.

Agile—possibly unknowingly—adopted the military concepts of situational awareness and command intent to move day-to-day decision making from the planners to the front-line teams. This effectively put control back in the hands of the developers. This approach has demonstrated many benefits over the traditional IT project. But over time, Agile has become decidedly less agile as dogmas crept in.
Today many Agile projects feel as formal and conservative as the approaches they overthrew. [...]
2016-08-15T15:58:08+00:00

Back in the olden days (c. 2000) people used to own software. When a new version of Photoshop or Fireworks came out, you’d assess the new features to decide whether they were worth the price of the upgrade. If you didn’t like what you saw, you could skip a generation or two, waiting until the company had a more compelling offering. This gave consumers a certain amount of purchasing power, forcing software providers to constantly tweak their products to win customer favour. Of course, not every tweak worked, but the failures were often as instructive as the successes.

This started to change around 2004, when companies like 37signals released Basecamp, their Software as a Service project management tool. The price points were low—maybe only a few dollars a week—reducing the barrier to entry and spreading the cost over a longer period. Other products quickly followed; accounting tools, invoicing tools, time-tracking tools, prototyping tools, testing tools, analytics tools, design tools. Jump forward to today, and the average freelancer or small design agency could have subscriptions to over a dozen such tools.

Subscription works well for products you use on a daily basis. For designers this could be Photoshop or InVision; for accountants this could be Xero or Float; and for consumers this could be Spotify or Netflix. Subscription also encourages use—it encourages us to create habits in order to get our money’s worth. Like the free buffet at an all-inclusive hotel, we keep going back for more, even when we’re no longer hungry. In doing so, subscription also locks us in, making it psychologically harder for us to try alternatives. Making it less likely for us to try that amazing local restaurant because we’ve already paid for our meals and need to beat the system. The sunk cost fallacy in all its glory.
Problems with the rental model become more apparent when you’re forced to rent things you use infrequently, like survey products or recruitment tools. You pay to maintain the opportunity of use, rather than for use itself.

We recently did an audit of all the small monthly payments going out of the company, and it’s amazing how quickly they mount up. Twenty dollars here and forty dollars there can become thousands each year if you’re not careful. Even more amazing is the number of products we barely used. Products that somebody signed up for a few years back and forgot to cancel. You could blame us for our lack of diligence. However the gym membership model of rental is explicitly designed to elicit this behaviour. To encourage people to rent the opportunity, safe in the knowledge that the majority of members won’t overburden the system. Unclear billing practices and disincentives for unsubscribing—”if you leave you’ll lose all your data”—are designed for this very purpose.

Then you have the legacy tools. Products that you rarely use, but still need access to. Photoshop is a great example of this. Even if you’ve decided to move to Sketch, you know many of your clients still use Photoshop. In the olden days you would have kept an older version on your machine, costing you nothing. These days you need to maintain your Creative Cloud account across multiple team members, costing you thousands of dollars for something you rarely use.

This article was sparked by a recent Twitter storm I witnessed where Sketch users raised the idea of a rental model and vilified people who felt paying $30 a month for professional software (which currently retails at $99) was too much. While I understand the sentiment—after all, Sketch is the tool many designers use to make their living—you can’t take this monthly cost in isolation. Instead you need to calculate lifetime cost. As we all know from the real world, renting is always more expensive than ownership in the long term.
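To make the lifetime-cost point concrete, here’s a quick sketch. The $30-a-month and $99 figures come from the Sketch discussion above; the $69 upgrade price, the two upgrades, and the five-year window are hypothetical assumptions of mine, purely for illustration:

```python
def subscription_cost(monthly_fee, years):
    """Total paid over the period under a rental model."""
    return monthly_fee * 12 * years

def ownership_cost(licence_price, upgrade_price, upgrades_taken):
    """One-off licence plus however many paid upgrades you choose to buy."""
    return licence_price + upgrade_price * upgrades_taken

# $30/month rental vs a $99 licence with two hypothetical $69 upgrades,
# both measured over the same five years
print(subscription_cost(30, 5))   # 1800
print(ownership_cost(99, 69, 2))  # 237
```

The exact numbers matter less than the shape: the rental line grows without limit, while ownership only grows when you actively decide an upgrade is worth paying for.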
You also have to consider rental costs in relationship to every other piece of rented softwar[...]
2016-07-17T08:24:22+00:00

Every few months, somebody in our industry will question why designers don’t use their talents to solve more meaningful problems, like alleviating illness, hunger or debt. This statement will often be illustrated with a story of how a team from IDEO or Frog spent three months in a sub-Saharan village creating a new kind of water pump, a micro-payment app, or a revolutionary healthcare delivery service. The implication being that if these people can do it, why can’t you?

As somebody who believes in the power of design, I understand where this sentiment comes from. I also understand the frustration that comes from seeing smart and talented people seemingly wasting their skills on another image sharing platform or social network for cats. However this simple belief that designers should do more with their talent comes loaded with assumptions that make me feel very uncomfortable.

Firstly, let me state that I think designers are a genuinely caring group of people who got into this industry to have some visible impact on the world. They may not be saving lives on a daily basis, but they are making our collective experiences slightly more pleasant and less sucky. They do this by observing the world around them, being attuned to the needs of individuals, spotting hidden annoyances and frustrations, and then using their visual problem solving skills to address them. As a result, designers are often in a permanent state of dissatisfaction with the world.

Designers also regularly find themselves overwhelmed by the volume of problems they are exposed to and expected to solve. This is partly down to the fact that companies still don’t understand the full value of design, and fail to resource accordingly. However it’s also down to the designer’s natural urge to please, often causing them to take on too much work and spread themselves far too thin.
The message that designers aren’t trying hard enough to solve the really big, meaningful problems taps into this deep insecurity, making them feel even worse about the lack of impact they are having than they already do. As somebody who cares about the industry, I feel we should be trying to help lighten the load, rather than adding increasingly difficult-to-achieve expectations onto an already stressed-out workforce.

I also worry about who gets to define what counts as “meaningful” work. For some people, meaningful may mean taking six months off to help solve the refugee crisis—an amazing thing to do, I’m sure you agree. For others it may mean impacting hundreds of millions of people by working at Facebook or Twitter. That may seem facile to some, but both these platforms have been used to connect isolated communities, empower individuals, and in some cases, topple regimes. So who are we to judge what “meaningful” means to other people?

Many designers I speak to do actually want to have a bigger impact on the world, but don’t know where to start. It’s not quite as easy as giving up your day job, travelling to a crisis zone, and offering your services as a UX designer. It turns out that a lot of the world favours doctors, nurses and engineers over interaction designers and app developers.

I sometimes feel there’s a whiff of Silicon Valley libertarianism tied up in the idea that designers should be solving the really big problems; the kind of things that universities, governments and NGOs have been struggling with for decades. There is also a sense of privilege that comes with this notion. While some designers may be in the position to take a pay cut to join an NGO, or invest their savings into starting a community interest company, that’s not true of everybody. Designers may be better paid than many in society, but they still have mortgages to cover, families to look after, and personal lives to lead.
By comparison, many of the people I see extolling these notions have been very fortunate in their careers, an[...]
Stories of mass underemployment due to the rise of Artificial Intelligence have been popping up all over the place over the past 18 months. It would be easy to dismiss them as crackpot theories, were it not for the credibility of their authors; from scientists like Stephen Hawking to industrialists like Elon Musk.
Self-driving cars seem to have gone from science-fiction fantasy to real-world fact in a matter of months, and the world’s transport workers are right to be concerned. Uber are already talking about making their drivers redundant with fleets of self-driving taxis, while various local governments are experimenting with autonomous bus services. However the real employment risk comes from the huge swathes of haulage vehicles which could be made redundant. This won’t happen soon, but I suspect our roads will be 30% autonomous vehicles by 2030.
While it’s easy to assume that AI will only affect blue collar jobs, as we saw with the automation of manufacturing, I’m not so sure. I’m currently using an Artificially Intelligent PA to book my meetings and manage my calendar. It’s fairly crude at the moment, but it won’t be long before internet agents are booking my travel, arranging my accommodation, and informing the person I’m meeting that I’m stuck in traffic. All things that are possible today.
Jump forward 20 years and I can see a lot of professional classes affected by digital disruption and the move to AI. In this brave new future, how will governments cope with rising unemployment?
One idea that’s been raised by both right and left is that of a Universal Wage. Put simply, every citizen would automatically receive a small, subsistence payment at the start of each month. This would be enough to cover basic expenses like food and accommodation, but it wouldn’t guarantee a high quality of life, so most people would still choose to top up their incomes through work.
Unlike unemployment benefits, people don’t lose their universal wage when they do work, removing a huge disincentive for many people. Instead it provides greater flexibility in the type of work people are able to do. For instance, carers could fit work around their caring duties, or students around college. As such, the Universal Wage supports the current trend we’re seeing towards the gig economy.
This may seem like an impossibly expensive solution, but various economic studies have shown it to be just about feasible today with only a marginal rise in tax. The reason the cost isn’t higher comes in part from the savings it would provide to the state. No more means-testing benefits, or policing infractions. Just a simple monthly payment for all.
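As a very rough illustration of why the headline figure overstates the real cost, here’s some back-of-the-envelope arithmetic. Every number below is a hypothetical assumption of mine, not a real fiscal estimate or a figure from any of the studies mentioned above:

```python
# All figures are invented for illustration, not real fiscal data.
adults = 50_000_000            # assumed adult population
monthly_payment = 600          # assumed subsistence payment per month

gross_cost = adults * monthly_payment * 12            # total paid out per year

replaced_benefits = 200_000_000_000   # assumed existing benefits it replaces
admin_savings = 10_000_000_000        # assumed means-testing admin saved
net_cost = gross_cost - replaced_benefits - admin_savings

print(f"gross: {gross_cost / 1e9:.0f}bn per year")
print(f"net:   {net_cost / 1e9:.0f}bn per year")
```

The point is the gap between gross and net: a large slice of the payout is money the state already spends on benefits and on the bureaucracy of means-testing them.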
The left love this policy for the social equality it brings. People can now spend their time in education and training, raising families and caring for loved ones, or exploring the arts. The right like it for different reasons; empowering individual entrepreneurship while simultaneously reducing the size of government.
Several Universal Wage experiments are taking place around the world at the moment, so it will be interesting to see what the findings bring.
2016-04-04T09:32:07+00:00

In a meeting a couple of weeks ago, one of my colleagues asked me to define “design thinking”. This question felt like a potential bear trap—after all, “design thinking” isn’t a new or distinct form of cognitive processing that hadn’t existed before we designers laid claim to it—but I decided to blunder in regardless. For me, design thinking is essentially a combination of three things: abductive reasoning, concept modelling, and the use of common design tools to solve uncommon problems.

If you’re unfamiliar with abductive reasoning, it’s worth checking out this primer by Jon Kolko. Essentially it’s the least well known of the three forms of reasoning (deductive, inductive and abductive), and the one that’s associated with creative problem solving.

Deductive reasoning is the traditional form of reasoning you’ll be familiar with from pure maths or physics. You start with a general hypothesis, then use evidence to prove (or disprove) its validity. In business, this type of thinking is probably how your finance department plans its budget, i.e. to generate this much profit we need to invest this much in staff, this much in raw materials and this much in buying attention.

Inductive reasoning is the opposite of deductive reasoning, using experimentation to derive a hypothesis from a set of general observations. In business, inductive reasoning is often the preserve of the customer insight and marketing team, i.e. we believe our customers will behave this way, based on a survey sample of x number of people.

By comparison, abductive reasoning is a form of reasoning where you make inferences (or educated guesses) based on an incomplete set of information in order to come up with the most likely solution. This is how doctors come up with their diagnoses, how many well-known scientists formed their hypotheses, and how most designers work.
Interestingly, it’s also the method fictional detective Sherlock Holmes used, despite being misattributed as deductive reasoning by Sir Arthur Conan Doyle. Abductive reasoning is a skill, and one that can be developed and finessed over time. It’s a skill many traditional businesses fail to understand, preferring the logical certainty of deductive reasoning or the statistical comfort of inductive reasoning. Fortunately that’s starting to change, as more and more companies start to embrace the “design thinking” movement.

So what else does design thinking entail other than abductive reasoning? Well, as I mentioned earlier, I believe the second component is the unique ability designers have to model complex problems, processes, environments and solutions as visual metaphors rather than linguistic arguments. This ability allows designers to both understand and communicate complex and multifaceted problems in simple and easy to understand formats, be they domain maps, personas, service diagrams or something else entirely.

All too often businesses are seduced into thinking that everybody is in alignment by describing complex concepts in language-heavy PowerPoint presentations, only to realise that everybody is holding a slightly different image of the situation in their heads. This is because, despite its amazing power, language is incredibly nuanced and open to interpretation (and manipulation). Some of our biggest wins as a company have involved creating graphic concept maps in the form of posters that can be hung around the office to ensure everybody understands the problem and is aligned on the solution. We call this activity design propaganda, and it’s a vital part of the design process.

A simpler incarnation is the design thinker’s tendency to “design in the open” and cover their walls with their research, models, and early prototypes. By making this work tangible, it allows them to scan the possibility space looking for un-made connections, and drawing inferenc[...]
2016-02-22T13:58:45+00:00

It’s understandable why we’re all so interested in fast-growing businesses, thanks to the drama involved. Will the company in question be able to raise the next round of funding, or will they hit the end of the runway in a ball of flames? Will they be able to hire fast enough to meet their demands, or will the culture implode on itself as people flee to Google or Facebook? Will the company fight off unwanted takeover bids and gain an even bigger valuation, or will they end up regretting not taking the deal? Will the founders end up as multi-millionaires, or just another Silicon Valley casualty? Each option is as juicy as the next, and equally deserving of comment and speculation.

As rapid growth is considered the yardstick of start-up culture, it’s unsurprising that the majority of how-to articles focus on the challenges of running a fast-moving business. So how do you embed culture when your team has grown from 30 to 100 people in less than a month? How do you ensure your infrastructure is up to the task when your user base is doubling every few weeks? And how do you keep the company going when your monthly payroll is in the millions, but your income is in the thousands?

These are all very interesting questions, and ones that will help many budding entrepreneurs. However as the founder of a deliberately slow-growing company, there is a real lack of articles charting the challenge of slow growth; and challenges there are aplenty.

For instance, when fast-growing companies hit problems around team management, marketing and sales, or HR, they can usually hire their way out of the problem. So they’ll create a new management layer, build a sales and marketing team, or hire HR professionals. The speed they are moving, combined with the funding they have raised, is often enough to power through these inevitable growing pains and get to the other side in one piece.
By comparison, slow-moving companies often have to live with these challenges for years, until they have the revenue or work available to justify even a part-time position. Until that point, slower-moving companies need to make do with the resources they have available: figuring out ways to self-manage, spreading sales and marketing across the management team, or using external agencies for ad hoc HR work.

That’s why smaller companies end up having to focus on their core commercial offering, be that building, maintaining and supporting software if you’re a tech start-up, or offering design and development services if you’re an agency like Clearleft. This means that the traditional business functions you’d find in a large company (finance, marketing, HR etc.) end up taking a back seat; either by being distributed across the whole team, or concentrated amongst a small number of operations staff.

Neither of these approaches is ideal. For instance, you can adopt a “many hands make light work” attitude by distributing common admin tasks across the team. But having experienced (and expensive) practitioners spend time on admin isn’t particularly cost-effective. It can also be a little demoralising, especially if colleagues at larger companies don’t have to do this. The other option is to centralise typical business functions amongst a small group of operations staff. This works well for general admin duties, but can be challenging when you start to need specialist skills like sales, marketing or HR. So in the end you just struggle through until you grow big enough to justify these additional roles.

In fast-moving companies, hiring new people is less of a cultural challenge, as the team are used to job descriptions fluctuating and new people joining all the time. In a slow-growth company, people get used to the status quo. It can be hard to relinquish part of your job to a new hire, or suddenly find yourself working under a mang[...]
2016-02-11T15:35:39+00:00

When we first started Clearleft 10 years ago, the bulk of my effort was focussed on explaining to clients what user experience design was, the extra value it offered, and why design needed to be more than just moving boxes around the screen. I’m pleased to say that it’s been a long time since I’ve had to explain the need for UX to our clients. These days clients come to us with a remarkable understanding of best practice, and a long list of requirements that contains everything from research, strategy, prototyping and testing, through to responsive design, mobile development and the creation of a modular component library.

I think it’s safe to say that the quality of the average digital project has soared over the past 10 years, but so has the effort involved. This isn’t unusual, and happens across all kinds of industries as they develop and become more professional. You only have to look at the advances in health care over the last 50 years to see the dramatic rise in quality. Back in my childhood, the most advanced diagnostic tool was probably the X-ray. These days a whole battery of tests is available, from ECGs to MRIs and beyond. The bar has been raised considerably, but in the process, so has the average cost of patient care.

Over the past few years I’ve seen client expectations rise considerably, but digital budgets have remained largely unchanged. We’ve done an amazing job of convincing digital teams that they need proper research, cross-platform support, and modular style guides, but somehow this isn’t filtering back to the finance departments. Instead, design teams are now expected to deliver all this additional work on a similar budget. I believe one of the reasons for this apparent lag is tempo. Despite the current received wisdom of continuous deployment, most traditional organisations still bundle all their product and service improvements into a single big redesign that happens once every 4 or 5 years.
Most traditional organisations’ understanding of what a digital product should cost is already half a decade out of date. Add to this the fact that it takes most large organisations a good 18 months to commission a new digital product or service, launch it, and then tell whether it’s been a success, and you have all the hallmarks of a terrible feedback loop and a slow pace of learning.

I think another problem is the lack of experienced digital practitioners in managerial positions with budget-setting authority. It’s relatively common for digital budgets to be set by one area of the company, completely independently from those setting the scope. Project scope often becomes a sort of fantasy-football wish list of requirements, completely untethered from the practical realities of budget. I couldn’t begin to tell you the number of projects we’ve passed on in the last couple of years because their budgets were completely out of whack with what they wanted to achieve; or the number of clients who have asked for our help when their previous project failed, only to discover that the reason was probably their previous agency agreeing to deliver more than the budget would actually allow. These organisations end up spending twice as much as they could have, because they wanted to spend half as much as was necessary: the classic definition of a false economy. Fortunately, once you’ve made this mistake once, you’re unlikely to make it again.

Speed of learning is hugely important. In fact, I think the organisations that will fare best from the effects of digital transformation are those who can up their tempo, fail faster than their competitors, learn from their mistakes, and ensure they don’t happen again. Basically, the standard Silicon Valley credo. It is possible to avoid some of these mistakes if you hire strategically. I’v[...]