More thoughts by Cennydd Bowles
Mobile is an ideological minefield. The insults are barbed and the indignation disproportionate. Take a stance on mobile product choices and design approach, and you’ll soon receive the vitriol of your opponents.
The mobile web vs. native app debate is still one of the most contentious. I find this daft. No one has this problem with native desktop apps vs. web apps. The same people who decry native mobile apps use Coda, Photoshop, and OmniFocus. Native enthusiasts use FreeAgent, Google Docs, and Basecamp without a second thought. In the desktop world, we already know that whether a native or web app is better depends on what it’s for.
I long for a similar rapprochement in the mobile world. Sloganeering and feigned fanaticism are fun, but reality is always more nuanced. Is our mobile future native or web? No. It’s native and web. Now let’s stop arguing, and build the damn thing.
Sure, “It depends” is a valid answer to any worthwhile question, but it’s also vague and patronizing. It’s far more helpful to explain what it depends on.
Process is for fast food restaurants. I’m much more interested in skill.
Skilled people without a process will always find a way to get things done. Skill begets process. But process doesn’t beget skill. Following a recipe won’t make you a great chef – it just means you can make a competent bolognese. Great chefs don’t need cookery books. They know their medium and their ingredients so well that they can find excellent combinations as they go. The recipe becomes a natural by-product of their work.
Sure, if you have a low-skilled team, or inefficiencies and costs are your top priority, process matters. But for knowledge workers, skill is more important. The best people don't care whether agile/waterfall/lean is the flavor of the month, or what job title they should hold. Instead, they care about practicing their craft, and being better than they were yesterday.
The French have a phrase for it. "Reculer pour mieux sauter": to step backward for a better run-up.
Once you've clambered up the learning curve, the easy terrain of the plateau is a blessed relief. But look closely and you may see another summit in the distance. If you want to reach it, you'll have to climb down first.
Sometimes the best option is to throw away what's made you successful. Your next ascent may be more punishing and even risky, but that adrenaline hit is intoxicating.
Web Design < Interface Design < Interaction Design < User Experience Design < Customer Experience < Service Design.
Seems to me that a lot of the drama over job titles stems from a classic outflanking strategy: wanting to appear more strategic than the other guy. We jostle for ascendancy in this supposed value chain, trying to appeal to more senior sensibilities in the hope of winning broader, richer projects. In the meantime, our language becomes ever more grandiose, our deliverables more abstract, and our expertise even harder to define.
Eventually we'll run out of abstractions, and this euphemism treadmill will creak to a halt. The only thing left then will be to judge designers by our work; by the things we bring into the world. Hallelujah.
As information is torn free of its moorings, and people expect services to straddle countless devices, we'll see a rise in the value of good, old-fashioned information architecture. Context, structure, content, and metadata have become key issues for every designer. Information architects, much maligned over the last five years, can surely allow themselves a wry smile.
Sustainable design matters even in the digital world. Our raw materials—information, effort, money—may not be mined from the earth, but they’re still in scarce supply. We have a duty to conserve them.
Responsive web design is essentially a sustainable design manifesto, reducing the need for clients to retreat to the drawing board with every new trend, every new device. RWD isn’t about designing for today’s technology: it’s about designing for tomorrow’s.
Predicting the future sounds daunting, but if you scan the horizon you can spot the silhouettes: pervasive high-density displays, increased use of touch and voice, users bouncing between multiple devices. We shouldn’t wait until these issues burst into the mainstream before considering their impact.
So are you making something that will live into the future? Or will your work just contribute to the digital landfill?
Each new app kickstarts the race to state an opinion. By the end of the day, the app has been hailed as the dizzying future or decried as a #fail-hashtagged folly. Entire industries have been deemed moribund or reinvigorated, and life will never be the same again.
First impressions are valuable, but they’re unreliable, too easily seduced by visceral tricks: appearances, implications, promises. Products should be experienced over time, with context, sobriety, and occasional tipsiness. Only once a product has started to wear a groove in our lives can we genuinely claim to understand what it means.
It’s taken me a while, but my RSS reader is starting to gather dust. I’ll admit it was a solution to at best an edge case – ‘show me everything these people ever write’ – but I liked having a mechanism that prioritised recall over precision.
But now I need filters, and my personal networks provide them. Networks encourage a kind of contextual meritocracy: strong, relevant ideas tend to bubble to the surface. This is, of course, why even the most technically minded companies are pursuing the social agenda. It’s heartening to know that humans can still outperform the machines from time to time.
If technology doesn’t go the way we want it to (conjure up any techno-dystopia you like) history will blame us. Instead of making technology easy, accessible, and pervasive, why didn’t we sabotage it at every turn? If only we’d destroyed the data stores! Smashed the search engines!
If the future historians are generous, they’ll put our mistakes down to mere naiveté. Or perhaps they’ll deem us traitors who collaborated with the enemy, our work an indictment of what happens when egos trump ethics.
If technology goes the way we want it to, the vast majority of people won’t ever understand what we did. As it should be.
I’ve found that if you want a rough indication of a designer’s experience, look at the time they spend on different stages of the design process.
Novice designers spend most of their time creating a solution, and maybe 20% refining it.
Intermediates split the time roughly evenly.
For senior designers, the ratio flips: 20% creating, 80% refining.
And the experts realise that creating and refining are actually the same thing.
The old maxims were gradually replaced, and new leaders appointed through the democracy of retweets. No books were burned, although a few were traded in on Amazon.
Our industry has undergone a genteel revolution. But listen closely and you may notice a more aggressive tone. The designers led us astray! The web has become bloated, and it’s only pure luck that mobile arrived to save us, deus ex machina, from our past excesses.
I prefer to recognise complexity rather than assume incompetence. No designer wants to add unnecessary elements, or another column of pageview-sucking linkbait. These designers know what’s good for users and the web. However, design is more complex than the slogans suggest.
Real design is political. It’s making the case for fluidity when the client isn’t confident enough to yield control of pixels. It’s persuasion, reciprocity, and benign skulduggery. It’s a constant negotiation between what the consultant sees as harmful and what the client sees as essential.
If the planets are aligned, it might just produce something the community deems worthy. But far more often, good design produces a modest improvement, a compromise, or a disaster narrowly averted.
So let me speak on behalf of the designers who are accused of causing this apparent mess:
But it’s not that easy.