Stephen P. Anderson explores the interaction between design and psychology, topics he loves to speak about at national and international events. Prior to becoming an independent consultant, Stephen spent more than a decade building and leading teams of information architects, interaction designers, and UI developers. He has designed web applications for businesses such as Nokia, Frito-Lay, Sabre Travel Network, and Chesapeake Energy, as well as numerous technology startups. He created the Mental Notes card deck, a tool that's widely used by product teams to apply psychology to interaction design. He’s also the author of the book Seductive Interaction Design, which deals with the question our whole community is obsessed with: "how do we get people to fall in love with our applications?"
A few years ago, I started adding a rather odd section to my proposals. In addition to the usual stuff (who I am, their problem, what I’m proposing, pricing, and so on), I added a section entitled:
“Questions I’m Curious to Explore (Why this is personally exciting!)”
I did this for two reasons:
One, it doesn’t hurt to demonstrate that your interest in working with them is about more than the money. Letting potential clients know that you’re really excited about the problem space, and why, is a great way to set yourself apart from other proposals they may be entertaining.
Two, this was a good filter for me, personally. Answering the question “why do I want this project?” forces you to look beyond financial compensation. I’m keenly aware of how the projects I choose to invest my time in, in turn, either advance, hinder, or alter my own learning trajectory. I’d like to be intentional about the work I choose — not just for what it is, but how it affects me: Emotionally. Financially. Timewise. What will I learn and who will I meet? What will I bring into the world? How will this affect people? And how does this play into my personal narrative? Put into perspective, a six-month project is a good chunk of my life. I want to live my life with intention. I’ve turned down work — good work — because it didn’t pass the “Questions I’m Curious to Explore” test.
For the projects I do work on, once I’m a few weeks or months deep, it’s a nice reminder — especially when things are tense — of what I (personally) hoped to get out of this investment of my time. I can re-evaluate: Is what I wrote down still important? Are we on track? Am I learning new things that I didn’t anticipate?
Of course, being able to complete a “Questions I’m Curious to Explore” section means there’s already a bank of things you want to learn more about, whether it’s new skills, something you’d like to test, or something else.
One theme that’s emerged for me over the last year is just how important learning, discovery, play — being curious — is for me. This is a theme I’ve seen show up again and again in each post I wrote and each presentation I gave. I value curiosity, a lot. Not just for me, but for everyone. As a species, we are born curious. We are hardwired to grow and discover the world around us.
So I’d like to ask you a question:
“What are you curious about? What do you want to know more about by this time next year?”
Whether it’s related to your work, or some personal interest, write down something (or a bunch of somethings!) you’re curious about. And then, pursue that topic to the point of exhaustion. Make something based on what you’re learning. Share what you’re learning with others. Engage in conversations about what you’re learning. You’ll discover a quiet kind of satisfaction that comes from a challenge faced, and mastered.
What are you curious about?
Like many who have come into this thing known as “UX,” I’ve never had any formal training. Except for a few art and design classes at my university (back when graphic design meant cutting ruby sheets by hand!), I’ve never been trained in web design, information architecture, usability, psychology, data visualization, product strategy… Everything I know I’ve picked up on projects, through a lot of reading and experimentation, and by working with lots of varied and talented people. At times, I’ve struggled with this lack of training, asking myself “Am I a fraud?” or “Am I going about things in the right way?” And I’ve certainly been jealous of those younger than me who’ve been able to take brilliant and inspiring courses from MIT, Stanford, or Carnegie Mellon. But… I’ve come to realize that it is precisely this lack of formal training, combined with a natural curiosity, that has led me to invent my own ways of doing things and form my own opinions about things. A reliance on a perceived “authority” can actually create blind spots in our thinking. We don’t see opportunities, or worse — we dismiss them before they have a chance to introduce themselves.
Creative ideas happen through unexpected combinations: someone formally trained in film becomes a certified chef so he can create a new kind of cooking show. Someone else decides to chart his health patterns not with charts and graphs, but with a data visualization he designs based on patterns in nature. We tend to think in particular ways and in a particular direction — often the one taught to us. But, history shows us that both incremental and giant leaps forward happen at the intersection of different — seemingly unrelated — ideas.
I’m finally coming to realize the upside to not having been formally trained in anything I do; I might even say I’m proud of the fact that I’ve had no formal training or certifications. It’s the discovery mindset that allows us to combine and create new things, things that in many cases go on to become “how things are done.” But hear me on this: How things are done is that we listen, observe, and draw on our experiences (and those of the people around us) to identify and solve real problems. How things are done is not about the resulting processes and tools that we see coming out of the very real process of inquiry. How things are done starts with curiosity.
By coming at things sideways, without the safety rails of training, I’ve been forced to explore uncharted spaces, creating along the way my own map of the landscape. I’ve never known to not combine things in a certain way or that a particular tool is the “right” one for the job. I get to make stuff up as I go along! And since I’m always learning, I guess I’m always making stuff up. In truth, we all are, or all should be.
At this point, rather than envy, I pity those who are trained in the “right way” to do things. It’s difficult to see things from different perspectives when you’ve decided — consciously or unconsciously — to follow a particular path through and around a problem space. For a field as nascent as ours, there is no right way to do things. We’re all stumbling into the future.
Years ago, I discovered a nifty little social hack: Get people talking about themselves and they’ll like you more. Weird, I know. But as human beings, we tend to like others who are interested in what we have to say.
Of course, I escalated this into a game where, when confronted with someone I’ve just met or with whom I suppose I’ll have nothing in common, I see how long I can keep this conversation going without once ever talking about myself. The beauty of this is all the interesting stuff I’ve learned, stuff that I didn’t know before. Like when I learned (from a Bioengineering PhD) how living in the weightless conditions of space leads to bone loss, a big problem for astronauts who spend extended periods of time in space. Fascinating — I had no idea! Or my friend who was assigned to Presidential Guard duties while in the Marines — imagine the stories he has shared (at least those stories he can share!). Oh, the stuff you can learn about people and the world.
But, something else happens along the way. In my case, I’ve begun to see things differently.
In most conversations, around a subject in which we all share an interest, the conversation is much like a game of tennis: ideas and statements get volleyed back and forth. We’re looking for keywords and ideas that we can respond with. And if we’re diligent, we may even try to work their ideas into our own world view. This is assimilation. New ideas get layered into our existing world view.
But this is limited. We can only see things through our own perspective.
We hear all about “building bridges” with people who have different perspectives than our own. But, if all we’re doing is assimilating information, we don’t really grok perspectives other than our own.
So what’s the alternative?
My little social game, where I get people to talk about themselves, has a nifty little side effect. By listening intently, especially concerning subjects for which I have no knowledge or vested interest, I’ve become better at listening. And learning. Then, when it comes to a topic I do have some interest in, I’m learning how to suspend my own judgement. Conversations are no longer about the volley. The only “volley” on my part is a relentless questioning, born of curiosity. And in the process, I am no longer concerned with assimilating information. Rather, my goal is to see things the way that person sees things.
And here’s the magical thing that happens next…
By truly understanding a perspective other than your own, you’ll end up with ideas that don’t fit into your worldview — ideas you can’t simply assimilate. At this point, real change happens; you begin to accommodate these new ideas. Your internal world has to adapt itself to contrary ideas. Or, where there is disagreement, you can articulate the flaw, whether it’s in your understanding or theirs.
We’ve all heard that disagreements are born out of misunderstandings. But, it took learning about assimilation and accommodation for me to really understand — truly understand — how to get around these misunderstandings. Now when someone disagrees with, say… a design decision I’ve made, rather than jump to defend that decision, I jump to inquisitiveness. I want to first see things the way they do, to be able to hold in my head two different perspectives, so the way forward can become obvious. Then, we can reconcile the differences. Most resistance, whether to a new design or a foreign idea, is born out of ambiguity.
In research, we're told to listen. But it’s far too easy to filter what we observe through our own mental model — we do this without being aware that we are doing so. Thinking about accommodation has helped me to suspend not only judgement, but assumptions. I approach each research conversation with a blank slate, eager to learn what I don’t know.
So, here’s my challenge: This week, as you engage in normal conversation, practice some meta-cognition. Stop and think about your responses. Are you seeking to assimilate or accommodate new information?
If I had to hire someone based on one trait, it’d be curiosity. Curious people ask the “How…” “Why?” and (most importantly!) “Why not…?” questions. This desire to learn and make sense of the world is what leads someone to see what others haven’t — whether it’s seeing the unseen things that are broken or new opportunities yet to be seized. Sure, once a subject is understood, these individuals get bored easily and move on to the Next Big Thing. But while they’re learning, they are the passionate student that every teacher desires, devouring every bit of information and giving themselves over to this new endeavor. Add to this the wealth of experiences the curious mind has accumulated, and you have a candidate who stands out from the rest.
For the last several decades, we’ve created varieties of the same CRUD tool. Whether it’s writing a Word doc, posting to Facebook, or checking in with Foursquare, these are all tools for Creating (and Reading, Updating, Deleting) information. Add to this all the ways to passively collect data — your mobile phone alone collects more than 700 points of data per day — and we’ve got a curious situation: too much information and no good way to make sense of it all. We need new tools for understanding this information. Tools that let us explore, evaluate, and synthesize information. Tools that support cognitively complex activities.
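The four verbs behind every such tool can be sketched in a few lines. This is a toy illustration of the CRUD pattern only — the `NoteStore` class here is made up for this sketch and isn't any real product's API:

```python
class NoteStore:
    """A toy in-memory store exposing the four CRUD verbs.

    Whether the artifact is a Word doc, a Facebook post, or a
    check-in, tools like these boil down to the same operations.
    """

    def __init__(self):
        self._notes = {}   # id -> text
        self._next_id = 1

    def create(self, text):
        """Create: store a new item and hand back its id."""
        note_id = self._next_id
        self._next_id += 1
        self._notes[note_id] = text
        return note_id

    def read(self, note_id):
        """Read: retrieve an item by id."""
        return self._notes[note_id]

    def update(self, note_id, text):
        """Update: overwrite an existing item."""
        self._notes[note_id] = text

    def delete(self, note_id):
        """Delete: remove an item entirely."""
        del self._notes[note_id]


store = NoteStore()
nid = store.create("Checked in at the coffee shop")
store.update(nid, "Checked in at the coffee shop (again)")
```

Notice what's missing: nothing here helps you compare, cluster, or make sense of what you've stored — which is exactly the gap the paragraph above points at.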
When people ask me about “what’s next in design?” I think about this problem of too much information. New planets are discovered nearly every week, it seems, while back on Earth companies are scratching their heads trying to figure out what to do with all their customer data. We have scientists looking for patterns in strands of DNA. From cinematography to medicine, we’re getting sharper details — what used to be a few x-rays on a lightbox is now a terabyte of data mapping your entire body. Add to this the growing number of people monitoring their own health patterns, social interactions, and all sorts of minutiae, and we’ve got some thrilling challenges waiting for us! How are we going to help turn information into understanding?
I favor visualizations as a way to make sense of it all, if only because vision is our most highly developed sense, capable of picking out minute differences in detail. Much has been written about the power of images and pictures to aid understanding. But there’s more to come than infographics and data visualizations displayed on a screen. Consider how coupled interaction is with thinking. We don’t necessarily think and then do. We also think through doing. When you hold a chess piece in mid-air, considering possible moves, the board has become part of your thinking space. You are using the chess board to extend beyond the limitations of your short-term working memory. In this way, the environment outside of our bodies helps us to think and understand. Karl Fast says it best: “the problem space is now partly in the head and partly in the world, with interaction linking and blending these two spaces together.” By interacting with this external representation we have the potential to learn more deeply and to see more options, as with the chess player evaluating possible outcomes. This is partly why I favor interactive visualization. But what happens when interactions involve far more than a simple mouse click or tap of the finger? What happens when communication, movement, motion, position, and other forms of interaction become part of the designer’s toolbox? That, for me, is a thrilling future. New forms of interaction helping us make sense of the world around us… CRUD tools — and, to be fair, tools for sorting, filtering, and searching content — are going to seem crude when compared to whatever comes next!
Dear [Insert name],
Unfortunately, I will not be able to attend this meeting.
As someone who is paid to create value for this organization, I feel it is my ethical obligation to decline participating in meetings that have neither (a) a stated purpose, nor (b) a clear agenda, as it is well documented and agreed upon by most leadership teams that the absence of this intentional thinking costs companies billions of dollars every year [insert link to most recent source]. If you wish for me to attend, please send a second invitation with this information clearly stated so I can accurately estimate the value of my attendance.
So when is the best time to write a book (or give a presentation or start a blog or…) on a subject you're particularly interested in? While you're learning! Putting words on a page isn't a commitment (trust me). You'll be able to add to, edit, and refine your ideas along the way — but only if you take the time to write them down. Also, there's a certain amount of fidelity to your thinking that fades over time; write in the moment and you won't lose the burrs and barbs that stick with readers. And, if you share what you're learning with others along the way, all sorts of people and projects will present themselves; you'll have more information and learning experiences than you could have imagined — all while you're curious and enthusiastic, all while you're a student of your subject.
But, if you wait until you're an expert (which you'll never feel like you really are), one of several things will happen: One, everything you've learned will seem mundane and not worth writing about. Two, you'll be so bored of the topic that writing a book on the subject will be the last thing you'd ever want to do. Three, your interests will have led you to new, entirely different subjects. Or four, your interests will have led you to a new perspective from which it's no longer possible to write about the things you learned.
I'm proud of the book I wrote, and the card deck I self-published. Others have found them quite useful. But, had I waited until I felt competent about the subject (psychology and design), these things would never have been created; my ideas would have never taken a form that could be shared with others. These things would have been another casualty in the lineup of ideas I'll never work on.
It's human to create and learn. Doing isn't a commitment, it's just a step that keeps your ideas in motion and your options open. What are you working on? What are you learning? What are you creating?
Lately, there's been some concern within the UX community about folks doing crazy things like skipping the critical IA step or ditching wireframes to go straight to hi-fidelity comps. Behind this is an implicit accusation: By choosing to ignore traditional UX methods and deliverables, you're not really practicing good UX.
So... I think there's a difference between skipping a phase versus internalizing a phase. As young students, we go through a very formal writing process in order to learn the skills needed to be a good writer; I doubt very seriously that any of us go through that same, explicit process as mature writers. We've internalized those things we were taught. I've found the same true of my work, where frankly it might appear that I am skipping "steps," and am going straight to visual design, but I (personally) no longer have the need to expose these steps. Moreover, by going straight to a screen, people (stakeholders and users) are able to experience the IA & IxD by way of something pretty darn close to the final experience. This results in far better feedback than I ever got from abstractions like site maps or wireframes. One caveat: this "internalized" approach doesn't scale to very large projects where the complexity is much greater (say, a very large site with tens of thousands of pages). But, for most small web sites and all web apps, this integrated approach has worked great; I've traded steps for multiple rounds of iteration, which allows for much more learning and feedback early on. Of course, the approach I'm describing assumes some prerequisite amount of experience...
Still, some will say that we must tease apart a project into discrete, isolated steps to get proper feedback on just the structure or just the interaction concept or just the look and feel. I say this is rubbish. Human beings don't think about content separate from presentation separate from structure separate from [fill in the blank]… We experience the world around us as one integrated whole. By insisting that we create these artificial distinctions, we confuse more than help. Take wireframes: We've all heard clients ask "Is this what it's going to look like?" This should be a clear signal that this artifact isn't working. As human beings we experience the world around us using all of our senses. Asking someone to comment just on the interaction or just on the structure — independent of the other pieces — is a bit like asking someone to judge a chocolate chip cookie based on only a handful of ingredients. "Here, these are the wet ingredients (eggs, sugars, vanilla) — what do you think of this cookie?" How can we possibly expect to get good feedback on such an incomplete experience?
I'd argue for an integrated, holistic approach to UX that serves up as complete an experience as possible, as early on in the process as possible. I'm talking days, maybe even hours in some cases. This is not so we can be done more quickly, but so that we can use this newfound time to iterate more frequently with actual users, leading to better, more user-focused experiences. This approach is both more efficient and more effective. And hasn't this always been the goal? To build value for an organization through the design of useful and desirable customer experiences? Why should we settle for a process that puts artifacts and process ahead of experiences? As with writing, going through the motions of UX is for the rookies. I'd rather do whatever it takes, even if it means getting out of my comfort zone and learning a new approach or skill, to create the experiences that actually improve people's lives.
What you're reading on this page may not be what I intended when I wrote this paragraph. A careless or careful choice of words, the thoughts that preceded you reading this, knowledge of the author, even the font displaying these words — all these factors are working together to create a concept in your short-term working memory. What you come to understand is based on what you perceive.
Given this knowledge, I cringe when people say "It's all about content". No, it's not. It's about perceptions and memories, which are continually constructed and reconstructed with every new bit of sensory input.
If I were to give you a fine piece of artisan chocolate (content), your judgement and reaction would be based on far more than the quality of the chocolate alone. In mere seconds you'd be recalling memories of other chocolates you've tried, adding in your estimation of me as a chocolate connoisseur, evaluating how this chocolate is packaged, factoring in the origin of this piece of chocolate, considering what the shape and color of the chocolate reminds you of — in seconds, your brain would make a staggering number of conscious and unconscious associations. And your enjoyment of the chocolate? It is, according to numerous studies from psychology and neuroscience, based on all of these associations. Your experience of the chocolate is based on far more than just the chocolate.
The same is true of online content. Whether we're talking about text, photos, or something else, these things do not exist independent of some form of presentation. And the experience you have with that content is always situated inside of some larger context.
Why is this important?
Think about how many people respond before reading past the first sentence of an email, or how content displayed in a creative magazine layout doesn't get the same reaction when displayed in HTML. Or how the simple addition of "Sent from my iPhone" allows us to be more forgiving of a terse email reply. These aren't content issues. These are perception issues, of which content is a part.
Isn't this what our "user experience" work should be about: how people experience and respond to the stuff we put into the world? Why are we so quick to place a premium on one discipline over another? Why do our processes place one discipline farther upstream, ahead of another? And why do we stop at providing content and graphics without asking "why" or what comes next in the experience? As human beings, we experience the totality of things working together for some intended purpose. The piece of chocolate, the remarkable web site. How much better off would we be as a profession if we focused less on the defense of these isolated things (content, graphics, interactions, code) and more on the experience people have with the sum total of these things?
We're given mounds of data when all we want is the answer to a question: Which car should I buy? What's the best treatment for this illness? Which software will be best for my business? Instead of answers, we get search results, lists, spreadsheets, dashboards and other collections of data that do nothing to help us with the sense-making process. And while more and more data is made available to us, our capacity to hold these ideas in short-term memory has not changed. We need tools to help offload the mental tasks of understanding and identifying relationships. Why? So that our short-term working memory is free to make better, more informed judgments.
Data visualization is a step in the right direction, but these impressive feats of engineering tend to overwhelm most people. And infographics, while great for engaging people emotionally and making sense of a complex topic, are designed for print and not sufficient for large, dynamic datasets. We need something in-between, something engaging and dynamic, some visual representation designed around the content it is meant to serve. Shopping for a point-and-shoot camera? Why settle for search results or a data grid? Why not show dozens of options arranged in such a way as to reveal something about each camera relative to the other choices? What might this screen look like? Moreover, what will the Web look like as we start to pay attention to the content being served up by these decade-old UI patterns?
For this to happen, we need skilled visual designers—those individuals who excel at communicating ideas in powerful ways—who are also excited to work with content that is liquid and unpredictable. This is still a new set of skills, but a set of skills that will be in high demand as we look for new ways to deal with too much information and a shortage of clear answers.
I love the idea of turning work into play—or even a game. But, as a former educator, I've been troubled by something: Should we be making meaningful games out of ordinary problems, or should we teach kids how to "see" ordinary problems as fun challenges? If we succeed at the latter, we all benefit from a game that never gets old.