Jeff Eaton

Jeff is a Senior Digital Strategist at Lullabot. He has more than two decades of diverse experience in publishing, enterprise software development, and the web. He's built ecommerce sites for florists, billing systems for multinational corporations, and supply-chain automation tools for billion-dollar industries using technologies from Perl to ASP.Net. At Lullabot, he's designed and implemented large-scale web platforms for clients including Sony/BMG Music, Fast Company and Inc. Magazine, World Wrestling Entertainment, Verizon Wireless, Harvard University, MSNBC, and more.

When he's not writing or speaking about multichannel publishing, structured content, and the business value of streamlined editorial workflow, Jeff hosts the Insert Content Here content strategy podcast. In the Drupal world, he's best known as a co-author of O'Reilly Media’s Using Drupal; author of the popular Voting API, EVA, and Token projects as well as dozens of other plugin modules; the primary developer of Drupal’s core TokenAPI and a co-developer of FormAPI; and member of the content advisory committee for

You can follow Jeff @eaton.

Published Thoughts

Well, then.

It's been a hell of a year in a lot of ways, both good and bad. I've never liked retrospectives, and I promise I'll steer clear of lists and reminiscing. If there's one thing I've learned from this year, it's this: one of the most important things I can do is find other people who care deeply and join voices with them.

When I started posting here on The Pastry Box a year ago, I'd fallen off the blogging train. I spill tens of thousands of words each month for my work: client briefs, project audits, company communication, and technical articles all add up. But I'd let the other things, like writing about personal growth and social issues and ethical dilemmas and the things that move my heart as well as my head, languish. Being given the chance to participate here changed that, and it changed my year for the better.

Collaboration—whether it's a group blog, an indie album, or a protest march—is about all of us doing something more than any of us could have managed individually. Contributing to The Pastry Box made me want to do better, because I was surrounded by people writing great things. At the same time, it meant that I didn't have to carry the burden single-handedly. Most days, I could come here to read, listen, and learn from others.

This idea—the importance of finding people who care about things that matter, listening to them, and joining in—isn't just about blogging. If there's one lesson to take into 2015, that's the one I choose.

Our world is full of exhausting and difficult challenges if we go at it alone. Find people you can support, and who can support you as you do the same. Tackle the big problems together.

Happy 2015.

Thanksgiving was pretty good this year. It was relaxing, full of excellent food, and conversation with family was pleasantly uneventful. Leftover pie was eaten, early Christmas gifts were ordered online, and lazy afternoon hours were whiled away with good books and naps. This is interesting, because not more than seventy-two hours ago, my family and I were shocked and outraged at the news that Darren Wilson, the Ferguson, Missouri police officer who shot and killed a black 18-year-old, would not be facing a trial.

It's been a tough couple of years for those of us who thought that racism was a remnant of the distant 1960s, who liked to think of ourselves as modern, open-minded, and "colorblind." A black man shot and killed by police while shopping. A 12-year-old black boy shot and killed by police while playing with a BB gun. A black father of six choked to death for selling untaxed cigarettes. An unarmed black motorist beaten and framed by police. Eight black women sexually assaulted by a police officer.

The horrifying reality is that it's easy to lose count of these stories.

These are not fantastic holiday conversations, but they are reality. Blacks in the United States are far more likely than whites to be killed by police. When their deaths are noticed by the media, the search for "proof of sin" kicks off immediately. These are not new developments; our country's history of racism stretches from its earliest days to the present. Beyond the obvious, embarrassing stuff from the history books, our past and present are filled with subtle biases, abuse by those in power, and carefully constructed systems of discrimination.

It's those systems that are the most insidious: no matter what challenges I face and difficulties I must overcome in my life, I am a white man. I will never be subject to the same suspicions and assumptions and demands for justification that are so common—and sometimes deadly—for blacks in our nation. I can rest easy, knowing that the color of my skin will not associate me with the New Hampshire Pumpkin Festival Riot. If my white neighbors and I were to destroy our neighborhood, we'd most likely be called "revelers" and the destruction characterized as "mischief."

I can object to the evils of racism, but I must also face the truth: I benefit from systems designed to give me a buffer of trust and protection that black men and women in the United States do not enjoy. I live in a world where racism is often treated with less seriousness than my discomfort at being reminded of it. As Jon Stewart (a fellow white guy, naturally) said, "If you're tired of hearing about racism, imagine how tired people are of experiencing it."

And so, here I am. Eating Thanksgiving pie, contemplating the future, and reflecting on the profound luxury of ignoring terrible things when they feel overwhelming. It's easy for me to take comfort in the quick hits of outrage that spring up when a new injustice, a new horror, reaches social media or the nightly news. "I don't support that! It's terrible!" But days pass, and as outrage gives way to exhaustion, the temptation to "move on" is strong. After all, I spoke out! I tweeted.

As we approach 2015 and plan for a fresh new year of study and achievement and creativity and connection, that's the challenge for those of us who have the luxury of looking away. Every day cannot be a protest, but every day has opportunities to change ourselves and the world around us. It is up to us to educate ourselves, to listen to people whose experiences we are allowed and encouraged to ignore, and to respond with humility and grace when we get it wrong. It is up to us to learn what we can do in the time between the headlines.

Building Beyond Our Means


I work for a distributed company. We have no central office, our infrastructure is almost entirely digital, and we log a lot of time in Google Hangouts. There are lots of upsides to this approach, but the downside is familiar to anyone who’s ever worked from home for a week: staying in touch with distant co-workers and team members takes real work. When everyone is camped out in a home office or a coffee shop, you can’t count on casual walks down the hallway to reveal morale problems or frustration.

We have a lot of ways to keep those lines of communication open, from IRC to daily status updates to one-on-ones and more. But the “ambient mood” of the company is still tough to gauge without a lot of legwork. Even worse, the times when you most need that information are the crunch times that make the extra effort most difficult.

A few years back we started a skunkworks project to solve that problem. Using a couple of simple inputs (a mobile-optimized site and a bot listening in our IRC channel), we’d let everyone in the company record their mood on a scale from “Crappy” to “Awesome” whenever they liked. With the sharing-barrier reduced to a single button-press or a one-line shout in the company chat room, we figured, we’d get a lot more useful data. And given that data, we’d be able to expose a simple-but-effective “mood board” for the company’s directors.

Perhaps, we thought, we’d even be able to anticipate problems. If Frank’s mood always plummeted during migration projects, but Suzanne’s always skyrocketed, we could take it into account for future assignments. And if Edward’s mood began to steadily slide over time, we could check in to make sure he wasn’t stranded without a listening ear. Science! Statistics! Utopia!

An enterprising co-worker jumped in with another idea. Instead of requiring everyone to manually post a mood announcement, why not go the next step? Sentiment analysis of our existing IRC and Yammer streams could intuit the level of negativity, cheeriness, or depression inherent in their existing communication. Furious whiteboarding ensued, culminating in a proposed plugin architecture and some research assignments. It wouldn’t be perfect, obviously, but if we counted on people to generate their own updates, they might stay quiet instead of sharing when they were down. If the sentiment analysis angle worked, we thought, our tool might even be useful to other companies navigating the challenges of distributed teams.
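The sentiment-analysis idea can be sketched in a few lines. To be clear, this is a hypothetical illustration: the word lists, function names, and scoring scheme below are invented for the example, not taken from the actual project, which was evaluating real natural language parsing libraries rather than rolling its own.

```python
# A minimal sketch of word-list sentiment scoring over chat messages.
# Everything here (word lists, names, the +1/-1 scheme) is illustrative only.

POSITIVE = {"awesome", "great", "love", "happy", "excited", "thanks"}
NEGATIVE = {"crappy", "tired", "stuck", "frustrated", "hate", "ugh"}

def sentiment_score(message: str) -> int:
    """Naive per-message score: +1 for each positive word, -1 for each negative."""
    words = message.lower().split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

def team_mood(messages) -> float:
    """Average the per-message scores into a rough 'ambient mood' number."""
    if not messages:
        return 0.0
    return sum(sentiment_score(m) for m in messages) / len(messages)
```

A real implementation would need stemming, negation handling ("not happy"), and sarcasm detection before its numbers meant anything, which is part of why we leaned toward existing NLP libraries.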

The technical side of things was humming along nicely. One of our devs had a few days of slack between assignments and was building out the backend and evaluating natural language parsing libraries, a designer was throwing together working wireframes, and I was taking a quick refresher on basic statistics. One of our PMs started nailing down the functional spec, and conducted some research into possible markets and competitors.

And that’s when we killed it.


It turns out that there was a real market for those kinds of tools. Unfortunately, some of the use cases weren’t as cheerily benign as we’d imagined. It’s already reality in the world of retail, where adjectives like “engaged,” “friendly,” “helpful,” and “enthusiastic” are non-negotiable requirements for every customer interaction. Modern technology allows businesses to build the data-driven, optimized endgame of Office Space’s infamous “fifteen pieces of flair.”

We took a stab at brainstorming ways to horror-proof the tool: scaling back the automatic monitoring of conversations, anonymizing the data, and so on. At the end of the day, though, all we could say was that we wanted to use the tools responsibly. For the good of the team, and all of its members. If it was released as an open source project or a monetized product, we couldn’t deny that it would enable, well… soul-crushing dehumanization.

And so, we stopped working on it.

This sort of dilemma isn’t a new one. Open source developers have grappled with the fact that their creations can be weaponized, and the fantastic potential of the social web also facilitates troubling government surveillance. Questions abound, and answers are few.


In the years since our skunkworks project was shelved, other startups have rolled out fully-developed tools that serve similar needs. They launched and found funding and integrated privacy controls and held webinars. They did the hard work of building and iterating and shipping and selling, and they deserve the rewards.

Telling this story isn’t an attempt to imply that those teams were immoral or unethical. We didn’t kill our mood-tracker because we felt that doing it right was impossible. We killed it because we knew we didn’t have the time or resources to do it responsibly. It was a skunkworks side project, something we started to scratch an itch and learn a new framework. Tackling the kinds of problems we unearthed would have stolen time and resources from our successful client work. And sinking in that time would only have made pulling the plug harder later if we couldn’t come up with a good solution. It seemed better, at the time, to make the call early and leave the problem to those who could dedicate their energies to it.

Tools and data may aspire to technopian neutrality, but it’s extremely difficult to draw technical lines around morally and ethically problematic applications of them. Those of us who cut our teeth on programming and web development in less hyper-connected times can have trouble remembering that. For many of us, these technologies are all about freedom, exploratory tinkering, the thrill of discovery and creation.

Those creations, though, don’t stay locked on the family computer or a 5.25" floppy the way they did in the old days. The tools we build and the products we create and the projects we work on are used by billions of people around the world. Together, we affect their lives whether we want to or not—whether we intend to or not. Facebook recently stumbled into a hornet’s nest when it revealed mood-altering experiments it had conducted on its users, and smaller-scale experiments by the OkCupid dating site drew fire for similar reasons.

I think both projects were terrible, but looking at my own experiences, I understand how easy it is to get the ball rolling. Often, the hardest part of preventing unethical projects isn’t waving a red flag. Rather, it’s realizing that we’re in over our heads—that we don’t have the resources to examine the issues thoroughly enough or spot the possible problems before the deadline arrives.

That capacity to grapple honestly with the ripple effects of our creations is difficult to quantify, but profoundly important. I like to think of it as a sort of ethical budget, and that metaphor can help me account for the ebb and flow of my own time and energy. I might be able to weigh the impact of a new “block user” button, for example, but this month I’m just too swamped to give updated privacy controls the attention they deserve.

I don’t have a crystal ball, and I’m no better than anyone else at predicting the problems my work can cause. I know how difficult it is to say “no” to compelling projects, or put the kibosh on demo-friendly features. Sometimes, though, it’s necessary. Pushing ahead, assuming someone else will sort it out after launch, is a way of blowing through that ethical budget. We owe it to ourselves, to our users, and to our world to stop living beyond our means.

When I stumbled into the world of web development in the mid-90s, “state of the art” meant table-based layouts, JPEG image maps, and the occasional CGI script. They were dark times, full of spacer GIFs and half-tested regular expressions.

CSS, better HTML standards, and the maturation of server and client side frameworks have all made life better, but there’s a lot that’s stayed the same. Saddest of all, the contentious divide between “developers” and “designers” is still a fact of life in many organizations. I’ve worked in marketing agencies and web development shops; played negotiator between PR departments and IT teams; and watched as weary Perl developers argued with frustrated Photoshop jockeys. It’s never pretty, and the fonts never, ever look quite right.

Like many others who built web sites in those early years, I learned to preach the gospel of collaboration. If developers studied the principles of UX, if print-savvy designers sank their teeth into HTML, if everyone stretched a bit, we could overcome so many problems! Every year, more and more people nodded their heads in agreement—but the cultural divide still felt like a wide one.

Earlier this month, I had a chance to eavesdrop while a friend taught a class of MFA students about the value of cross-discipline communication. They were studying to become UX specialists, and the eminently practical topic that day was “working with web development shops.” I recalled the many frustrated negotiations I’d witnessed over the years, and was curious to see how open these students were to the realities of production web projects.

What I heard startled me: the divide I had spent years trying to bridge didn’t even seem to exist. They wanted to work closely with coders, and many of them straddled the line between design and development themselves. One scrappy designer wanted to talk about the value of better editorial tools for CMS projects: how willing, he asked, were real-world clients to invest time in that work? After class, I heard a cluster of students discussing the relative merits of PHP and Ruby while another asked for advice about a design internship.

There’s an old saying that “generals always fight the last war—especially if they won it.” In the years after World War I, French politicians and military leaders obsessed about the Maginot Line, a fortified wall of bunkers meant to keep the nation safe from German soldiers. They prepared for The Previous War: Round II, but failed to anticipate the mobile, fast-paced combat of WWII.

Listening to that room full of students wasn’t exactly a blitzkrieg moment for me, but it was definitely a shock. They were already true believers in a message that colleagues and I had struggled to communicate for years. And while there are still many projects and agencies where the development/design divide is a serious problem, this generation of cross-functional creatives seems to have moved beyond it. When they enter the working world they’ll have to deal with entrenched systems, but the skills and perspectives they’ve already developed will help pull those organizations out of deep ruts.

When the class finished up and assignments were distributed (via Dropbox, naturally), I shared a laugh with my friend about how different my expectations had been. In the day-to-day grind of enterprise project scoping and deadline-driven tweaks, it can be easy to forget that things change. We’ll probably never get rid of the crazy deadlines, and IE6 may never truly die, but for an old-school web nerd like me, it’s great to remember that even the most frustrating problems can get better.

Parlez-vous de l'argent?

I think one of the toughest challenges for passionate developers, designers, and writers is learning to speak business.

I don’t mean there’s an actual language barrier—although sometimes boardroom and managerial jargon can be daunting. What I mean is that most people who care about what they create spend a lot of time thinking about quality, craftsmanship, and The Right Way To Do Things™. It’s how we improve, it’s how we grow, and it’s a big part of the satisfaction in creating something.

The tough part comes when that care and dedication to craft collide with the cold reality of budgets, ROI, and opportunity cost. “It’s the right thing to do” and “this way is better” may be true, but when it comes time to cut the checks, few managers and clients are swayed by our high-minded artistic principles.

And so, we grow.

We figure out how to step back from the things that excite us and explain what those things mean for the people who have to pay for our time. We learn to communicate benefits rather than features, emphasize outcomes instead of processes, and practice tying it all to the bottom line. And it’s good.

Business, after all, is what keeps the bills paid. Figuring out how the things we make tie into its rules is an important growing-up moment, a critical part of navigating the modern world. It’s also an important part of empathy! When I ask someone else to pay for my work it’s important for me to see things through their eyes, to understand the pressures and constraints they’re working within, before pitching them on my Grand (expensive) Vision.

Of course, there’s a but.

The other night, I was talking to a close friend who’s been doggedly pursuing her dream of writing—and succeeding. She pitches and she produces and she edits and hones. She writes her fiction at night, covers pop culture news during the day, and she’s even learning how to make that tough business shift. She can weigh the ideas she’s developing by their commercial viability, their ability to generate traffic and drive the clicks and comments that editors know they need.

And it’s killing her.

“What if I can’t sell it?” she asks. “What if what I’m creating isn’t worth anything?”

The problem with the language of business and economics isn’t that it’s bad. The danger is that it’s so easy, once you’ve picked it up, to forget that there’s any other language at all. Once you learn to attach that hours-and-dollars cost to every activity and weigh it against the margins-and-profits payoff, well… the beauty and craftsmanship and the purpose and the hope that keeps us creating? It can look like pretty weak stuff indeed.

The matter-of-fact precision of economic language is even more compelling because it’s so pervasive in our culture. It’s hard to find an issue that isn’t decided (or at least justified) by an appeal to its apparent objectivity. But the worth of my friend’s work, the stuff she really cares about, isn’t measured in clickthroughs from Twitter or checks cut from a magazine’s razor-thin freelance budget. As compelling as they are, those numbers are not the why that makes what she does matter. They aren’t the why that makes your work matter.

Speak the language, learn the ropes, and don’t be afraid to sell the dollars-and-cents value of what you do. It’s how we pay the bills, and it’s a great skill to have. But don’t forget—don’t ever forget—that you’re only visiting that land of ledgers. Its language, its rules, its pressures and its values, are never the true measure of your worth and your work.

Your Brain Hates You, and Other Hazards of Metrics

I love measuring things.

That’s not terribly special, of course. Human beings generally love knowing how much that is and which one is more and am I faster and fun stuff like that. We measure our economies, our jogging, our page loads, our friends, our clickthroughs, our sleep cycles, our faves and our carbon footprints… We invent buzzwords like “the quantified life” and “big data” to describe our relationships with those numbers and graphs and goals, and we build whole jobs and companies and industries around making them make sense.

Which is great, because measuring things helps us recognize problems and learn and improve and grow. Knowledge is power! Yay! Except, of course, when it isn’t.

The funny thing is that measuring something almost inevitably causes strange and unexpected stuff to happen. It’s not that measuring is bad, of course. The real problem is the human brain.

“I wanted to go canoeing this week, but I realized I wouldn’t get any FitBit points.” —My co-worker, having a horrible realization

People-brains are fantastic organs full of great tricks, but they’re also littered with hard-wired shortcuts, biases, and ruts — pitfalls that can sabotage well-meaning metrics-driven approaches to problem solving.

My plan would’ve worked if it wasn’t for those meddling humans

In 1924, the Western Electric Company wanted to figure out how to make their factory workers more productive. With the help of management experts from the Harvard Business School, they turned the Hawthorne Works plant in Cicero, Illinois into a giant A/B test. Over the course of nine years, they repeatedly changed factory conditions and dutifully recorded the results. It was a data lover’s dream come true!

In one particular test, they changed the lighting levels in the factory every week. Would bright lights keep everyone on their toes, or would dim lighting lead to more relaxed, focused workers? The answer was clear: Yes. Both. Or… no? Maybe neither?


Factory output, it turns out, went up when they brightened the lights. It also stayed up when they dimmed them, and stayed up when they returned to normal. And then, when the experiment ended, factory output drifted back to normal. The same pattern played out in other experiments, as well.

The sad truth the Harvard and Western Electric researchers discovered came to be known as The Hawthorne Effect, a form of observation bias. Simply knowing that they’re being measured changes people’s behavior, skewing attempts to gather actionable data.

Oh, don’t worry — it gets worse.

Looking into research on cognitive biases yields piles of well-researched pitfalls that should give any decision maker pause.

Humans tend to rank information and events we can easily remember as more relevant than the less noteworthy stuff — it’s called the Availability Heuristic. The effect can cascade, too: if you spend all of your time reading about startups that make it big, you’ll find it easier to recall success stories, leading to rosier predictions even when they’re unmerited. This bias can also skew the value we place on easy-to-access stats like follows, comment counts, and the seductively simple default view in Google Analytics.

Trying to figure out how to measure something complicated? Be careful — “close enough” metrics rarely stay that way. Goodhart’s Law, Campbell’s Law, and a host of other nerdy postulates all describe the same principle: If you care about X, but can only measure Y, it doesn’t matter how closely related they seem — people will inevitably game the system by focusing on Y. Test scores as a measure of intelligence, gross sales as a measure of business health, and lines of code as a measure of developer productivity are all familiar examples.

Don’t hate the player

Excellent books like Eli Pariser’s The Filter Bubble, Clay Johnson’s The Information Diet, and David Boyle’s The Sum of Our Discontent all explore the cultural effects of our fixation on metrics. And they generally agree that the dangers don’t lie in measurements, tests, metrics, or numbers in and of themselves.

Rather, it’s the danger of ignoring our own human hiccups and assuming that the data we gather and present can be trusted and understood without hard work and lots of humility.

Working with clients in the web world, that’s the kind of effort that separates tactics from strategy. Increasing the number of blog posts published on a web site, boosting ad impressions, or convincing more users to like the company Facebook page are all easy in isolation. Keeping a steady focus on why we’re doing those things and whether we’re accomplishing our broader goals, though — that’s what can keep us from chasing easy but deceptive measures of success.

In conclusion, my Klout score is down

Tricky or not, I still love measuring stuff. My first contribution to an open source project was a product ratings engine. I wrote a web crawler to compare the performance of Kickstarter projects for fun! Prompted by a co-worker’s bad day, I designed a tool to track the emotional health of isolated remote workers. And these days, I’m elbows deep in mad-scientist plans to monitor my cats’ kibble intake with a RESTful API.

But when important decisions have to be made, I have to temper that natural enthusiasm. I ask people with different perspectives and experience to check my assumptions. I force myself to argue against my own data-driven conclusions, as honestly as I can. Most importantly, I try to ensure that the big-picture goals are crystal clear. We’re only human, after all, and outsmarting our own brains is a difficult prospect. Admitting our own biases and respecting the limits of our own understanding — those are the first steps to making better decisions.

Seven Books That Changed My Life

Both my wife and I love books — when we married, one of the first challenges was figuring out how to fit the dozens of boxes we’d accumulated into just one apartment.

Written words have had such an impact on both our lives that it’s hard not to see them as an essential fuel for living, like oxygen or Doritos. Thankfully, the rise of eBooks has eased our space crunch, and regular donations to the local library have thinned our shelves to make room for selected new arrivals.

There are a few, though, that I’ll never have the heart to part with — books that have changed how I see the world around me, how I understand my own life. Seven arbitrarily selected examples are presented here in no particular order; some are still my favorites, others feel as dated as my high school poetry, but all of them are part of who I am.

* * *

The Myth of Certainty, by Daniel Taylor

I grew up as an earnest, passionate kid in a fundamentalist religious community — a True Believer who learned apologetics and theology to spread the Truth. When I eventually questioned the unyielding principles I’d learned, the most difficult part was feeling trapped between unacceptable extremes. I could ignore my doubts to please fellow believers, or abandon everything to fit in with skeptics who acknowledged my questions. The Myth Of Certainty described a third way, one that was less comfortable but more honest, and helped shape my understanding of doubt, faith, and empathy.

Surely You’re Joking, Mr. Feynman, by Richard Feynman

Richard Feynman helped develop the atom bomb in the 1940s, and became one of the world’s foremost experts on quantum physics. Reading this collection of stories from his life, it’s easy to think that his days were a giant parade of practical jokes, recreational lock-picking, bongo-playing antics, and arguments with Albert Einstein. What stuck with me after the zany anecdotes? The universe is full of amazing things; discovering them and sharing them with others is one of life’s greatest pleasures.

The Elements of Style, by William Strunk and E.B. White

“Clarity, clarity, clarity!” That phrase echoed in my skull for years after reading this tiny little book of grammar, punctuation, and composition advice. It’s a classic, and the authors’ passion for communicating in words comes through every syllable. When I wanted to throw everything I had at the page, it gave me clear and simple tools to pull my unruly words into line.

The Beauty Myth, by Naomi Wolf

Naomi Wolf’s first book was also my first encounter with feminist writing, and its critique of modern society’s “beauty machine” was a shocking eye-opener for me. Although some of the book’s statistics on eating disorders have been criticized, its documentation of the relentless pressure to be beautiful is still compelling. It helped teach me to look for systems and structures that can hurt the people around me, even if I’m blissfully unaffected.

Purity of Heart Is to Will One Thing, by Søren Kierkegaard

I’m pretty sure I picked this one up at Borders because I was 21 and wanted to look really, really profound. I was out of my depth from the first page, but slowly, some important stuff sunk in. The importance of honestly assessing one’s own priorities and admitting them to others — of owning one’s own choices — hit me like a ton of bricks. Kierkegaard wasn’t exactly a cheery guy, but he changed how I understood faith, ethics, and social responsibility.

The Sparrow, by Mary Doria Russell

I love science fiction, and I’ve read my weight in pulp novels more than a few times over. This story of humanity’s first doomed encounter with an alien race has a depth and complexity that puts every other first-contact novel to shame. Mary Doria Russell’s experience as a cultural anthropologist informs its exploration of tragedy, the human need for meaning, and the desire for connection. It broke me of the adolescent belief that “real” science fiction is about technology, and I always keep a second copy to loan.

A Pattern Language, by Christopher Alexander

It’s impossible to learn a programming language these days without stumbling across some mention of “Design Patterns.” The idea of describing common approaches to architectural challenges isn’t unique to software developers, though. In the 1970s, architect Christopher Alexander used it to describe a new and more holistic way of approaching the design of rooms, buildings, neighborhoods, and even whole cities. Picking up his original book, rather than just reading about software Factories and Facades, was worth the effort. It helped me realize that meaningful systems, with carefully designed and complementary elements, could improve our work in all kinds of fields.

* * *

These books aren’t necessarily the best writing in the world, or ones that every reader will care about. Other books have emerged as my favorites over the years, and even changed my feelings about the ones in this list. But they’re a part of me, and their spines on the shelf are a kind of time capsule.

What books have shaped and changed you? What stories and experiences would you have missed without them? Take the time to remember, and let others know about the writing that’s helped make you who you are. I know I’d love to hear about them — and I miiiight just have enough room on a shelf to fit a few more in…

Listen or GTFO

It’s Memorial Day as I sit here with my laptop, polishing the last few paragraphs of a post about building presentations on short notice. It’s a fun little bit of technophilia, but here in the final stretch, I’m struck by a simple thought: It doesn’t matter.

I’m a middle-class heterosexual white American guy in the world of open source web development. If someone at a conference, industry event, or networking event is holding the microphone, odds are they look an awful lot like me. My ideas, my perspectives, and my experiences are echoed over and over. When I do speak up, I can take it as a given that I’ll be listened to — even if I’m just speculating about something I have no expertise in.

It’s a good life.

* * *

A couple of years ago, I was attending a large open source conference. I’d just finished delivering a presentation and I was on my way to participate in a panel discussion, but I had some time to kill. Confronted with the usual buffet of sessions, I picked the one that was closest: a small roundtable discussion on “Diversity” hosted by an anime-haired lady wearing a Drupal T-shirt.

I think of myself as a feminist, happy to speak out on whatever gender issue is hot on Twitter at the moment. I had some thoughts I was ready to share, and I was looking forward to showing support for a group of people I already believed in. Once I sat down, though, I saw lots of unfamiliar faces — women, minorities, and handicapped members of our community. People I’d never seen keynoting a camp, or featured on a conference schedule, or promoted in the usual circle of bloggers and tweeters and guest-posters I rubbed shoulders with.

Some of the frustrations they discussed were familiar to me: sexual harassment and misogyny were big problems online, tech circles were unwelcoming to people of color, and the usual white-boys-club statistics held true. But some of the comments were unexpected and frustrating — they made it sound like I was part of the problem.

Sure, some white men. But not all, obviously. As I prepared to jump into the conversation — to help clarify that not all of us were determined to hog the spotlight, that we were excited to help — I did something uncharacteristic.

I shut the fuck up.

I shut the fuck up, and I tried to listen.

* * *

Our industry, our culture, has a serious listening problem. Only recently has mainstream coding culture started internalizing the idea that listening to users rather than fellow programmers is important. Instead, too many of us prefer to start from first principles and reason out what some other person really needs. When the moment of truth arrives and our ironclad arguments collide with reality, it’s easy enough to blame the stupid people who just don’t understand what we build.

Despite the best attempts of empathetic, insightful advocates, we still see regular culture clashes and grim power struggles when different disciplines are asked to find common ground.

In one code-centric open source community I participated in, a common lament was the lack of designers willing to contribute their time and experience. Of course, when designers and UX specialists did chime in, they were usually brushed aside. Who were they to criticize what we’d built with the best intentions? Why did they think we weren’t smart enough to build great experiences? If they thought they could do better, why didn’t they learn to program and fix it themselves? Why were they trying to take over?

Defending ourselves is always easier than listening to difficult truths.

* * *

Early this weekend, a twenty-two year old man murdered three men and three women. According to the trail of online manifestos, message board posts, and chilling videos he left in his wake, his motivations were simple. He was unlucky in love, he resented the alpha males who hoarded more women than they deserved, and he really hated the women who didn’t give him the sexual attention he felt he deserved.

By his own account, he murdered innocent strangers to punish womankind for withholding his sexual birthright.

It’s a horrifying story, one that’s quickly sparked debates about gun control, mental health care infrastructure, and the deep roots of misogyny grown into our culture’s foundations. Even more horrifying is the response from the women I know: A complete and utter lack of surprise. This, many of them have said, is just the most recent and most visible example of the expectations and violent resentment they’ve always endured.

In the days that followed the shooting, a Twitter hashtag — #YesAllWomen — became an organic flashpoint for their stories. It’s hard reading — the kind of stuff that makes you want to click away to kitten pictures. The kind of stuff that you want to argue away just to avoid processing the sadness of it. The kind of stuff that guys like me — guys who think we understand the world and are used to having the microphone — quickly respond to with a prickly correction: “Not ALL men are like that!”

God forbid those women forget to defend us. It’s a lot easier to complain about unfair generalizations than to look, long and hard, at the stories they share.

* * *

Our world is full of enormous structural problems whose solutions seem impossible to imagine, let alone implement. Of course, it’s people like me — articulate, successful white dudes — who often have the luxury of treating these challenges like abstract thought exercises. They’re interesting problems to ponder when we’re not optimizing database performance or building new front-end frameworks.

I joke sometimes that I became a feminist because I enjoy arguing with people who are wrong, and misogynists and bigots are easy targets. It’s usually good for a laugh, but it’s hard to pretend that I have much on the line. When the argument is over, after all, things default back to a world that looks like me — a world that treats me as one of its own to be heard and acknowledged.

Perhaps, just maybe, it’s time for us to quiet down and listen to the people who have to live it.

We may not know all the answers, but we can stop pretending it’s someone else’s problem. We can listen.

The Ballad of Boris and Ray

Back in 2000, when dinosaurs roamed the earth, I worked for a software development shop that packed a secret weapon: a methodology. While other companies wrote mere code, we practiced Code Science™, a homegrown process cribbed from the hot new “Extreme Programming” movement. We had best practices and core principles and ten steps and lean documentation; we wrote tests first and we crafted user stories and we pair programmed. We explained to our customers how much better it was than the old way of doing things, and printed up impressive brochures about the new age of software development it would usher in.

No, seriously, we trademarked it.

In retrospect, of course, it’s clear we were swept up in the excitement of the early days of the Agile movement, and desperate to differentiate ourselves in the tough post-dotcom-crash market. Code Science™ did have some good ideas that helped our teams work through tricky challenges — we weren’t stupid after all, just full of marketing hubris. Despite our best intentions, circumstances eventually made it clear that a magical process couldn’t solve every problem.

Boris and Ray were two hardcore coders who worked on my floor. They were both efficient, opinionated, and enthusiastic about their work. They were both assigned to a new project around the time that Code Science™ was being implemented across the entire company, which meant they’d be pair programming.

For a week or two, the novelty of it kept them both on their best behavior: they were professionals, collaborating in realtime! It was a grand adventure! As the project progressed, however, they began to express... differences of opinion. Every task would turn into an epic battle as the two fought about code style, or tool choice, or loop structure, or pattern preference. They argued, everyone agreed, like an old married couple: first to win, then on principle, and eventually because it’s simply what they did.

It seemed intractable, but against all odds they eventually settled down. A month into the experiment, they seemed to have worked out their differences: the office was quieter, code worked, and I mentioned to Boris that pair programming was a success after all.

“Oh, hell, no,” he replied, glaring over his mug of coffee. “Ray is insane.”

Ray, of course, insisted that Boris was crazy.

They had, despite their differences, figured out a system that worked. Every morning, Ray would come in bright and early at 6am, rewriting Boris’s code until it was to his liking. At noon, Boris would roll in. Like WWII soldiers eyeing each other over the trenches, they’d discuss the critical issues that needed to be completed. Ray would leave, and Boris would work until late in the evening — tearing out Ray’s code, writing his own, and declaring it Done. Every morning, the cycle would begin again.

Both Boris and Ray were great programmers, conscientious workers, and all around smart guys who knew their stuff. They’d worked — together, even! — on other successful projects. Meanwhile, there were other developers around the company having great success with pair programming. For whatever reason, though, Boris and Ray’s pairing just wasn’t meant to be.

There are quite a few lessons I could’ve taken away from that episode, but the one that stuck was simple: the amazing powers of magical project acceleration that we attributed to our Code Science™ methodology were oversold. While the systems we used to describe our work played a part in our successes and failures, the biggest factors were simpler. The people working the projects, their skills, and their ability to work effectively together were what made the difference. Tools that improved those factors helped us, and those that didn’t were just fluff. In a 2010 blog post titled Agile Bullshit, Pawel Brodzinski summed it up simply: “Different approaches work or fail because of people, not because they are universally good or bad.”

If you’re good at what you do, the day will eventually come when you’re asked to do something terrifying: explain how you do it. Whether you’re a sous chef or a software developer, it’s difficult to translate your own intuitive processes, the hunches and gut-checks you’ve learned to rely on, into systems that other people can use.

When that day arrives, it’s easy to panic: “Wait, I’m supposed to be Someone Who Talks Authoritatively About Stuff? I should have charts and diagrams and four-point systems and maybe even acronyms, or people will think I’m just making it all up!” Carefully defined best practices and impressively-illustrated processes definitely have their place, but the truth is, that’s not where the magic lives.

Document processes because they work for you and help others understand what’s going on. Capture “best practices” because they save you pain and frustration and time. Research what others have discovered because learning from their experiences is good. Share what you’ve figured out so others can benefit — and offer insights you might’ve missed.

And the next time you’re tempted to put on a show and pretend that you’ve invented Code Science™ or Quantum Hosting™ or some other crazy new recipe to solve all the problems, remember the story of Boris and Ray. Do great work, share what you’ve learned, and don’t sweat the trademarked, nine-step methodologies. The humble truths of your real-world perspectives are worth more than a bandolier of silver bullets.

Squirrels Optional

As I write this, it’s a gorgeous, sunny day outside. That’s big news here in the midwest, where winter has been a grim, Hoth-like spectacle of snow, ice, and sub-zero temperatures. Today, though, the ground is visible, the sun is out, and rowdy crowds of squirrels are scrambling over every available surface.

Of course, there’s no time to go out and enjoy it: I work in the world of web development and digital publishing. My company prides itself on work-life balance, but our industry is all about extra-curricular activities. There are new languages to learn on the weekends, side projects to work on in the evenings, and open source projects to contribute to in the margins.

I enjoy writing, so crafting articles (like this one!) appears on the activity list. I love photography, so a trip to the park is a chance to snap bokeh-filled pictures for my next presentation. If I want to take a break with a good book, it’s about content modeling or project management or business strategy or the fundamentals of graphic design or... well, you get the picture.

Talking to friends, I hear similar stories. Between the pressures of a fast-moving industry and a tech culture that loves to blend work with play, we could easily fill every minute of our lives with always-on ultra-productivity. It’s worse for those who really love the work they do: the all-consuming flood of work-ish side projects and research and conversations and learning can sneak up on you. If you don’t have a vocal partner or friend who’s willing to point it out, it’s easy to look up, startled, and realize that you haven’t really taken a break for months.

This is a terrible thing.

The always-on lifestyle eventually grinds away the very productivity it’s meant to accelerate. Although some of us (particularly the young ones) can power through on coffee and cat-naps, the creative insights and fresh perspectives we need require disconnecting.

For me, disconnecting meant carving out time for a hobby that was utterly unrelated to my work and stepping back from a few projects I’d been involved in. I’d joined in because I was passionate about them, and they were all feathers in my professional cap, but there’s only so much time to go around. Sacrificing them meant leaving more of that time in the margins: time for family, time to relax, and time to recharge with things outside of the work I love. Sometimes, I have to grit my teeth and remind myself that the world won’t fall apart if my new code snippet remains incomplete for another day. Sometimes, my wife has to drag me from my desk for a hike around a nearby lake. And sometimes, to be fair, I have to pull her from her writing: the dangers of going all-in aren’t exclusive to the tech world.

No matter what form it takes, though, it’s worth it. Carve out the time to fill your metaphorical tank. Read a trashy novel even though Monday’s around the corner. Make some chili, because it’s delicious and you can play Threes while it cooks. And take the time to watch the squirrels while they’re tearing around the yard: those little imps are hilarious.

“You’re all doomed.”

When my wife introduced me to the world of opera a few years ago, I assumed it’d be a peek into high culture, not a lesson in keeping technology projects on track. But as we sat through Les Troyens — The Trojans — I watched a familiar story unfold.

Cassandra is a familiar presence in Greek mythology. She can see the future, but spurning Apollo’s amorous advances earns her a curse: no matter how accurate her prophecies are, no one ever listens to her. Les Troyens finds her in the city of Troy in the final days of its brutal war with Greece. When the Greek soldiers surrounding the city mysteriously disappear, leaving a gigantic wooden horse behind, all of Troy celebrates.

Obviously, the end of a decade-long siege means that it’s time to break out the champagne! Cassandra warns them that it’s a deadly trap, announcing that they’ll all be killed... but of course, no one listens. They’re too busy feasting and admiring their new monument. Their Trojan Horse.

Spoiler warning, folks: the horse is full of Greek soldiers.

Stop me if you’ve heard this before

If you’ve ever been the skeptical person in the room during a new project’s first, joyous planning session, you probably know how Cassandra felt. You’re seeing bad omens, and your spidey-senses are tingling — but everyone else is smiling and saying, “Let’s crush this!”

The horse is full of crazy deadlines, and no one will listen.

Getting stuck in the role of the naysayer is never fun, especially in the very early stages of a project. Often, the warning signs you’re picking up are vague, and easy to dismiss. At those moments, it’s easy to lean back and turn the concerns into Cover-Your-Ass disclaimers. “Perhaps,” I sometimes think, “tacking an ‘assumptions’ section onto the project plan will shield me from the consequences of a disaster I fear is inevitable...”

As tempting as that can be, I try to remember Cassandra’s fate. She was right when she warned Troy that it was doomed, but she lived there, too. When disaster struck, she perished along with the rest of the city. If we really care about the projects we work on and the people we work with, there’s no joy in saying “I told you so.” The entire team suffers, and we’re right there with them.

Dodging Cassandra’s curse

In a mature team under ideal circumstances, gut checks can be enough to get a decision-maker’s attention, but there are always times when something more concrete is necessary. How can we overcome Cassandra’s curse, and turn our vague portents of doom into clear, unambiguous advice? There’s no magic bullet, but a handful of basic techniques can improve our chances.

  1. Catalog the uncertainty. Is there a hard deadline, but a fuzzy and ill-defined list of required features? Is unfamiliar or immature technology required to make it happen? Does the team lack an unambiguous set of success criteria? Make a list, and map out those scary shadows. Sometimes, there are answers and they’ll assuage your fears. When there aren’t, though, it can help decision-makers realize they need to head back to the drawing board.

  2. Compare the work to similar tasks and projects. If the early estimates for a large project feel too optimistic, it can be difficult to explain why. Whenever possible, find examples of similar projects or tasks from the past. Show how long they took, and if the estimates for those projects shared the same early optimism, point it out. As unpleasant as it is to keep time sheets and logs, they can be critical ammunition in the fight for sanity.

  3. Identify deep dependencies, in technology and teams. Are you building a mobile app that relies on a third-party library... which relies on a fourth-party service... which relies on a fifth-party startup? Does one department control the infrastructure your project will need to launch, while a second is responsible for content and a third handles the development? The more external dependencies a project has, and the deeper those chains go, the more risk there is. There’s no way to avoid reliance on outside teams or tech, but mapping them out makes the risks clear.

  4. Identify fuzzy authority roles. Few things are as depressing as ironing out a project’s requirements, building it to spec, and preparing for launch only to discover that your client wasn’t really in charge. The last-minute emergence of a VP with different aesthetic tastes, or ongoing conflict between two or three equal stakeholders, can sabotage an otherwise well-run project. If you hear talk of “running the plan past a few other people” before it can be approved, or it’s unclear who’s in charge of key decisions, don’t be shy. Get a list of people with veto power and ensure there’s a single buck-stops-here person for key decisions, or wave a red flag.

  5. Time-box and prototype. Especially when new or unfamiliar technology is involved, accurately judging risks and sketching out timelines can be impossible. Carving out a small chunk of time for a prototype is critical. If the exercise reveals unanticipated challenges or problems, you have concrete evidence to offer rather than vague concerns.

Towards a happy ending

The goal of these techniques is twofold. First, the work that goes into them can reveal solutions to the problems and clear answers to the troubling questions. Obviously, that’s the best outcome: successfully routing around danger rather than grumbling about it. If that isn’t possible, though, carefully articulating the concerns can make the pitfalls clear and unambiguous.

It’s an approach that goes beyond avoiding blame and puts important information in the hands of people who need it. It doesn’t always work, and we can’t always avoid the dangers, but it’s far better than the cynical alternative. 

Now, if you’ll excuse me, I have to look into the next trip to the opera. This time? I think we’ll try a comedy...

You matter more than the cause

A few years ago, I started thinking hard about the idea of “burnout.” It was no academic exercise: I was part of a high-profile Open Source community that was feeling the crunch, and the people I knew and cared about were suffering.

I say “community” instead of “project” or “job” because those words don’t do it justice. I belonged to a group of people that created a particular kind of product, and helped others get the most out of it. We had a charismatic founder who’d started the ball rolling, but within a few years it had outgrown him. His little backyard project accumulated hundreds of passionate contributors, a bunch of infrastructure, a few community conferences, and lofty long-term goals.

Many of us developed close friendships, and some built our work into lucrative careers. Thousands of volunteers worked to keep things running smoothly, and a growing number of people were paid to do it full time. But with more and more people noticing and benefitting from what the community did, the pressure ratcheted up… Which brings us back to the burnout.

While the outside world could see the growth and the success, insiders increasingly felt like the wheels were coming off. The work kept growing faster than we could keep up, overworked co-workers were losing sleep, and previously-happy planning meetings turned combative. A number of our best volunteers left, swearing they’d never return. Resentment started to build, as some of the volunteers questioned why a lucky few were paid to work on the project. The paid staff countered that they were on the hook for their work, unlike those easy-come, easy-go volunteers.

At best, we celebrated the workaholic 110% people and held them up as examples for everyone else. At worst, we started to treat healthy boundaries as a kind of betrayal, pushing out the folks who couldn’t afford to burn the candle at both ends. Looming above it all was The Mission. All of us cared about doing great things and helping others, and without realizing it we’d become convinced that dialing it back meant abandoning our ideals.

Anyone who’s ever contributed to a successful open source project or worked for a rapidly growing startup is probably familiar with that story. The details may differ — “volunteers” might be “community moderators,” the product might be a service, and the grand mission might be an IPO — but these pressures build whenever we feel we’re part of something bigger than ourselves.

The interesting part — to me, at least — is that my story isn’t really about an Open Source community at all. Instead of slinging code, I was part of a large Midwestern church. Instead of a prototypical Silicon Valley garage, it was founded in a suburban movie theater. And instead of a visionary CEO with a startup to flip, we had a pastor who wanted to help the less fortunate. The church drew on its huge pool of staff and volunteers to run soup kitchens, homeless shelters, free auto repair clinics for single moms, sports leagues for disabled kids, and… well, you get the picture.

Was my thematic bait-and-switch cheesy? Yes, but the pressure to do good, and to avoid disappointing the team, isn’t restricted to a religion, an ideology, or an industry. Whether you volunteer for a save-the-world nonprofit, play bass in a band, or burn the midnight oil to launch a web app, the same cycle of ratcheting demands, resentment, and burnout can be deadly.

I feel fortunate: Although it’s far from perfect, the particular community I was a part of invested a lot in building a healthier culture for its staff and its volunteers. The lessons it taught me about perspective, balance, and commitment have helped keep me sane through my second life in the tech industry.

Each of us has two roles to play in preventing burnout and repairing dysfunctional, overloaded communities. To help the people we work with and build a healthy culture, we need to celebrate them for who they are, not just the work they do. That’s doubly true when encouraging their workaholic tendencies benefits us. Am I willing to help a fried colleague reduce their commitments, even if their work is a boon for my team? When someone steps back from intense involvement or sets healthy boundaries, do I treat it as a normal part of being human, or a failure and a betrayal? The fear of losing colleagues’ respect or friendship can keep a well-intentioned person trapped, long after they’ve hit rock bottom.

If you’re the one that’s being crushed, the next steps can feel even harder. You have to look out for yourself, and say “No” when it’s needed. No mission, project, or community is big enough to sacrifice your health or well-being; your responsibilities to yourself, your family, and your loved ones are just as important as a worthy cause or an investor’s profits. If a cause or a project can only survive by chewing you up, it deserves to die. David Heinemeier Hansson of 37signals doesn’t pull any punches in his advice to entrepreneurs who cultivate a culture of workaholics:

“If your start-up can only succeed by being a sweatshop, your idea is simply not good enough. Go back to the drawing board and come up with something better that can be implemented by whole people, not cogs.”

The good news is that communities, companies, and causes can change. If you’re on the edge of burnout, get support, scale back, and remind yourself that it’s OK. If you’re surviving but see others around you getting pulled under the waves, offer a helping hand and let them know: “It’s okay. You’re worth more than this work, and who you are is more than what you do.”