The creators of an app fail to consider how it could be used to harm. A founder implements significant changes to community tools with little thought given to the consequences. Thousands of entrepreneurs and investors contribute to the creation of an advertising model that collects enormous quantities of data on users but apparently never imagine how that data could be abused. Organizations declare an end to “structure” without asking themselves how power is exercised absent such systems. Others claim they operate in a meritocracy without, it seems, ever really wondering what such a world would even look like.

Here’s a very simple method: when you set out to make something, whether it be software or policies or mechanisms for organizing information, ask yourself what’s the worst that could happen. Imagine a powerful person, someone endowed with the right circumstances of birth such that the odds are nearly always in their favor; and imagine, also, the reverse—someone for whom discrimination, oppression, violence, and poverty are commonplace. Then optimize to protect the latter, even at the expense of the former. And do it right away: not after you scale, not after the money is rolling in, not after a leak exposes you, but now. Yesterday, even. Go.
25 Mar 2014