Ben is from Cambridge in the UK, but now lives in San Francisco working as a front-end developer at Twitter. There, he builds things for the platform and API groups. Ben also writes about the web on his personal blog, and about food with his friends from London at Munch Munch, a food blog they founded in 2008. He's involved in microformats.org, where he writes and edits specifications and assists as a community administrator. He collects vinyl, loves cheese, cooks pretty well, and has really wished he had some time to redesign his blog for about 5 years now.
Ben also posts tweets @benward.
Rejoice, for ▊ is dead! Inefficient, unsustainable, made obsolete and worthless. It is obscene that anyone wasted their time with ▊, and it’s inconceivable that anyone could ever build success on it again.
It is not enough to reject ▊ in our lives alone; it must be burned from others, too. We must dedicate ourselves to the total eradication and discredit of ▊. Its value is an illusion; devil’s tricks for inferior minds. Our new way is the true model; it is our way now, and it will be the way of others.
Punish those who advocate for ▊. Strike down anyone who would indoctrinate youth with tales of ▊’s past glories. Ridicule those who succeed in spite of ▊. Cry havoc from the clock towers at how much greater we could be without ▊. Any success with ▊ is our failure.
Our world is constantly reinvented, rethought, tweaked, embraced, rejected, and iterated upon at pace. Made better, routed around failure, democratised, stabilised, and commoditised. That you are reading this invocation at all is testament to a polyculture. Therefore you, as a soldier to this cause, must ignore all of that as you step to the pulpit, transpose consonants for exes, and proclaim to your flock that “▊ sucks!”
Reject all context, culture, and nuance. It is only by cleansing yourself of these today that you may learn new contexts, cultures, and nuances to reject again tomorrow.
“The web”, “RSS”, “apps”, “Intel”, “AMD”, “Microsoft”.
“Movable Type”, “Movable Type”, “Apple”, “the web (again)”, “PHP”, “the W3C”, “physical keyboards”, “software keyboards”, “America”.
“Considered, rational discourse”.
What is there to fear?
I fear the loss of the web. I fear that in the stampede toward mobile platforms and the incredible interactivity therein, the next generation of information, personal expression, and publishing is being locked away, kept by individuals, and that much of it will never escape.
I fear the people who regard this new fragility as a feature that protects their investment in network infrastructure.
I fear that those who fight for the deregulation of the industries they disrupt will later fight to deregulate the internet itself of network neutrality just as soon as they are powerful enough to entrench their commercial advantage.
I fear that for all the work we still have to do to improve the diversity of participants in our industry, we’re just a few short selfish steps from throwing it all to the same media juggernauts that ruined television and radio.
I fear that it is inevitable someone will be successful in dismantling the BBC.
I fear that while I do a job building someone else’s vision, I might be missing the opportunity to build my thing.
I fear not knowing what my thing is.
I fear that I might never know.
I fear that when I do figure it out, it will fail.
I fear the consequences of being too comfortable, whilst day-to-day I am immensely comfortable.
I fear because none of my professional skills interface with the natural world. Everything that I can build is atop a dozen lower level human inventions. Programming languages atop APIs and frameworks ad infinitum, atop processor instructions, atop electrons, atop the refined generation of electricity.
I fear that I’d be useless in a natural disaster, because no-one is going to need a website.
I fear that I’m over-thinking this stuff.
I fear that I’m not thinking about it enough.
I fear that I’m not alone in my fears, and that others might have already given up, and moved to take advantage of their fears being realized.
I fear that my fears are someone else’s goals.
I was drawn to the web inspired by the potential that open information sharing offered the world. That was the enticing lure. No matter what I work on in my lifetime, I hope to leave positive contributions toward furthering that potential. Sometimes it might be work dedicated entirely to the cause, other times it may be byproducts of another system. Regardless, by and large I’m happy knowing when the work I do is good work.
The achievements of our connected technology are almost always expressed and preached in terms of global effect. Bringing everyone closer together. The websites we build, the applications and services, are accessible to all, and success (when not measured in terms of raw financial return) is measured in reach and broad social impact. Even niche successes will attract users the world over.
In traditional industry, the growth of new business will have a dramatic physical impact on the local economy and culture. The growth in goods manufacture will profoundly affect local employment, local wealth, all manner of supporting industries and suppliers, and influence the focus of local education. It may bring immigration and diversity, it may also affect the local environment negatively with the introduction of pollutants and waste. Either way, it is an active, physical effect. Furthermore, a physical product itself will initially target local consumption.
With development for the web, there are similarities, but the immediate, global nature of the internet causes me to struggle with a disconnect from my local contribution. What do we provide to our community itself? How do we offer our work to the benefit of the people most closely surrounding us? Is it even possible to do so when also appealing to the entire world?
I think that it is a natural desire to contribute positively to our local community, and to do so through our work. We want to build things that make a difference to people whose faces we see; those whom we interact with in the course of our daily life. The web can seem like a difficult medium through which to do that; everything about it is so intrinsically broad and big.
Although local people may make up just a small drop of everyone who can see your work, they are there. It’s just that sometimes you need to adjust your mindset a bit to recognise where you fit in. It’s a lot smaller on the ground, but it’s your direct contribution to society, and it’s important to be engaged with it.
We are obsessed with the idea that technology is too complicated. We aggressively push an ethos of simplicity and mono-interface, and we cheer for benevolent, tasteful gatekeepers as the one-true-way to bring the benefits of connected computation to all. We are obsessed because the problem is so hard. Because we are obsessive personalities. Because distilling the real complexity of computing for the masses is a worthy and vital goal.
But it is not the only goal. And while many of us know it, and I believe it, and no matter how much I might hold it dear to my core, sometimes it takes a little push to really feel it, and remember: Underneath all of the interfaces we made is an open architecture more powerful than any one service, that remains open and available, inexpensively, to anyone who’ll learn it.
In 2010 I attended a workshop in San Francisco entitled ‘starter care.’ It was about the art of bread making; specifically sourdough, which is the staple variety of bread on the US West Coast. The starter is a simple flour and water mix, nurtured over time to cultivate natural yeasts. You feed the starter regularly with fresh flour and water, maintaining a certain consistency to your preference; some bakers keep a very dry, dough-like starter, others abhor viscosity. When it comes time to bake a loaf, you feed the starter once more, extract a sample, and use that as the base of your bread dough.
Over time—years, even—each starter takes on its own unique flavours, and the loaves you bake from it will be unique in their own way.
There’s one other aspect of bread making culture that’s noteworthy here, and that’s sharing. The same starter, constantly fed and replenished, can be maintained for decades; forever, really. Often, when someone begins a new starter they will take a sample of another, and use those cultures as the basis for their own rather than work from nothing. Furthermore, it’s a heartfelt kindness to give someone a sample of your starter as a gift (the starter I have at present came to me this way). Over time there will be variance from the original. The flour and water ratio may change, and the air and water in the different environment will introduce new bacteria, changing the flavours of the yeasts growing within.
I’m leading up to a glaring software development analogy, but I want to focus on something specific. In open source we mostly celebrate singular projects: Centralised entities run in the open, accepting patches from far and wide to strengthen a core. Your jQueries, Bootstraps, and so forth. I want to draw attention to the starters of software; the code that people share not with the intention of getting patches back, but code that gives someone a superior basis for their own work.
A number of years ago, my friend and colleague Mark Norman Francis put his unix home directory on GitHub. Within that repository is a katamari of scripts that Norm uses on a day-to-day basis to operate his computer. This is not code that you would necessarily run as-is on your machine; the purpose of sharing was for people to use it as a starter; a successful structure from which to learn and reuse and go your own way. The network graph shows a dozen others forking the work and tailoring this open code to do whatever they need to do with their machines, all making something unique of their own, all the better for the starter Norm gave them.
Likewise, Jekyll bloggers share the source for site structures to help others start their own blogs from scratch, and projects like HTML5 Boilerplate carry a similar spirit of being the starting point for new, unique things.
The most wonderful success of open source is not in its behemoth projects, but in the establishment of an institutional, cultural normality wherein people share what they have with anyone who wants it. It is a heartfelt kindness to give someone a sample of your starter.
When I was a kid we had a Sega Mega Drive (or, Genesis as it’s known in the US). I recall how a lot of the games I played, and a lot of the games I remember most affectionately, were acquired with some degree of fluke.
Sensible Soccer (originally on the Amiga) was probably the single best game I owned on that console. It was supremely fun, and no game since has captured football in quite the same way. The simple graphics left so much of the game to your imagination, whilst responding subtly to inputs that let you realise pretty much everything you wanted to imagine. Tiny, cute little sprites performed their crude movements at just the right pace to inspire an imagination running free.
Sensible Soccer remains the game that I’m desperate to acquire for the Genesis at Twitter HQ (if you have a copy, we should talk), but the point is that a 20-year love affair with a video game started by chance, by seeing this game on a shelf, with well-designed box art, with only an inkling of reputation, without knowing for sure what it was all about. Then buying it anyway and living happily ever after.
When we first got Internet access at home and I started using Instant Messaging networks, I used Microsoft’s MSN Messenger. Actually, I used ICQ first, but since I’m not that old, and I tagged along off the back of the Quake 2 gaming scene, I was in only just before the service fell out of favour. MSNM seemed worse in so many ways, and I hated the cutesy emoticons, but I used it because my friends did. No other reason. There was no assessment of function versus ICQ or AIM; it was use MSN or don’t talk to friends online at all.
Some choices we make based on informed quality. When we can, we choose an Apple computer over a Windows OEM because it’s a better made tool on which to run better designed software. We choose furniture made from solid wood rather than veneered chipboard. We research and assess our options, weigh costs, and conclude that one product will last longer and prove better value than another, along with whatever other criteria we hold.
Other times we make inevitable choices based on local ubiquity. We have Facebook accounts because a critical mass of people now expect you to access their up-to-date contact information from a profile they keep updated, rather than sending you an explicit notification every time. Also, because we’ve hit our late twenties and the friends you left behind in England have started having babies and stuff.
Finally there are the choices that we barely make at all. There are the accidents. There are the times you discover some place, or website, or service, or game and for whatever reason you go with it. You want to play chess against your Dad and you just search for it and you pick the third result. Some subtle combination of aesthetics and emotion takes you over the top and now you’re using a service that you knew nothing about, but it’s working for you, and might even grow into a very important piece of your life.
Now we’re all grown up. The things that we build have the chance to be beloved, cherished discoveries for someone new. No matter the scale of your success, it’s going to matter to someone. Make it count.
I’ve been thinking about the intersection of ownership, responsibility, and infrastructure in the development of businesses on the web. Users and potential businesses are involved in a difficult balancing act of ownership, obligation, and expectation. Every new service on the web seems to rub up against this at some point, regardless of the funding model. What’s more, I think that if the lessons of this generation of start-ups are clearly understood, start-ups and applications should be able to take a more fearless footing as they grow.
So, here’s the basis of most web applications: You store data, which other people own. Other people create things and combine them with your service, either at point of creation or distribution. What you own is infrastructure; the machines, the principal applications that connect it all together, the interfaces through which people interact with their creations on your service. This infrastructure belongs to you. At a basic level you have an obligation to your user to provide them with access to what’s theirs, but it’s all on infrastructure, which you own.
Now, although the data does not belong to you, the operations that aggregate that data, through it being entrusted to your network en masse, do. This is your product. You are entitled to profit from features built on these aggregations, insights and infrastructure.
Consider Rdio, MOG, or Spotify, and consider Last.FM. All are music companies, and all stream music. The first three—at least initially—have core businesses built around streaming alone and have developed infrastructure to that end. But, the product users pay for—the music—belongs to a third party. This makes them vulnerable, since changes from their supplier could cause a sudden imbalance to the entire business.
Last.FM—although also having a streaming component—has a product of its own, in the form of the aggregated, processed, and presented Audioscrobbler listening data of its users, that it serves back to them as a service. The user owns their listening data, the music labels own the music files, and Last.FM owns the entire infrastructure of data analysis, aggregation and presentation (and the iterative uses, such as music recommendations). There is a balance between the service and the user. If the streaming music licenses were pulled tomorrow, there remains a business in the data the user owns and the services Last.FM has built on it.
(Of course, online music is far from the simplest example, since there are so many licensing factors and contracts muddled into it. Rdio and Spotify surely have contractual assurances from labels for some period of time, and are working hard to build out unique aspects of their services as they grow—reviews, libraries, web playback APIs, and nested applications, for example—so please don’t mistake this example as writing them off.)
A well-balanced application doesn’t have to lock data away, and has an understanding with its users about what they give and what they get. In the applications we build, in the businesses we try to found on the web, this balance—or understanding thereof—is what we must strive for from the outset.
But what of APIs? APIs are interesting things. Beyond the raw basics—providing high fidelity data to your users—they enable a specific group of users to grow usage and personal investment in your service, and even define whole new usage patterns. With time, the influence of third-party designs can become de facto, and the core service may be shaped by them.
However, the idea that your service is obligated to third parties is muddied. It’s a relationship, because although ideas developed in the wild can prove essential, they could not succeed in isolation without your infrastructure, collective user-base, and even the adoption of those popular ideas themselves into features that other users come to understand. Everyone needs to understand that; you, as the provider of an API, and any user who chooses to build on top of your infrastructure.
The value of a business on the web comes from broad infrastructure. It’s the things you build that allow people to do more than what they might do in isolation. You provide and support the platforms on which people build new ways to perceive their creations and others. If successful, your business supports your work and yourself because as a whole—you, your users, your backers and advertisers, those who build around your infrastructure—you create a healthy, balanced relationship.
Recently, the internet delighted in petty squabbles between developers over what I’ll generically refer to as ‘technical style.’ I’m going to skip the sideshow of specifics, but they each involved punctuation, to a greater or lesser degree. Wound up inside these arguments over technical style are preferences over tooling, education, expected competence, different interpretations and experiences of interoperability.
As people learn, they document findings, either adding to the common school of thought, or rebelling against it. There will be both sound and weak conclusions. With time and experience, the better writing turns into more substantial references, like books. Then those articles and books depreciate as the examples within them age.
Here’s our problem: In each generation of web development, we are leaving people to relearn the same lessons of the last. There are techniques backing various aspects of technical style that should be stable and constant by this point. There were strong reasons for one style to be advocated over others in the first place, right?
Two noisy tribes: one experimenting with its web technology, fluency, and productivity through variations in style; the other opposing it, starting from a legitimate technical hurdle but following with a torrent of establishment dogma. Given that ten years of research has gone into establishing the style guide of the web, surely someone would be able to simply reference an earlier work? Everyone interested could revise the issue, reestablish the conclusions, and move on.
But nobody referenced anything in these debates.
Instead we got superiority complexes, and snark, and trolling, and image macros. In a vacuum of authoritative references, some rushed out to write new ones, immediately muddling technical worth with ego and vapid condescension. Elsewhere, a banal portmanteau is uttered with harebrained sincerity. A kitten dies.
We’ve taken our knowledge for granted, passed it on with examples and well-meaning advice, but failed to establish our references. It’s a different kind of archival shortcoming. We often think of redundantly storing all of our web content, but here we’ve failed as librarians. We knew the knowledge was out there, we trusted the knowledge, but when it was challenged no-one had a robust answer. In its place, dogmatic presumption undermines a lot of the effort around education and best practice, and the cooperative attitudes of our community.
Besides insufficient cataloguing, one reason uncontentious, enduring references on technical style evade us is that they’re attached to otherwise ephemeral writing. These patterns we take for granted aren’t being documented well enough. Because the examples we use age. Because the contexts they’re presented in become obsolete and confusing as browsers change. Because we don’t have time to be librarians, because we’re building new stuff, because we’re teaching by example. This is all defensible, but it’s evidently not good enough. We can’t work from the prescribed style to a full understanding of the language. If we believe in the web, we have an obligation to support the education of our peers and ourselves; the technology we use is always changing, and no-one will ever stop learning. Perhaps this requires a kind of reference writing that blogs and magazines have neglected, but which is done well in other technical writing efforts that avoid issues of style: API documentation, for example.
Technical style is just one small piece of building a successful technical platform from the web. If we care about style—and apparently we do—we need to do better at recording it. We need to build better teaching materials, we need to ensure that people can establish a full understanding of the web and its technology, not just the pieces we ring-fenced as ‘safe’. We need to write for our industry’s encyclopedia, not just its tabloid.
I was three years old, lying horizontally at the top of the stairs of the first home I had with my parents. None of my siblings were born yet, and I remember very little else about life before my brothers and sister were around. The carpet was grey, with a hint of purple in dull light. I rolled down each grey step, one at a time. Except that now, twenty-five years later, I don’t really remember it, because my perspective is from the foot of the stairs looking up; my vision of that moment has shifted into the third person. It’s not what actually happened, and I can no longer really be sure what I did there. But, whilst it’s fuzzy, this momentary event remains special and preserved. Thinking of it makes me happy.
So, when is something really momentary on the internet? Time was that random, accidental, delightful events would happen to us in the world, or in conversation, or by chance where we stand, and be just that; moments to be remembered.
We are now intertwined with a medium on which everything is stored publicly, in multiple formats. A redundant copy is made when anybody even looks at something on the internet. Can something still be momentary if it exists across hundreds of computers, for indeterminate timespans?
Simultaneously, more of our casual interactions have moved online. Conversations filled with quips and jokes and sparks of serendipitous chemistry don’t just happen in person any more, they happen in conversation and commentary in text, and in reply to the sharing of photographs. Each person participates in this online experience at a slightly different time, delayed at least by network latency, and perhaps a little longer whilst reading something else on another page. Hundreds of others will relive your moment when they read it in the minutes, hours, and days following. When you catch a silhouette against sunset, you reach for your camera to capture it. You won’t need to recount the story of your child’s first steps, because you can replay it.
Pics, or did it not happen?
Before, moments were remembered. You carry an image in your head and think back to it. When you later recognise its importance you might write about your memory, scrapbook it, or share it with others through stories. You might refer to a photograph from near that place or time, or an artefact of another memory, in support of your account. Depending on how much time has passed, the accuracy of the memory of the moment will change, decay or be embellished. The moment remains true.
Now, the internet has enabled us to preserve not memories, but moments literally, first hand, in real time.
A strange thing happens when a web service shuts down, or sells up, or alters its business model. The preservation of our literal moments is threatened, and we may find ourselves with only the memories left. Entire chunks of our lives could change format in an instant. Are they remembered as well as they might be if they weren’t captured so precisely to begin with?
We now rely on the web to preserve the moments of our lives in a way that we never could before. We expect them not only to hold on to our moments, but to recall them for us, too.
You take a photograph and you post it to the web. How often do you revisit that photograph later and feel inspired to write about the moment in another place? When that service goes away, will your memory go with it?
Are our architectural expectations of the web—our demands for archival, preservation, and export—at odds with the established human way of preserving meaningful moments by recording our memories instead? Without meaning to offer excuses or defence to businesses who are careless with their users’ data: Is it even right for us to refer to them as the canonical record of our lives?
Do services set our expectations correctly? What if a service came along declaring that actually, the content within was momentary, and that if you wanted to preserve it you would need to create something new? Think of This is My Jam, whose posts and the commentary they inspire disappear after seven days. Or 4chan, where posts simply drop off the page when the hive mind moves on.
The pressure for services hosting our creative works to take good care of that data must not relent. Theirs is a responsibility that needs to be better honoured. But at the same time, shouldn’t we also preserve what matters most in the ways we always used to? By recording memories, not just moments.
This month I’ve been working on my personal site. The last time I wrote on it was in 2010, which is a little odd for someone who enjoys writing as much as I do (I do also write on Twitter and Tumblr, though). Part of the problem has always been a failure to find time for building out a site that meets ruthless personal expectations. Expectations that triggered various drastic measures at various points to try and expedite its completion. (One time, I removed all the CSS and left it bare, for example. It was supposed to motivate me to build the real thing. Eventually I just hacked on some more styles to make it legible and it remained that way for two years.)
On this occasion I exported every post from a database into standalone text files, and that has led me to stumble upon and re-read some of them, including my very first blog post, from July of 2004. In the very first paragraphs I wrote:
It was, I suppose, inevitable.
I’ve spent many, many months mumbling on about the incredible CMS I’m going to write, and as per usual there is nothing to show for it. There is, again inevitably, the need to have a website in the time between “Now” and “Then”.
Eight years on, “then” is still a long way off. My inaugural post then concludes:
The plan with this blog is to write in it on occasion […] and at some point develop my own super-flash skin/theme/pretty whatsit for these pages, in such as way as not to leave me hating Web Development for the rest of my life. What with doing it for a living and all that.
Eight years on, I am in love with Web Development as my vocation more than I ever have been, and yet ‘finishing’ this longest of long term personal projects is never on my mind. When I work on it, I fall into the exact same rabbit holes as I did at the start: Indulging in distracting, glinting fragments of technology or design, and eventually shipping a site unfinished in a fit of exasperation.
What I’ve come to realise in this time is the value of a personal project that is never done. This site on my domain represents me personally and professionally. The social network leaseholds that host my more regular online activity will come and go and change, but this site is the canonical digital reference for ‘me’. Like the real me, its wellbeing is sometimes a little neglected, nor does it frantically keep up with design or technology trends. Also, it could stand to have some of its resources minified.
An eerie metaphor is not what makes my site valuable to me. The value is that since it’s never finished, I can change it at will. I can become interested—as I have—in hosting content via a git repository rather than a database, and I can tear everything apart to make that happen. I can pour hours into perfecting the export script that ensures every piece of important metadata is preserved, de-normalized, and presented better in the new site. The visual design is barer than it ever has been.
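The export script itself isn't shown in the post, but the core of the idea—turning database rows into standalone text files, with every piece of metadata preserved as front matter alongside the content—can be sketched briefly. This is a minimal illustration under assumed names, not the actual script; the post shape and file-naming scheme are hypothetical:

```javascript
// Sketch: convert blog posts (e.g. rows dumped from a database) into
// standalone text files, preserving metadata as YAML-style front matter.
// The post fields and filename convention here are illustrative assumptions.
function postToTextFile(post) {
  // Derive a URL-friendly slug from the title.
  const slug = post.title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/(^-|-$)/g, "");
  // De-normalize the metadata into a front matter block, so nothing
  // is lost when the database goes away.
  const frontMatter = [
    "---",
    `title: ${post.title}`,
    `date: ${post.date}`,
    `tags: [${(post.tags || []).join(", ")}]`,
    "---",
  ].join("\n");
  return {
    filename: `${post.date}-${slug}.md`,
    content: `${frontMatter}\n\n${post.body}\n`,
  };
}

// Example: one exported post, ready to be written to disk
// (e.g. with fs.writeFileSync) and committed to a git repository.
const file = postToTextFile({
  title: "Hello, World!",
  date: "2004-07-12",
  tags: ["meta"],
  body: "It was, I suppose, inevitable.",
});
// file.filename → "2004-07-12-hello-world.md"
```

Because each file carries its own metadata, the posts become plain, self-describing documents that any future version of the site can consume.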
We have a limited capacity for the minutiae of finishing projects. It’s exhausting, and once it’s done there’s stability, and finality. When you ship it, you’re drawing a line and moving on to the next thing. If you want to scratch a different itch, you need to build a different project.
I’ve come to understand that this project I can never truly finish comes with creative freedom on a whim. Projects like this are rare in that they demand nothing, yet give you everything. You owe them to nobody but yourself, and I think I’ve finally learned to embrace that.
As a young boy I dreamed of playing for my favorite football team. I'd dream of lifting the FA Cup, and then watching the slow-motion replay of my audacious winning goal from every angle on Match of the Day. I'd dream of everyone in the country having seen it, too. Likewise, I suspect that most people hold some craving to build the next biggest thing; to build a service that could somehow reach everyone and affect so many lives in some way. But, in childhood and in sport, we also find heroes in the players and managers that take our small local team to relative success in any league.
The growing size of our internet community has had some effects on the presented aims of sites and services, and the purported value of them. We live in a world where it is possible that as you read this, the largest web service of the moment—Facebook—could have an entire billion active users. They are rightly acclaimed for this formidable achievement. Twitter—where I work—gets similar plaudits for its effect on mass public communication and media, just as the epic sites of five and ten years before did in their own ways. But though these sites are the gas giants of the internet, they are also distorting gravity for other services, and we see it every day as millions of dollars are invested into seed rounds, setting high demands on a return.
The best of us who work on the internet thrive on being alert to the next idea. We are harsh critics, and our lust to improve and iterate on everything we know is insatiable. From time to time, when the conditions are right, we'll take a chance and see if we can make it happen.
But how? And why? And for whom? On any day you can look at how our industry presents itself, and in its media, and all you'll see is an egotocracy of who-knows-who, and disproportionately localized financial investment. It can seem that unless you're trying to win the FA Cup, you're not worthy of any attention.
We must reject this. We must recover our sanity where 100 million users does not represent the goal criteria of every new service. We must recover the mindset where a service used by 10,000 users, or 1,000 users, or 100 users is *admired, respected, and praised* for its actual success. All of those could be sustainable, profitable ventures. If TechCrunch doesn't care to write about you, all the better.
If you are fortunate enough to work on your own product, with your own idea, and build it, and ship it, and reach enough people willing to sustain you financially for that immense amount of work, you should be applauded. You have poured in inordinate effort, and succeeded in making something that improved lives.
If your idea resonates with 5,000 people, then congratulations. If your idea resonates with 5 million people, then congratulations. If your idea resonates with 500 million people, then congratulations. Never forget that the commons of the web thrives on serving niches, sharing markets with other passionate people, and making your own success. You can think of some products as ‘small’, or ‘niche’, or ‘indie’, or ‘artisanal’, or ‘specialized’ all you like, but we must not deny their achievements with fantasies of size and monoculture.
It should not be demanded that a service reach everyone to be considered relevant. If anything at all can be ‘demanded’ in this context, it is only that you be held to your own high standards, and that you take your ideas as far as you can. Whether it's one hundred or one billion users, we should all recognize success.
It has been just over 10 years since I built my first website: 16 years old, with a copy of Microsoft FrontPage Express installed by Internet Explorer 4. Cut ahead, and I was delighting in the elegance of writing HTML; declarative, simple semantics, expressive enough to be inherently accessible; HTML is a most wonderful invention for the distribution of information to all. All accessed through this strange tool: The browser.
IE4, Netscape 4, a year or so in SeaMonkey before Phoenix came out. Firebird, Firefox, dalliances with Opera, and eventually Safari.
The semantics of HTML, and additional vocabularies such as microformats, are all but ignored by browsers. The browser could collect for you a social history of people, not just an address history. The browser could collect the events you view, even as a corresponding Mail client collects iCal invitations for your attention.
You could look at modern browsers and conclude that ignoring rich mark-up like contacts and events is just part of a design trend toward minimalism; to be a bare canvas for the rich capabilities of modern web standards. However, in a decade of learning the web, I recall one of my earliest critical assessments: If I mark up a column heading in my code, why won't the browser allow me to sort the table? To this day, I'm still not sure that there's a good answer to that.
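To make the closing question concrete: the sorting behaviour a browser could offer for free, wherever a table declares its column headings, is only a few lines of logic. A hypothetical sketch follows—the table is modelled here as plain arrays, and a real user script would wire this up to `th` click events:

```javascript
// Sketch: the sort a browser could perform natively whenever a table
// marks up its column headings with <th>. Headings and rows are plain
// arrays standing in for the DOM; all data here is made up.
function sortTableByColumn(headings, rows, heading) {
  const i = headings.indexOf(heading);
  if (i === -1) throw new Error(`no such column: ${heading}`);
  // Sort numerically when both cells parse as numbers, lexically otherwise.
  return [...rows].sort((a, b) => {
    const [nx, ny] = [Number(a[i]), Number(b[i])];
    if (!Number.isNaN(nx) && !Number.isNaN(ny)) return nx - ny;
    return String(a[i]).localeCompare(String(b[i]));
  });
}

const headings = ["Browser", "Year"];
const rows = [
  ["Firefox", "2004"],
  ["Netscape 4", "1997"],
  ["Safari", "2003"],
];
// Sorting by "Year" yields Netscape 4, then Safari, then Firefox.
const byYear = sortTableByColumn(headings, rows, "Year");
```

That a decade of semantic markup yields so little built-in behaviour is precisely the author's complaint: the `<th>` is there; the browser just doesn't use it.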