Anne van Kesteren

Anne van Kesteren is a member of the WHATWG (less commonly known as The Web Hypertext Application Technology Working Group), where he writes standards for the web platform. He is the editor of important specifications about the DOM, CORS and XMLHttpRequest, amongst others. Anne speaks about the web platform and its politics at various events and conferences. On the Pastry Box Project, Anne publishes his texts under CC0.

You can read his posts on his blog and follow his tweets at @annevk.

Published Thoughts

We have operating systems, browsers, app markets, and the web. I want to change the world so that the operating system is the browser and its app market the web. I am using this last post to describe this system in a little detail as it is exciting to think about, yet much is still unclear.

Foremost, everything on the system has a URL. You bookmark https://mail.example.com/ to “install” your Example Mail app on your home screen (about:home). Some of these URLs could be built-ins, e.g. about:settings or about:phone, to deal with configuration and legacy applications the web does not really have an interest in supporting natively. After all, with WebRTC we can make https://phone.example.org/ happen (and so much more).

Being able to navigate to these URLs is essential. This allows apps to be integrated with each other in the same way sites are today, using <a href="">. On this system apps and sites will be within the same continuum.

To get there, offline will need to work. A flaky network should still give us access to our bookmarked URLs and let us take photos with http://betterphoto.example.com/nightly. We need to get smarter about allocating storage and not prompting the user for it. And since we will be putting the trust with the end user rather than a centralized app store, we have to get creative about APIs that are today deemed privileged. Can we do a socket API that works for sites? Can we expose Bluetooth? Is exposing a Bluetooth API even sensible on a platform that will likely outlive it?

That is what I’m working on so one day I can say “Native is dead, long live the native web.”

The web platform consists of many layers. And many of those layers lack detail. These subsystems are generally understood (in the sense of X goes in, Y comes out), but the exact rules of each vary across the board and it takes time and a lot of effort to plug the holes. I like plugging holes. Finding the bits that are not defined or incorrectly defined and fixing them is one of the pleasures I take in my job. CSS 2.1 is generally seen as the first real attempt at doing that within the web platform community. A more recent example is the HTML parser. Parsing HTML has been an unpopular reverse engineering project for over a decade, but nowadays you can simply write some code that matches requirements written in plain English.
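
For illustration only, here is a hypothetical TypeScript sketch of what “code that matches requirements written in plain English” can look like: a heavily simplified version of the tokenizer’s data state, with the spec-style steps left in as comments. The real state machine has many more states and cases; only the shape of the mapping from prose to code is the point here.

    // A heavily simplified, hypothetical rendering of the HTML tokenizer's
    // "data state". The real specification handles more characters (e.g. "&"
    // and NULL) and many more states; this only mirrors the shape of the prose.
    type Token =
      | { kind: "character"; data: string }
      | { kind: "eof" };

    type State = "data" | "tagOpen";

    function dataState(
      input: string,
      position: number
    ): { token?: Token; next: State; position: number } {
      // "Consume the next input character."
      const current: string | undefined = input[position];
      if (current === undefined) {
        // "EOF: Emit an end-of-file token."
        return { token: { kind: "eof" }, next: "data", position };
      }
      if (current === "<") {
        // "U+003C LESS-THAN SIGN: Switch to the tag open state."
        return { next: "tagOpen", position: position + 1 };
      }
      // "Anything else: Emit the current input character as a character token."
      return {
        token: { kind: "character", data: current },
        next: "data",
        position: position + 1,
      };
    }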

At the WHATWG I am leading a couple of such efforts, trying to standardize the DOM, Encoding, Fetch, and URL subsystems in great detail. These web platform layers have existed for well over a decade now and converging their implementations is a costly and painful process. In the long term, however, having the same behavior across clients will greatly benefit web developers and the health of the web.

End of the third flight was nigh. “We’ll see you at the airport.” Not exactly true, but close enough I suppose. And finally some different lines from the standard spiel. I have the feeling she might have been slightly drunk, with the other steward smiling awkwardly while she spoke to us. But who knows, maybe she was in an unusually good mood, hard to tell sometimes.

A lot of lights again, Boston this time. Light pollution is probably way bad, but looks nice from up high. Also no enforcement at all today with regards to wearing headphones during take off and landing. Feels like an improvement, though maybe the three sets of staff encountered just care less. I certainly did, with what ended up being an eight-hour delay for me with regards to getting here, I was in no good mood. I did my best not to take it out on anyone in particular and even cracked some jokes with the Starbucks folk in Detroit, but boy I was pretty livid in Amsterdam. There is nothing quite like an airline letting you down.

She just informed us she will say goodbye to everyone personally. “Vaarwel!”

When writing specifications, it is important to describe the world. The model. If you want to write a specification for zip archives, you would start out explaining the world of zip archives. E.g. “A zip archive is a map of zip path/zip resource pairs. Zip paths and zip resources are byte sequences.”

On top of this small world you can build other things. For instance, an API. Now if your API supports enumeration, you run into a problem. You either have to impose an order at the API level, leave it undefined (bad), or change the model. You also realize it is not clear whether you can have zero pairs. So you change it: “A zip archive is an ordered map of zero or more zip path/zip resource pairs.”

You’ll also need to define the mapping of the zip format to zip archive. That is, how you go from the input byte stream to your model. And maybe the reverse if you support serialization. And slowly your world starts growing up while staying relatively coherent and clear.
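
To make the shape of this concrete, here is a hypothetical TypeScript sketch (the names are mine, not taken from any actual zip specification) of the model above plus an enumeration API layered on top of it. Because the model itself is an ordered map, the API does not have to invent an order of its own.

    // A hypothetical sketch of the model: an ordered map of zero or more
    // zip path/zip resource pairs, where paths and resources are byte sequences.
    type ZipPath = Uint8Array;
    type ZipResource = Uint8Array;

    class ZipArchive {
      // Insertion order is preserved, so the model defines enumeration order
      // instead of leaving it to each API.
      private entries: Array<[ZipPath, ZipResource]> = [];

      set(path: ZipPath, resource: ZipResource): void {
        const index = this.entries.findIndex(([p]) => byteSequenceEquals(p, path));
        if (index === -1) {
          this.entries.push([path, resource]);
        } else {
          this.entries[index] = [path, resource];
        }
      }

      get(path: ZipPath): ZipResource | undefined {
        const entry = this.entries.find(([p]) => byteSequenceEquals(p, path));
        return entry?.[1];
      }

      // An enumeration API built on the model; its order is the model's order.
      *paths(): IterableIterator<ZipPath> {
        for (const [path] of this.entries) {
          yield path;
        }
      }
    }

    function byteSequenceEquals(a: Uint8Array, b: Uint8Array): boolean {
      return a.length === b.length && a.every((byte, i) => byte === b[i]);
    }

    // The mapping from the zip format to the model (and back, for serialization)
    // would then be defined separately, e.g. parse(bytes: Uint8Array): ZipArchive
    // and serialize(archive: ZipArchive): Uint8Array.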

This all may sound relatively trivial, but many specifications fail or struggle to define their world and how APIs, formats, and URLs interact with it. So I thought I’d mention it and maybe help someone.

There is a lesson we learned doing CSS, HTML, JavaScript, et al on the web. Mode switching hurts. Quirks mode has not been worth the cost. Absorbing the few differences and providing ways to opt into the desired behavior would have been better. Similarly, "use strict", introduced as a consensus compromise by TC39 while developing version 5 of ECMAScript, still causes trouble. Deciding how new features in JavaScript have to work often raises the question of whether that needs to be different in "use strict" code or how it would interact with such code. Networking seems to somehow get away with it more easily, although we still have to support HTTP/0.9. I suspect this might be because there are fewer moving pieces and deploying a server stack is generally more involved than writing some HTML. I’d be interested in seeing research in this area.
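
As a hypothetical illustration of the kind of divergence that has to be considered every time (assuming the code below runs as a classic script rather than a module, since modules are always strict): the very same assignment is silently ignored in sloppy code and throws in strict code.

    // Hypothetical illustration of one strict/sloppy divergence, assuming a
    // classic (non-module) script; modules are always strict.
    const frozen = Object.freeze({ value: 1 });

    function sloppyWrite(): void {
      // Without "use strict", this failed assignment is silently ignored.
      (frozen as { value: number }).value = 2;
    }

    function strictWrite(): void {
      "use strict";
      // With "use strict", the very same assignment throws a TypeError.
      (frozen as { value: number }).value = 2;
    }

    sloppyWrite();                             // no error; frozen.value is still 1
    try {
      strictWrite();
    } catch (e) {
      console.log(e instanceof TypeError);     // true
    }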

Brendan Eich wrote this somewhat poetic entry on why we evolve the web and its many features rather than introduce versioning or replace features wholesale:

We do not get to re-engineer our mtDNA in situ now that it has spread throughout seven billion people. Making life with new mtDNA would be interesting (and dangerous). Unless you make a virus that rewrites seven billion humans’ mtDNA, you have to deal with “backward compatibility”.

Backward compatibility is what binds us most. It is deeply engrained in the web, and in the internet. If someone could get past it and spin up a new web, they would soon enough face the same problems we face, perhaps with a bit better up-front design helping them get a bit farther toward one idea of “perfection”.

But the evolutionary system doesn’t care about “perfect”, it cares only about “better and backward-compatible enough to hop to”.

Within the web there are local exceptions and variations, but the general tide is this.

Via programming is terrible I stumbled upon You and Your Research by Richard Hamming. I read through the transcript (great read overall) and, while I’m not a scientist, I do wonder: what are the really important problems on the web?

Patents obviously. Unfortunately no attack scenario for those. No attack scenario for increased government interference either.

Mobile maybe. Although mobile seems more like a big opportunity than a problem per se. We need to get offline to work, keep pushing performance, introduce more features to enable applications to work without walled gardens, etc. And of course site developers need to stop being silly about mobile and make sites just work. No endless redirects that lose track of your target page, no “install app” spam, no plugins, you know.

Parallel computing. It seems like we will get ever more cores, yet most of the web platform operates on a single thread. Workers exist, but we need to do more.
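
For context, a minimal, hypothetical sketch of what we have today (the worker.js file name is made up): work can be moved off the main thread, but the only way to talk to it is by copying messages back and forth.

    // main.ts: offload a computation to a Worker. "worker.js" is a hypothetical
    // file name; its contents are shown in the comment below.
    //
    //   // worker.js
    //   self.onmessage = (event) => {
    //     const sum = event.data.reduce((total, n) => total + n, 0);
    //     self.postMessage(sum);
    //   };

    const worker = new Worker("worker.js");

    worker.onmessage = (event: MessageEvent<number>) => {
      console.log("sum computed off the main thread:", event.data);
    };

    // Everything crossing the boundary is copied (structured clone), not shared.
    worker.postMessage([1, 2, 3, 4]);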

Would love to learn what other people think the really important problems are.

Within the standards world every now and then modularization comes up. “Standards should be modular!” One could imagine Jeff Jaffe (W3C’s CEO) going on stage at one of W3C’s conferences and yelling “modularity! modularity! modularity!” and one would not be too far from the truth. The standards created, however, are often not modular, but rather bolt-on solutions on top of the existing stack (often bolt-on solutions themselves). So rather than modules that evolve over time, we have an ever increasing set of standards that patch each other, often in non-obvious ways.

One way this is evident is that the software that uses these standards often uses a different architecture. In software CSP is not a single module, but rather part of e.g. HTTP (header processing), fetching (the protocol for retrieving bits from a URL, used by HTML’s img element and every other piece of content that can initiate a request), and some mediation part that enforces the policy. In part it makes sense to design new features separately initially. This helps implementors grasp the work that needs to be done and shows developers how they can make use of the new feature. But long term it harms the understanding of the platform. Say we introduce a new feature that performs a fetching operation. What will its effect be on CSP? What will its effect be on other specifications bolted on top of fetching? You do not just need to use the fetching protocol in the right way, you also need to patch CSP and potentially other places. In other words, the modularity has left the building.
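
To sketch the software shape being described (hypothetical code, not any browser’s real architecture; the Policy type and documentPolicy name are mine): every request-initiating feature funnels through one fetching code path, and the policy check lives there rather than in a standalone CSP module.

    // Hypothetical sketch: CSP as an aspect of the fetching layer rather than
    // a module of its own.
    interface Policy {
      allows(url: URL, initiator: string): boolean;
    }

    async function fetchResource(
      url: URL,
      initiator: string,
      policy: Policy
    ): Promise<Response | null> {
      // A new feature that fetches does not integrate with CSP separately;
      // by going through this single code path it gets the policy check for free.
      if (!policy.allows(url, initiator)) {
        return null; // blocked by the policy, never reaches the network
      }
      return fetch(url.toString());
    }

    // img, script, XMLHttpRequest, and any future feature would all call:
    // await fetchResource(new URL("https://example.com/logo.png"), "img", documentPolicy);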

The standards process needs to become more flexible so the documents that describe the web platform can evolve over time and change shape to meet new demands and constraints. The WHATWG has been pioneering this model for close to a decade now and given the superior class of documents that have been developed it seems about time to take note.

TV manufacturers are digging their own grave. I just moved to London and am trying to acquire a TV. In the stores I noticed they are promoting features once again, rather than ease of use. TVs come with processors these days. Some better than others as the sales guy made clear. A better processor gives you better streaming 3D on your connected TV. Most have a browser and offer a YouTube “app”. I used YouTube once via a remote control — it was a hilarious demonstration of a user interface disaster.

The increase in complexity stands in stark contrast with the added end-user value. Pretty much exactly as we were used to with music players, phones, and computers. Drawing the parallels is easy. What users care about is battery life, usability, aesthetics… Not the number of CPUs a device carries. All I want is a display that I can stream content to, that looks somewhat nice, and is somewhat big. I do not care about connectedness, 3D, 3D glasses, processors, and the myriad of other options.

It seems to me that in their haste to make money out of large monitors they forget that this opens a door. Someone will offer a way cheaper largish display that is integrated with AirPlay (or equivalent) and lets me control it from whatever device I happen to be holding at the moment. No remote control, no features, just “display this”.

As the sad news broke that my former employer would stop developing its own browser rendering engine, John Lilly wrote the following:

What we do know is that in technology, we’ve never been served well by monocultures — we know this for sure. I worry that in our desire for clearer definition, easier standards, faster progress, we’re forgetting that we know this.

Time to get back to preventing that from happening.

Having written about my departure from the W3C last month, it seems only fitting to write about my return now. I was elected to its Technical Architecture Group, as part of a reform campaign driven by Alex Russell. Alex has great ideas on improving modern day web development and I share his passion for figuring out how the web is layered and how we can expose those layers for the world to do something wonderful with them.

In the beginning I had some misgivings about Alex, in particular the Web IDL bashing [1]. I understand now that his concerns are with developers coming from a C/C++ background designing horrific JavaScript APIs. And that “DOM”, for someone from TC39 (the designers of JavaScript), means a Web-IDL-designed API rather than the DOM itself.

I was also pessimistic about this TAG adventure, and since five reform candidates were running for four slots I was hoping I would not make the cut. But now I look forward to it. Taking a more high-level perspective with a different set of people than I usually hang out with will be a great learning experience. And ideally that leads to better standards writing down the road.

[1] To me Web IDL was the first real attempt to make it easier to define JavaScript-friendly APIs with some enforced consistency with respect to argument handling. Its predecessor, aptly named OMG IDL, was terrible in that respect.

Clay Shirky wrote Napster, Udacity, and the Academy the other day. It is well worth reading in its entirety, but let me selectively quote a part that struck me:

Once you see this pattern—a new story rearranging people’s sense of the possible, with the incumbents the last to know—you see it everywhere. First, the people running the old system don’t notice the change. When they do, they assume it’s minor. Then that it’s a niche. Then a fad. And by the time they understand that the world has actually changed, they’ve squandered most of the time they had to adapt.

It succinctly captures what I am involved with and see happening in the web standards world. Now for the second time in fact. First we had the decision in 2004 at a W3C Workshop around Web Applications (now referred to as “The Workshop”) to no longer work on HTML and JavaScript, but instead focus on XML. To counter that decision the WHATWG was formed and proved the W3C wrong. The W3C now heralds HTML5 as one of its success stories.

Now the WHATWG is our new story and it has ably demonstrated that the bureaucracy of the standards process can be circumvented. More became possible, if you will. The second time came when the W3C reaffirmed its stance on a restrictive license for specifications. Specifications belong to everyone. They prove themselves by being in wide use. That, combined with the fact that the W3C Process requires editors to do a lot of make-work, complaints about which went unaddressed during my seven years there, made me feel my time would be better spent elsewhere.