A year and a half ago, I wrote about just how unimpressive and uninteresting HTTP 2.0 is. At the time, I called out the IETF on its decision to simply repackage SPDY, and while I got a bit of flak from a few of the people involved, nothing really seemed to change. Since then, little has happened: the working group is still mostly bickering about what exactly HTTP 2.0 is supposed to be, rather than coming up with concrete solutions.

However, it seems like the working group is slowly starting to feel the pressure of releasing something, as Mark Nottingham today posted a very interesting entry to the mailing list:

The overwhelming preference expressed in the WG so far has been to work to a tight schedule. HTTP/3 has already been discussed a bit, because we acknowledge that we may not get everything right in HTTP/2, and there are some things we haven’t been able to do yet. As long as the negotiation mechanisms work out OK, that should be fine.
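The "negotiation mechanisms" Nottingham refers to are what would let HTTP/3 supersede HTTP/2 later: for cleartext connections, the draft reuses HTTP/1.1's Upgrade header, so a client offers `h2c` and falls back to HTTP/1.1 if the server declines. A minimal sketch of that handshake (header names from the draft; the helper functions and host are illustrative, not from any real library):

```python
# Sketch of HTTP/2's cleartext upgrade negotiation: the client sends a
# normal HTTP/1.1 request offering to switch to h2c; a server that speaks
# HTTP/2 answers "101 Switching Protocols", and any other status means
# the conversation simply continues as HTTP/1.1.

def build_upgrade_request(host: str, settings_b64: str) -> bytes:
    """Build an HTTP/1.1 GET that offers an upgrade to h2c (cleartext HTTP/2)."""
    lines = [
        "GET / HTTP/1.1",
        f"Host: {host}",
        "Connection: Upgrade, HTTP2-Settings",
        "Upgrade: h2c",
        f"HTTP2-Settings: {settings_b64}",  # base64url-encoded SETTINGS payload
        "",
        "",
    ]
    return "\r\n".join(lines).encode("ascii")

def server_accepted_upgrade(response_head: bytes) -> bool:
    """True if the server agreed to switch protocols, False otherwise."""
    status_line = response_head.split(b"\r\n", 1)[0]
    return status_line.startswith(b"HTTP/1.1 101")

# A server that ignores the offer just answers with an ordinary status code,
# and the client keeps speaking HTTP/1.1 -- nothing breaks:
assert not server_accepted_upgrade(b"HTTP/1.1 200 OK\r\n\r\n")
assert server_accepted_upgrade(b"HTTP/1.1 101 Switching Protocols\r\n\r\n")
```

Over TLS the same choice is made via the ALPN extension during the handshake instead of an Upgrade header. Either way, this is the escape hatch Nottingham is leaning on: as long as clients fall back gracefully, a future HTTP/3 can be negotiated the same way.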

In other words, the working group seems to be realising that it has gotten nowhere in years. But rather than admitting that they're stuck and need to start from scratch, they're pushing on through with a new HTTP standard that's subpar at best, planning to fix it in a later version. While I generally applaud people taking incremental steps, HTTP 2.0 is nowhere near incremental, and HTTP is no laughing matter. HTTP 1.0 has been in active use since 1996 – its successor, HTTP 1.1, since 1999 – so to think that we can just push through a crummy version 2.0 and fix it later is absurdly naïve at best. I'm rendered virtually speechless by the fact that the supposedly best people in the industry to undertake this task can take such a short-sighted stance on HTTP – they, of all people, should know just how bad technical debt is for the industry to lug around.

Luckily, there's at least one person on the mailing list who maintains an actual implementation of HTTP: Poul-Henning Kamp, author of the Varnish cache. While Kamp has been a vocal opponent of many parts of HTTP 2.0 over the last couple of years, Nottingham's post finally prompted him to call the working group out on its crummy job:

So what exactly do we gain by continuing?

Wouldn’t we get a better result from taking a much deeper look at the current cryptographic and privacy situation, rather than publish a protocol with a cryptographic band-aid which doesn’t solve the problems and gets in the way in many applications?

Isn’t publishing HTTP/2.0 as a “place-holder” just a waste of everybody’s time, and a needless code churn, leading to increased risk of security exposures and failure for no significant gains?

The rhetorical wording aside, Kamp hits the nail on the head. Going down the path that Nottingham seems to be indicating would mean nothing but pain for the industry as a whole. So I can only echo Kamp's closing words:

Please admit defeat, and Do The Right Thing.