Discussing software, the web, politics, sexuality and the unending supply of human stupidity.


A fictional conversation about progressive enhancement

“I am disappointed by modern web development. Too many bloated frameworks, too much JavaScript, single-page web apps, hashbang URLs—it’s all a bit over-engineered. We have lost the old techniques of progressive enhancement, and in return we have ghastly nonsense like infinite scroll, which looks nifty but does not really improve the user experience. It all seems a bit like we have reinvented the era of Flash intros, but we think it is so much better because we have made all this pointless bullshit in JavaScript rather than Flash.”

“I take your point, granddad. Perhaps this technology is excessive for mere web sites but we are building web apps now.”

“At some point someone will give me a clear explanation of the difference, riiight?”

“Well a web app is something you can’t really experience without a whole lot of scripting. Like, you can’t progressively enhance it.”

“So a web app is defined as a system that requires the JavaScript excesses for it to work. And the argument for the JavaScript excesses is that we need it to build web apps. That sounds a teeny bit circular to me.”

“Bah. Logic. I don’t need logic. Just because you can’t fit it into your theological categories doesn’t mean there isn’t a distinction. Like, I can point to clear examples of web apps. Gmail! Google Docs! They don’t make sense if you don’t understand them as apps. They don’t fit that old fashioned web pages with little blobs of progressive enhancement model that you grumpy old Luddites keep banging on about. If I want to build Google Docs, I need to do it in the new way.”

“You make a good point. You do kind of need a modern browser with bells and whistles to be able to edit a spreadsheet in Google Docs. The user experience of using that in Lynx is going to suck, so perhaps you don’t really need to support that.”

“See, this brave new world of apps is not so scary! Shall I help you with your Gulpfile now?”

“Let’s not be too hasty. I mean the argument is that Google Docs is completely useless without all the modern front end stuff all working.”

“Exactly!”

“And there is literally nothing you could display to someone viewing a Google Docs spreadsheet or word processing document if, say, their browser had scripting turned off?”

“Absolutely. This is why you need to approach it with an app mindset rather than a document mindset.”

“What is the user editing in Google Docs?”

“Well, rich text files and spreadsheets.”

“Which are types of what?”

“Umm… documents.”

“Can you repeat that word for me?”

“Oh fuck. Documents. You got me.”

“So what could you do if the user loads the page in a browser that doesn’t have the capabilities to edit the document?”

“Well, we could display the document, I guess.”

“And what technology do you need to render rich text and tables in browsers?”

“You know the answer already. HTML and CSS.”

“And if your browser can edit the document—”

“—then it loads the relevant code to edit it. It is still progressive enhancement! I get it.”

“And you can even use your silly Node.js reimplementations of GNU Make if it makes you happy.”


Jeremy Keith has smart things to say about browser support.

I’ve been screwed over in the past by saying “we’re going to support IE9, IE10 and all versions of Firefox, Safari and Chrome that people actually use”. I’ve done this based on solid data—basically, dumping the data out of Google Analytics and applying the Pareto Principle with a craptastic Python script that extracts the information I need from the morass of the Google Analytics mess.
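The Pareto cut itself boils down to something like this—a sketch in JavaScript rather than my Python mess, with a made-up function name, made-up share numbers and an illustrative 95% threshold:

```javascript
// Sort browser share, highest first, and keep versions until the chosen
// coverage threshold is hit. Data and the 95% figure are illustrative.
function browsersToSupport(shares, threshold = 0.95) {
  const sorted = Object.entries(shares).sort((a, b) => b[1] - a[1]);
  const total = sorted.reduce((sum, [, sessions]) => sum + sessions, 0);
  const keep = [];
  let covered = 0;
  for (const [browser, sessions] of sorted) {
    keep.push(browser);
    covered += sessions;
    if (covered / total >= threshold) break;
  }
  return keep;
}

// e.g. browsersToSupport({ Chrome: 60, Firefox: 20, Safari: 10, IE9: 6, IE8: 4 })
// keeps everything except IE8.
```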

The problem occurs because (a) we don’t build the back-end in a sane, reasonable, agile[1] way (like, say, having good programmers build it in Django; instead, management decides it needs to be built by enterprise Java devs in some monstrosity of a CMS that refers to itself as an “enterprise portal engine” or some other bullshit) and (b) the front-end gets built in some crack-addled, buzzword-compliant JavaScript framework picked because it is on Hacker News and is sexy. If we built websites in a sane and rational way, this shit would be so much less complicated.

  1. By ‘agile’ in this context, I mean simply that it is built with technologies that make it easy to adapt to change based on feedback from design and front-end developers. Like, say, Django or Rails rather than Spring, and deployed on Heroku (at least during development) rather than some half-baked ops process.

Why I'm turning JavaScript off by default

I managed to offend a lot of front-end programmers in the office today by announcing that I was installing NoScript, and enforcing a strict JavaScript whitelisting policy.

I’m quite vocal in my dislike of JavaScript. They seemed to think this was some kind of slight against JavaScript. I’m not a big fan of the language, but that isn’t why I’m installing NoScript. It’s because I dislike an enormous amount of client-side scripting… which just so happens to be done in JavaScript.

I snark a lot about JavaScript, but I’m of the opinion that most of the web would be improved if there were a lot less JavaScript running on it.

I don’t want web designers redesigning the “experience” of using the web. The unification of the user experience of using computers is a positive thing. If you use old software from the early days of computing, you’ll find that everything had a different user experience. If you use Windows or OS X, you’ll know of software that behaves differently from the norm. If you are a reasonably perceptive user, you’ll see it, and then you’ll be annoyed by it. The reason I prefer Pixelmator to Photoshop is that it more closely adheres to the way OS X apps are supposed to be designed. When I use Pixelmator, things like file opening, window management, document navigation and so on are consistent with other applications I use. This makes things more predictable and thus usable.

On the web, the conventions I’m referring to are things like knowing where I am, and having links and navigation elements behave consistently and sensibly. If I right-click on a link and choose “Open Link in New Tab”, and it’s a “proper” link, it’ll open in a new tab. If it is just an anchor element that triggers a blob of JavaScript, it does nothing. Things that look like links behave differently for no discernible reason.
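The fix is not even complicated. If a script leaves the href alone and only intercepts plain left clicks, the browser’s own behaviour survives. A sketch—`shouldIntercept` is an illustrative name, not any library’s API:

```javascript
// Only intercept an ordinary left click; modified clicks (new tab, new
// window, middle click) are left to the browser, so "Open Link in New Tab"
// keeps working.
function shouldIntercept(event) {
  return event.button === 0 &&
    !event.metaKey && !event.ctrlKey &&
    !event.shiftKey && !event.altKey;
}

// An enhancement script would then do something like:
//   link.addEventListener('click', (e) => {
//     if (!shouldIntercept(e)) return;  // let the browser handle it
//     e.preventDefault();
//     loadInline(link.href);            // href still works without JS
//   });
```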

With the triumph of client-side scripting—of “web.js”—we’re going back to the bad old days of computing, but for the most trivial pieces of software. Why does every newspaper or blog have to behave differently, to modify the experience of using a simple system for the retrieval of rich-text documents? It doesn’t. There’s no valid justification for it. It’s a cargo cult: people do it because everyone else is doing it.

The purported justification for it is the creation of “web apps”. As Jeremy points out, a web app seems to be nothing but a web site that requires JavaScript. And the justification for building sites that don’t work without JavaScript is that it’s not a web site, it’s a web app. Needless to say this is circular.

In the era of web.js rather than the old-fashioned web, URIs don’t matter. A URI doesn’t identify a resource—it doesn’t really do anything. At best, the blob of unreadable JavaScript might interpret the URL as an instruction to perhaps load some blob of JSON and render up a stack of semantically-meaningless div elements in the document object model at some indiscernible time in the future. In “web.js”, elements that aren’t div exist primarily as a nostalgic throwback to a gentler era.

Another nostalgic throwback to an earlier era is the idea of progressive enhancement. Thanks to frameworks like Angular and Backbone, you can build applications that contain no data in the HTML document at all. Hypertext without any actual hypertext. What happens if someone views it with JavaScript turned off? Or on an old browser? Well, there’s pretty much one answer to that: they’re fucked.
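The contrast is stark once you write it down. A sketch of the two approaches—the markup shapes and the `escapeHtml` helper are illustrative:

```javascript
// The hypertext page ships the content itself.
function renderHypertext(post) {
  return `<article><h1>${escapeHtml(post.title)}</h1><p>${escapeHtml(post.body)}</p></article>`;
}

// The "web.js" page ships no data at all: a mount point and a script that
// might, eventually, fetch some JSON and build a stack of divs.
function renderWebJs() {
  return '<div id="app"></div><script src="app.bundle.js"></script>';
}

function escapeHtml(s) {
  return s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
}
```

With scripting off, the first page is still a document; the second is a blank rectangle.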

This is apparently a good thing for user experience: you change the user experience arbitrarily on different websites, and have it so the content doesn’t degrade gracefully.

What UI innovations does this give us?

  • How about the return of the pop-up window in the form of annoying JavaScript hoverboxes?
  • How about making the page load many times slower by filling up the user’s browser with endless JavaScript libraries, tracking cookies, useless social widgets and maybe, with the advent of Web Workers, a bit of illicit Bitcoin mining on someone else’s CPU cycles?
  • How about infinite scroll? Because continuing to malloc without ever getting around to free-ing is an awesome idea. Especially on low memory devices.
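That last one is not hyperbole. A naive infinite scroll keeps every batch it has ever loaded and never releases anything; a bounded window is the `free` that’s missing. A sketch, with illustrative names and sizes:

```javascript
// Grows forever: every batch is appended, nothing is ever released.
function naiveFeed() {
  const items = [];
  return {
    loadMore(batch) { items.push(...batch); },
    size() { return items.length; },
  };
}

// Bounded window: old items are dropped once the cap is hit, so memory
// stays flat no matter how far the user scrolls.
function windowedFeed(maxItems) {
  const items = [];
  return {
    loadMore(batch) {
      items.push(...batch);
      while (items.length > maxItems) items.shift(); // "free" the oldest
    },
    size() { return items.length; },
  };
}

const naive = naiveFeed();
const windowed = windowedFeed(100);
for (let i = 0; i < 200; i++) {
  naive.loadMore([i]);
  windowed.loadMore([i]);
}
// naive.size() === 200, windowed.size() === 100
```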

Whenever a browser has crashed on me, or allocated so much memory I’ve needed to restart it, it’s never been because I’ve been reading too many plain, simple web pages that aren’t bloated down with JavaScript. It’s been because of monstrously over-engineered beasts.

Turning that shit off is the first step towards sanity.

Obviously, some sites need JavaScript. If I trust them enough to not fill my browser and RAM with badly-written shit that makes the user experience worse, they’ll get on my personal whitelist. Most sites won’t. If they abuse my trust by making the experience worse for fashion-driven reasons, they get taken back off the whitelist.

Perhaps we could go a step further and share the whitelists and blacklists. A web of trust for client-side code, where the default is “off”.

If we build a community of people keen on having the old web back before it started getting ruined by overenthusiastic client side developers, we might be able to save the web from sliding any further down the ruinous path of “every website a web app (even though we’re not quite sure what one of those is)” and other similar follies.

Do I expect you all to do likewise? No, I’m perfectly well aware that I’m likely to be something of a pariah in my crusade. I’m a gay vegetarian: I’m okay with being in the minority. Whatever.

Do I hate JavaScript? Well, it’s not the language I’d want to code in for the rest of time. I’m not fond of it. But as I said, the language isn’t the issue. If Haskell or Scheme were the language of client-side web scripting, I’m sure we’d see just as many dumb things done with them.

JavaScript is a necessity. I use it on my own site, though not for a great deal. The only person significantly injured by turning off JavaScript on my site is me, because it’s needed for the login system and posting UI.

I do think modern web development has gone down a deeply unwise path. Only through exercising our personal choices can we bring it back. We have mostly stopped the web from being a hellhole of shitty punch-the-monkey adverts by blocking the living shit out of adverts. JavaScript is becoming the new conduit for awfulness. I like the web too much to have to endure any more of it when not strictly necessary.

Introducing awfulness.js

Is your website too boring, functional and usable? Want to make it more exciting and “responsive”? Just include awfulness.js and you can benefit from all these new features.

Badly reimplemented statefulness

In the old days of Web 1.0, long before we created Backbone.js and other client-side frameworks, you could tell what state a web page was in based on messages in the browser chrome. The little ‘e’ would stop spinning and display a gnarly message on the screen telling you that the webpage hadn’t loaded, and give you some idea why.

This is far too user friendly. It’s so much more fun to cryptically reimplement the browser’s statefulness in JavaScript, so awfulness.js does that. It’ll give cryptic loading signals, fail in spectacularly unpredictable ways, not make clear whether it is communicating with the server or not, stop updating to the point where you need to hard refresh the browser tab… all because the old way wasn’t “responsive” enough.

awfulness.js will make it feel like nobody has tested what happens when someone tries to post something on your site and goes into a 3G deadspot, like a train tunnel, in the middle of the upload. It’ll just freeze, with some generic spinny icon going around forever, because why bother respecting things like the browser’s in-built timeout handling when awfulness.js can reimplement one that’s completely pointless and subjective?
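For contrast, respecting a deadline takes about eight lines. A sketch—`withTimeout`, the error message and the ten-second figure are illustrative, not any framework’s API:

```javascript
// Race the request against a deadline, and surface the failure instead of
// spinning forever. The timer is cleaned up either way.
function withTimeout(promise, ms) {
  let timer;
  const deadline = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('request timed out')), ms);
  });
  return Promise.race([promise, deadline]).finally(() => clearTimeout(timer));
}

// Usage: tell the user, rather than pretending.
//   withTimeout(fetch('/post', { method: 'POST', body }), 10000)
//     .catch(() => showError('Your post didn’t go through. Try again?'));
```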

Our user testing has shown that users far prefer silent failure so they can keep the vain hopeful pretense that the action they performed on your site has been successful, even when it hasn’t. That’s a far better user experience than getting a browser message. With awfulness.js, your user will never have to know their tweet, photo, status update, forum post, wiki edit (etc.) hasn’t gone through: because we just don’t tell them. Or we might just give them a cryptic error message that has nothing to do with what’s wrong. This in-built unpredictability between different websites improves the user experience significantly.

Infinite scrolling: because paging is too convenient

One of the features we are most proud to include in awfulness.js has been pioneered by Google Groups: infinite scrolling as a replacement for useful navigation.

Imagine: you find a new mailing list about a project you are interested in. You want to see the early messages to this list. In Web 1.0, boring old web developers used to implement some form of paging. So, you’d go to the bottom of the page, and there’d be some ugly thing like this:
« earliest | ‹ older | 1 2 3 … 999 1000 | newer › | latest »

So, you’d click “earliest” or “oldest” and it’d take you to the last page. Deeply unsexy and unresponsive. Instead, now, with infinite scrolling, when you reach a group with 20,000 posts and you want to see the oldest, you simply scroll past 20,000. Yes, yes, there might be a little memory problem of rendering up a DOM with 20,000 items in it, but whatever. We’re modern JavaScript developers: it’s not like we have to worry about low-memory devices with underpowered CPUs and small screens running on shaky networks with expensive data and roaming charges.

Who needs URIs anyway?

Of course we use hashbangs. Having actual useful URIs for things that actually return content is so passé. Much better to just give you back a big arbitrary blob of JavaScript that will eventually parse this URI-like thing and slowly return you some content. Welcome to the future.
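To spell out the indirection: the fragment after `#!` never reaches the server, which only ever sees `/`, so the client has to reconstruct the resource itself. A sketch—`parseHashbang` and the route are illustrative:

```javascript
// Everything after #! is the client's problem; until this code runs,
// the URL identifies nothing at all.
function parseHashbang(url) {
  const m = url.match(/#!(.+)$/);
  return m ? m[1] : null; // no fragment: nothing for the client router to do
}

// parseHashbang('https://example.com/#!/groups/some-list/topics')
//   → '/groups/some-list/topics'
```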

Coming to an app store near you

Love the user interface, uhh, improvements, but hate the fact that websites remain on that boring old web thing? Well, get this. Awfulness.js will soon be available as a library to make your iPhone and Android development just as awful and responsive as the websites you develop. iAmAwful and Awfuldroid will apply the same sexiness-driven development and responsiveness to native app development, and allow you to ensure your app has features that make it Hacker News-worthy, even if they annoy the living shit out of anyone with more than two brain cells.


I expect you are pretty excited by awfulness.js.

Download now.
