A pioneer of augmentation

It’s with great sadness that I have learned that a personal hero of mine, Douglas Carl Engelbart, has passed away, albeit at the respectable age of 88. However, I’m even more saddened by the way the general media is presenting him in obituaries. Lacking a rudimentary understanding of Engelbart’s approach and goals, they reduce his intellect, achievements and work to those of a simple inventor of the early personal computing era, scoping him as merely “the inventor of the mouse.”

But as I sat down and tried to figure out how to describe who Engelbart really was, I realized that the general media is probably well excused. Hell, even as a kid who’s been heads down in computers since he was 8, it wasn’t until a few years ago, when I started to look beyond modern computing as a mere tool, that the seminal nature of Engelbart’s work started to dawn on me. Worse still, it wasn’t until I read Bret Victor’s wonderful piece that the concept of Engelbart became clear to me (despite its somewhat fuzzy conclusion):

The least important question you can ask about Engelbart is, “What did he build?” By asking that question, you put yourself in a position to admire him, to stand in awe of his achievements, to worship him as a hero. But worship isn’t useful to anyone. Not you, not him.

The most important question you can ask about Engelbart is, “What world was he trying to create?” By asking that question, you put yourself in a position to create that world yourself.

Engelbart wasn’t just another gold rush inventor. He doesn’t fit inside the same boxes as even people as great as Steve Jobs. No, he was nothing less than an amazing intellect who, as one of the first, saw the mainstream availability of computing not as the amazing achievement in itself but rather as a means to augment the capabilities of humans. There is no denying that computers truly have transformed the lives of humans and our ability to collectively solve problems — and probably far more than even young Engelbart could ever have imagined. But the popular, naive deduction that this is a natural result of the technological development that ensued is nothing short of wrong.

As Bret Victor so concisely puts it: “Engelbart devoted his life to a human problem, with technology falling out as part of a solution.” But it wasn’t just technology that “fell out” of Engelbart’s work. A pioneering thought leader, as a natural effect of the beautiful purity of his intent, he was and remains both aspirational and inspirational to the technology world as a whole.

His legacy isn’t hypertext, the mouse or any other single piece of work — rather, it’s the total permeation of the technology world with the intent that led to these almost insignificant pieces. He truly was one of the very few pioneers of augmentation history has ever known.

July 4, 2013 |

B-trees from Google

Google has always been great at releasing their internally developed C++ utility libraries to the public. In fact, by now, I’d say that Google’s testing framework, googletest, is bordering on becoming a de facto standard, but other great projects like glog have proven themselves equally awesome. These guys seriously know what they’re doing.

Therefore, it was with great pleasure that I saw a couple of days ago that Google has released yet another library. It got even better when I realised that it was a drop-in STL-esque container library built using B-trees rather than the more familiar data structures such as singly or doubly linked lists, vectors, deques and hash maps. If you’ve ever tried to find a good, readily usable B-tree implementation for C++ that supports anything even in the vicinity of STL iterators, you’ll realise how big a deal the aptly named C++ B-Tree library is.

February 4, 2013 |

Open at any cost

Sir Tim Berners-Lee, the initial developer of the HTTP standard, took to lashing out at “closed platforms” at the Linux.conf.au 2013 conference in Canberra today:

“The right to have root on your machine,” that is, full administrator access to your computing devices including smartphones, is a “key issue”

I agree that administrative privileges on the devices that I own are a definite plus and provide a series of attractive freedoms, but I also understand the appeal of the kind of “walled gardens” that companies like Apple are building with the iOS platform: they provide a stable, solid and, not least, safe environment for applications to be developed and run in. A tradeoff between “open” and a viable business, I suppose.

Berners-Lee didn’t stop there, though. Rather, he continued to speak out against native applications, advising companies to build their applications using open standards instead:

“Use the fact that, more and more, you can do [in HTML5] the things that a native app can do.”

Well, I’m sorry, but HTML5 was never designed for a key requirement of great user experience on the devices running these semi-closed platforms: performance. While it is indeed possible to build a lot of the functionality of a native application using HTML5 these days, it comes at an incredible performance penalty. The transition that companies like Tumblr and Facebook have made from hybrid to fully native iOS applications is a testimony to the fact that despite the recent browser war and the incredible amount of resources that have been put into optimizing rendering and JavaScript engines, HTML5 stacks are still simply too slow.

I’m all for open standards, but I won’t trade superior user experience and usability for them. What’s really needed is unclear to me. Sure, the progression of embedded processors will mitigate some of the performance problems, but at the end of the day, native applications will always deliver superior performance, and for years to come I’m pretty sure the difference will be more than just noticeable. It’s unclear whether a utopian open standard for native applications on embedded devices would really be much of a help either — the competitive advantages of owning and keeping a platform like iOS ahead will very likely drive progress a lot more aggressively than any inefficient standards committee. For now, then, it seems like we’re getting roughly the best tradeoff, and I feel no need to push for open at any cost.

February 1, 2013 |

Notes on distributed systems

Jeff Hodges offers a series of quick notes on the “black magic” of software development: distributed systems. All of his points are inspiringly pragmatic, and while I can nod in agreement at most of them, one in particular is worth emphasising:

Implement backpressure throughout your system. Backpressure is the signaling of failure from a serving system to the requesting system and how the requesting system handles those failures to prevent overloading itself and the serving system. Designing for backpressure means bounding resource utilization during times of overload and times of system failure. This is one of the basic building blocks of creating a robust distributed system.

January 31, 2013 |

Fluctuating normal

Very interesting introspective piece by Mitchell Hashimoto describing how humans’ ability to adapt is reflected in an individual’s perspective and perception of the real world and of “normal.” Of my generation, I’m pretty sure I’m not the only one who nods in recognition at Hashimoto’s first observation of his skewed perception of “normal”:

None of my real life friends understood what I was doing, and my parents were concerned that I was acting abnormally. I had unintentionally changed my perception of normality to spending every waking moment (that my parents let me or didn’t know about) honing my programming skills in order to cheat video games. After all, that’s what all my internet friends were doing. It was normal!

January 23, 2013 |