Goran Peuc has written a pretty interesting post on what web developers should stop doing now that we've hit 2014. While the bullet point format is a bit tired ("5 things you shouldn't do on your website"), it gives a good set of pointers for the people most removed from the non-technical user. However, the really interesting part of the post is Peuc's point about the relationship between horrible user experiences and developers, in discussing PayPal's ridiculous credit card input field:
Yes, you got that right, developers of the site force the user to understand how the backend logic of the website works.
If we extrapolate this a little, it reveals one of the main causes of bad user experiences: developers forcing users to do the developers' job, as my colleague Casper Lemming so gracefully puts it. I know that communication is not always the average developer's strongest suit, and of course, all abstractions are leaky to some degree, but it's 2014. The .com days of "you know HTML? You're hired!" are over — if you're building front-facing stuff, as most web developers are these days, your job is as much a communications job as it is a development job.
Really, then, screw the bullet points — your developers should stop making users do their job.
January 7, 2014 | Permalink →
I haven't said much about this. I don't know exactly what to say that others haven't said better, and I honestly fear what stepping in to become a big part of this discussion as a non-American might mean – a fear that surely resonates with a lot of my peers. The only thing I can say is that it has made me paranoid in that "privacy is my right" kind of way – much to the joy of my paranoid computer science friends, who now get to quote me saying "dammit, you guys were right." But seeing guys from one of the big "targets," Google, start publicly lashing out at the NSA is pretty unexpected. Among everyone who's spoken about this, Brandon Downey says it better than anyone else has:
Fuck these guys.
[...] seeing this, well, it's just a little like coming home from War with Sauron, destroying the One Ring, only to discover the NSA is on the front porch of the Shire chopping down the Party Tree and outsourcing all the hobbit farmers with half-orcs and whips.
While "fuck" is probably not the single most constructive word to use in this context, Mike Hearn took a step back and got it right:
Thank you Edward Snowden.
November 6, 2013 | Permalink →
The arrangement of numbers on a phone — or, by proxy, basically almost any input pad for digits — is virtually a universal standard. As it turns out, though, this universal standard is the result of quite an impressive amount of design and testing work by Bell Labs in the 1950s:
This is yet another example of the fact that the startup world is indeed reinventing the wheel over and over again. "User testing," as it turns out, was not invented by Eric Ries and is not only in the toolbox of people applying the "lean methodology."
September 10, 2013 | Permalink →
The Linux 3.9 release finally introduced a — at least for a networking geek like me — long-awaited extension to the socket model: the SO_REUSEPORT socket option. Not to be confused with the virtually default SO_REUSEADDR POSIX socket option, SO_REUSEPORT has its roots in 4.4BSD and offers the ability for multiple, independent processes to listen on the same port at the same time.
This basically means that from Linux 3.9 onwards, we no longer need to build our own fork(2) hell-based master/slave watchdog contraptions to have multiple processes handle incoming connections efficiently. Instead, we can now leave this to the kernel and just spin up the listening processes we need. As pointed out on the Free Programmer's Blog, this is especially exciting for programming languages that are, mostly due to implementation-specific issues, inherently shitty at or incapable of any kind of parallel execution — like Node.js, Python and Ruby.
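To make this concrete, here's a minimal sketch in Python of what the option buys you (the function names and port are my own illustration, not from any of the linked posts): each worker sets SO_REUSEPORT before bind(), after which several independent processes can all listen on the same port and let the kernel spread incoming connections across them.

```python
import os
import socket


def make_listener(port: int) -> socket.socket:
    """Create a TCP listening socket with SO_REUSEPORT set.

    The option must be set *before* bind(); every process that wants
    a share of the incoming connections does the same dance, and on
    Linux 3.9+ the kernel load-balances accept()s across all sockets
    bound to the same port.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
    sock.bind(("0.0.0.0", port))
    sock.listen(128)
    return sock


def serve(port: int) -> None:
    # Each worker owns its own listening socket: no master process,
    # no passing file descriptors around, no accept() thundering herd
    # to referee by hand.
    sock = make_listener(port)
    while True:
        conn, _ = sock.accept()
        conn.sendall(b"handled by pid %d\n" % os.getpid())
        conn.close()


def spawn_workers(port: int, n: int = 4) -> None:
    # "Just spin up the listening processes we need" -- the children
    # never coordinate with each other or with the parent.
    for _ in range(n):
        if os.fork() == 0:
            serve(port)  # child never returns
```

The telling detail is that a second bind() on an already-listened port succeeds instead of failing with EADDRINUSE, which is exactly what SO_REUSEADDR alone would not give you for two live listeners.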
However, it should probably be kept in mind that while this is, at least in the long run, a godsend in terms of reduced complexity for simple applications that just need some kind of parallel execution, this solution is still not necessarily the most optimal performance-wise as we approach extremes. This could probably do with a round of benchmarks, but for now I'm just glad that my days of being dragged kicking and screaming through people's optimistic implementations — built on a complete disregard for the documented consequences of the forking process model — might slowly be coming to an end.
September 1, 2013 | Permalink →
Say what you want about the tab-hating "king of usability," Jakob Nielsen, but the study released yesterday by his consulting firm, Nielsen Norman Group, on tablet usability — concluding that "flat design and improperly rescaled design are the main threats to tablet usability" — is absolutely spot on. While some of the points seem like more of an observation of the evolution of user interface paradigms we've seen occur numerous times over, his point about the recent "design" (or "undesign"?) trends can be applied far more universally than to the limited scope of big touch devices:
Why not allow users to easily see what they can do? We need a golden middle ground between skeuomorphism and a death of distinguishing signifiers for UI elements.
I, for one, dread the release of iOS 7. Not only because it will make my everyday experience worse, but mostly because less tech-savvy people will likely hit a solid wall, and, like in the Windows days, we're left to clean up the mess.
August 6, 2013 | Permalink →