As much as I love programming and good old-fashioned text-based command lines, I have an interest in ergonomics and futuristic interfaces. A few days ago a post entitled “A Brief Rant on the Future of Interaction Design” made the rounds on the Internet. It opens with an old but interesting video and goes on to make the argument that our current obsession with flat touchscreens and simple gestures is doing us all a disservice. Our hands are capable of complex gripping, grasping and touching motions, and having all that expressivity confined to a small, two-dimensional surface with a limited number of motions is self-defeating. The article makes the damning statement: “Are we really going to accept an Interface Of The Future that is less expressive than a sandwich?”
The article helped me express an uncertainty that’s been floating back and forth in my mind for some time. I use my iPod Touch on a daily basis and I’ve been loving the multitouch trackpad on the new MacBooks. I love the swiping motions for window management and moving things around. At the same time, I’ve started drawing by hand again (I loved drawing as a kid) and I realize that putting a pencil to paper is a rather complex but very fulfilling activity. Strangely enough, I think the pencil and the touch-based iOS interface have a lot in common. In both cases, the actual physical device almost disappears, letting you focus on the underlying application. The iPad or iPhone itself is just a thin frame around whatever app you’re using. The pencil is basically just a simple pointer, but it allows us to create an infinite range of images.
However, in both cases the expressiveness offered by the device is not enough. Pencils alone cannot express all the images we might want to create. That’s why we have pens, brushes, chalk, crayons and a variety of papers and canvases. The flat touch interface is also not enough, especially when we are confined to a small surface that fits in one hand. The question, then, is how we can take the simplicity of our current touch interfaces and extend them to a larger set of expressions and interactions.
Case in point: the camera interface on the iPhone. For a long time there was a software button that you had to touch to take a picture. But that meant sticking your finger in the middle of the picture. Normal cameras have a better interface: there is a shutter button on the top that keeps your hands away from the actual image (even if you’re using an LCD screen instead of a traditional viewfinder). This deficient interface on the iPhone led to the Red Pop, a giant red shutter button, and now iOS 5 turns one of the hardware volume buttons into a shutter button.
Having a fluid, upgradeable, customizable software interface is nice, and I like smooth gradients and rounded corners as much as the next guy. But our hands evolved to use actual physical matter, and before computer interfaces we built a lot of interesting physical interfaces. Apple has hooked us on the idea of sleek, smooth devices with no extraneous parts. While it’s great to lose unnecessary knobs and edges, the Apple design philosophy might not be best in the long run, especially if your device’s UI doesn’t neatly fit into the touch-drag-swipe system of gestures.
Ultimately it would be great to have “smart matter” physical interfaces – the flexibility and programmability of software combined with the physical usability that solid matter offers. Imagine some sort of rearranging material (based on some form of nano- or micro-technology, maybe?) that can be a simple smooth shell around your interfaces but can change to form buttons, sliders, knobs or big red shutter buttons as your application requires. But in the years (decades?) between now and then we need other solutions. The range of accessories and extensions available for the iPhone (including the Red Pop, tripods, lenses etc.) suggests that an enterprising young device maker could use the iPhone (and its successors and competitors) as a computing core to which they can attach their own physical extensions. With a more open and hackable platform (an Android–Arduino hybrid, perhaps) we might see a thriving device market alongside the app market. Am I a dreamer? Hell yeah, but as the projects I’ve linked to show, I’m certainly not the only one.