The 2012 BLUR conference was held last week in Broomfield, CO to showcase developing technologies in Human Computer Interaction (HCI). Far from a showcase of screens, keyboards, and mice, the talks centered on technology that reveals how much humans have been adapting to their input devices instead of the other way around.
The most striking aspect of the conference was the focus on 3D printing. The discussions opened up an entirely new way of thinking about the technology. In addition to using printers to create parts as an alternative to injection molding or machining, a printer can be thought of as a holography machine that works very, very slowly: a method of bringing something on the screen to life, another interface to the digital world. As printing time, cost, and obscurity continue to fall, it will be easy to think of a printed design as just another means of HCI.
Three of the presenting companies represented different 3D printing business models. Printrbot's Brook Drumm showed how cheap and easy it is to get into 3D printing with his Printrbot Jr kit, weighing in at just $399. Bre Pettis, CEO of MakerBot, covered the upper range of home 3D printers with the Replicator 2. Shapeways demonstrated the possibilities of a centralized 3D printing service that brings the best technology to the masses by letting designers post their products for anyone to have made.
Every conference needs one wild, promising technology for attendees to dream about, and InteraXon was just that company. They displayed the Muse, a thought-controlled input device that packages four brain-activity sensors into a svelte headband. The company was even bold enough to let anyone try the prototype with a computer display that responded to the user's mind: in one demo, clenching one's teeth caused bubbles to appear, and a meter in the corner of the screen indexed mental focus. InteraXon is looking to break through the HCI challenges on the consumer level in a big way, offering the Muse for a mere $145.
Orbotix, the company that makes the robotic ball Sphero, came to debut its newest invention, Sharky. Sharky is an iOS app that turns Sphero into an augmented reality system: the iPad displays the camera view with Sphero replaced by a video game character, and Sharky runs around as a representation of Sphero, collecting cupcakes in the user's environment. Adding the real-world robotic ball component takes augmented reality beyond the screen!
Not to ignore the screens we will continue to interact with, Ideum spoke about how they design interactive displays for various clients, including museum education exhibits. It is easy to merely gawk at custom technology that only larger marketing and education budgets can afford. However, their talk brought up GestureML (GML), an XML-based gesture interface language. As more companies develop touchscreen interfaces, the number of gestures, and what they do, continues to grow. GML is a way to map gestures to actions in a standardized way. An application written with GML could change what a gesture does by tweaking only the map instead of the entire application. It is easy to see how per-user gesture customization could be implemented just like any other setting, and follow that user from device to device.
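The idea behind that decoupling can be sketched outside of GML itself. The following Python sketch (with hypothetical gesture and action names, not actual GML syntax) shows how swapping out only the map re-wires the interface without touching the application logic:

```python
# Illustrative sketch of a gesture-to-action map in the spirit of GML.
# All names here are hypothetical, for demonstration only.

# Application logic: actions the program knows how to perform.
actions = {
    "zoom_in": lambda: "zoomed in",
    "zoom_out": lambda: "zoomed out",
    "next_page": lambda: "turned page",
}

# Per-user gesture map: editing only this table changes what gestures do,
# much like a GML file could, leaving the application code untouched.
gesture_map = {
    "pinch_open": "zoom_in",
    "pinch_close": "zoom_out",
    "swipe_left": "next_page",
}

def handle_gesture(gesture):
    """Look up the gesture in the map and dispatch the bound action."""
    action_name = gesture_map.get(gesture)
    if action_name is None:
        return None  # unmapped gesture: ignore it
    return actions[action_name]()

print(handle_gesture("pinch_open"))   # -> zoomed in
print(handle_gesture("three_finger_tap"))  # -> None (not in this user's map)
```

Because the map is plain data, it could be stored as a user setting and carried from device to device, exactly the customization scenario described above.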
After the conference I came back to my favorite keyboard, mouse, and dual 23” monitors for the first time with a sense of archaic workflow. It is clear that these technologies are gearing up for mainstream adoption in the next decade, when we will all look back at the keyboard/mouse combo and laugh.