July BayCHI
Wednesday, Jul 12, 2000

Tonight I went to the monthly BayCHI meeting (Bay Area Computer-Human Interaction special interest group) at Xerox PARC.

There were two speakers. The first, Niel Scott from Stanford, talked for about 90 minutes on the struggles of adapting (or augmenting) computer systems for use by the disabled, and how this problem is compounded both by the lack of support within modern OSes for overriding input and output streams, and by the variety of forms a disability, or combination of disabilities, can take. Their solution is essentially to add a physical hardware layer on top of the computer, one that can have its own input and output devices. It translates the computer's video into terms the disabled user can understand (using actual computer vision techniques to recognize what windows, buttons, and icons look like), and in turn takes the disabled user's responses and translates them into keystrokes and mouse movements that are fed into the computer's keyboard and mouse buses. Basically this system takes the computer's digital output, converts it to a form of analog, understands its salient features, and presents them to the user in a digital fashion again. Then the whole process happens in reverse for input.

The second speaker was Dan Russell from IBM's Almaden Research Center. He was a lot more ethereal and visionary, talking about pervasive versus ubiquitous computing, and a future where everything with an embedded processor talks with everything else. Very inspiring, well thought out, and interesting.

What nobody brought up in the subsequent Q&A, and I didn't realize until I was driving home, was that the two speakers were almost talking about the same thing. On one hand Niel was talking about having an arbitrary range of input devices controlling an arbitrary number of different kinds of computing platforms (he gave the real-world example of controlling a Sun box via a Visor with a Bluetooth card and special input software for cursor and text-entry control, as well as voice control of any number of devices), and on the other hand Dan was talking about the coexistence of myriad electronic devices, including the imminent problem of standardizing data protocols and 'manners' for how these devices will communicate with and control each other. The interesting bit is that this lack of standardization is exactly the failure that necessitated Niel's foray into abstracting the computer out of the interface in order to let people control it.

Essentially, in Dan's future world, all of Niel's problems would be solved.

One more thing from Dan's presentation stuck in my mind. He said "The poor are always with you" in reference to the need to design not only for 800x600 24-bit color but for the 9" monochrome screen. I remember thinking that this goes both ways. It's not only the poor we must keep in mind, but the ultra-bleeding edge. The techiest of techies are using WAP phones, PDAs, and AvantGo browsers. Bleeding-edge as they may be, they require a simpler interface than any I've made since Mosaic 1.1. Technology is a curvy road that sometimes loops back on itself for another go.
