Design and Analogy

The design is always the hardest part of producing a new product.

By which I mean both the design of good DSP and the design of a productive user experience.
It’s the latter which I’m blogging about in this post.

As a software designer, you begin with a rather significant disadvantage.
With hardware, the user’s first impression is tactile. It’s no secret
that the vast majority of well-regarded hardware designs use military-grade
potentiometers and switches. You touch it, it feels good. Then you hear what it’s
doing to your audio.

With software, the first bite is with the eye. A beautiful user interface goes
a long way to establishing a positive first impression. But the process is somewhat
deeper. The real issue is one of analogy.

Analogy is how we understand the world. We identify similarities between new things
and processes with which we are already familiar, and so quickly acquaint ourselves with the unfamiliar.
The ideal design is one that suggests comfortable analogies.

So why do we like interfaces that look like hardware? Because it’s familiar.
We understand hardware, and we associate certain properties with it. Most
hardware units have a simple, well-thought-out set of controls that let us
adjust things to our tastes: controls that match our existing analogies
for how audio processing works.

EQuality shamelessly matches its interface to high-end console EQs. Rows of knobs
covering all the important tasks that we undertake. Yet there’s a secondary analogy
we’re familiar with: the graph. It corresponds with our understanding of pitch;
it’s a keyboard, with low frequencies on the left and high frequencies on the right.
We understand it because it’s a very familiar analogy: everything from your parents’
hi-fi to your controller keyboard to every graphical EQ you’ve ever used does it
the same way. Because it makes sense.

Watch someone interact with an iPhone/iPad for the first time. The gestures make
sense because they’re perfect analogies for the way in which we interact with real
physical devices; scrolling, twisting, and stretching all behave the same on iOS
as they do in the real world.

We (Krzysztof and I) have spent a lot of time debating what exactly the common
underlying analogies for interacting with processors are. The last few weeks of
development work have been largely taken up by these discussions, and by conversations
with other engineers to try to elicit deeper insight into the matter.

What’s really interesting is that people generally end up with the same internal
models of how things work, subject to their experience, of course. We all imagine
things much the same way. So we try to design user experiences that
are as close as possible to how people expect things to work.

The perfect design is one where everything you try to do with the interface does
exactly what you hoped it would. And that’s what we’re working on.