This is the text of a quick talk I gave at Generate Conference in London on 17th September 2015.

You can download the slides from Slideshare.

You can find out more on SensoryUX.com.

“I’m here to talk about sensory design and what I do, which is to try to make information meaningful.

Meaningful to people with physical or cognitive impairments.

I can take visual information and convert it into tactile form, or convert complex textual information into simple graphics or audio.

For example, last year I worked on a tactile map of the new World War One exhibition at the Imperial War Museum. You can go and visit it. The map serves two purposes: for people with visual impairments, it is information about what to expect and where to go; for people with sensory processing disorders (or anyone who is finding the tour a little stressful), it is a map of where to find a peaceful seating area. It’s information for different physical and cognitive needs.

This is multimodal design: choosing the right senses to deliver the appropriate information at the right time, in the right place.

Where this crosses over with digital design is in the new areas of post-screen interaction, wearables and the Internet of Things. All of them are about the person, at a time and in a place, and not simply the screen.

In order to design for these new technologies and new experiences, you need to understand how human beings sense, adapt and make personal meaning.

I’m going to explain a few relatively complex things and also I’m going to con you.

It’s a trick but not a cruel trick.

So let’s start.

In order to understand sensory design you have to understand embodied cognition.

The basic argument is that the body and the brain are fundamentally the same thing. There is no separate brain. You are your body and your mind.

Thought is not simply in your brain. It’s distributed through the action of your body. You move, you think, you act.

Let’s start with just how individual cells operate: they sense and they take action. Neurons (which are in your brain and throughout your body) in particular are about action.

We are more than individual cells now, and so we take sensory inputs, make meaning and then act.

But with five senses, there’s too much sensory input to easily make meaning.

So we coat some sensory inputs with emotion.

Emotion is the way that we can exaggerate certain sensory inputs so that we can find meaning faster and act quicker.

But we don’t just have five senses, we have nine, perhaps even more. It’s all too complex: so much sensory content mixed with waves of emotion.

And so we have consciousness.

The You that you think is You.

Your consciousness is there to seize a meaning from the waves of stuff, to make decisions and to find a way of sorting all that complexity of senses and meanings.

Thus we sense context and content.

We adapt to the complexity of context and content.

We adapt to complexity through embodiment.

We adapt to complexity through emotions.

We adapt to complexity through consciousness.

So what do you do as a web developer?

You support embodiment through personalisation.

But how? I will give you two quick ideas today.

Firstly, sensory mapping.

All of us are standing (or sitting) in a place, at a time, with our senses. It’s important, however, to realise that our senses aren’t just a spectrum.

We are balanced on the axes of conscious and unconscious sensing and desires. We all both want to seek and wish to avoid sensory experiences. We want to find content and to avoid content. To be attentive and inattentive.

Understanding that in the design process is hard, but we do have ways of discovering these individual differences. There are questionnaires which allow us to explore and map these embedded biases.
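As a purely illustrative sketch (not something from the talk: the type names, senses list and scoring scale are all hypothetical), the output of such a questionnaire might be modelled like this, with each sense scored along the seek/avoid and conscious/unconscious axes described above:

```typescript
// Hypothetical data model for a sensory map; names and scales are illustrative only.

type Sense =
  | "sight"
  | "hearing"
  | "touch"
  | "smell"
  | "taste"
  | "proprioception"
  | "balance";

// Each sense is scored on the two axes from the talk:
//   seekAvoid:  -1 (strongly avoids) .. +1 (strongly seeks)
//   awareness:  -1 (mostly unconscious) .. +1 (highly conscious/attentive)
interface SensoryScore {
  seekAvoid: number;
  awareness: number;
}

type SensoryProfile = Partial<Record<Sense, SensoryScore>>;

// One imaginary participant: seeks touch, consciously avoids loud sound.
const participant: SensoryProfile = {
  touch: { seekAvoid: 0.7, awareness: 0.4 },
  hearing: { seekAvoid: -0.6, awareness: 0.9 },
};

// A design decision can then be driven by the map, e.g. defaulting to a
// quieter, lower-stimulation presentation for someone who avoids auditory input.
function prefersQuietDefaults(profile: SensoryProfile): boolean {
  return (profile.hearing?.seekAvoid ?? 0) < -0.3;
}
```

The point of the model is simply that a person is not one point on one spectrum: they sit in a different place on both axes for each sense, and design decisions can respond to that.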

Secondly, there is sensory framing.

You can design to support embodiment through personalisation.

And here is where I play my trick. This is the con.

When I say you can support embodiment through personalisation, you need to know that the design framing for sensory personalisation is something which already exists.

Like all good tricks: it’s always been in front of you.

Personalisation is accessibility.

In IBM terms, accessibility is hyper-personalisation.

Now, honestly, this is easy to miss. W3C headings, like this, do not make it obvious.

Image: headings from the W3C website

They often treat the impairment as the user, not the human.

Better to look at iOS and its accessibility menu. There you see ways of adapting the device and its user interface to the sensory needs of the individual user.

This is accessibility as personalisation for anyone.

So, to return to the question of How?

Understand user sensory capacities.

Slide: sensory design through mapping & enabling personal sensory capacities

Enable user personalisation.

Map people’s ways of sensing.

Provide a framework for them to adjust to meet their sensory needs.
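As a minimal sketch of what that framework might look like on the web (this is not from the talk; the setting names, storage key and CSS class are hypothetical, and the prefers-reduced-motion media query is a browser feature that became available after this talk was given), a page can start from the sensory preferences the operating system already exposes and then let the person adjust and keep their own settings:

```typescript
// Illustrative only: detect OS-level sensory preferences, then let the user
// override them and persist the result for next time.

type SensorySettings = {
  reduceMotion: boolean; // calmer, less animated interface
  largeText: boolean;    // larger type for low vision or reading fatigue
};

function loadSettings(): SensorySettings {
  const saved = localStorage.getItem("sensory-settings");
  if (saved) {
    return JSON.parse(saved) as SensorySettings;
  }
  // No saved choice yet: start from what the device already tells us.
  return {
    reduceMotion: window.matchMedia("(prefers-reduced-motion: reduce)").matches,
    largeText: false,
  };
}

function applySettings(settings: SensorySettings): void {
  // Flip a class and a style that the page's CSS can respond to.
  document.documentElement.classList.toggle("reduce-motion", settings.reduceMotion);
  document.documentElement.style.fontSize = settings.largeText ? "125%" : "";
  localStorage.setItem("sensory-settings", JSON.stringify(settings));
}

applySettings(loadSettings());
```

The detail matters less than the shape: sense what the person’s device already knows, offer controls to adjust it, and remember the result so the interface keeps meeting their sensory needs.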

And if you do that it means you design and enable something even bigger.

Something extraordinary.

User personalisation enables diversity.

Diversity is normal.

Neurodiversity, sensory diversity, gender diversity and racial diversity.

All normal, all open to support through personalisation.

With your skills, you can do things I cannot do.

You can broaden experiences.

You can deepen experiences.

You can enrich lives.

Start making use of our senses; enable people’s abilities to do more in the way that makes sense to them, personally.

Thank you.”

Written by a Sensory Design Consultant, usability researcher and workshop facilitator. www.linkedin.com/in/alastair-somerville-b48b368 · Twitter: @acuity_design & @visceralUX
