My space, my sight: on person-centred information in an augmented world


This post is an exercise in public prototyping of language and ideas for a workshop on information design.

It’s about how to think about information in an augmented and mobile data world. How to think of information that has physicality beyond the device screen. Information that gains and loses meaning by its apparent proximity. Information that is visible to some but not to others in the same physical space.

I’ll start with three concepts:

  • Particularisation of space
  • Proxemics
  • Perception

Particularisation of space


Particularisation of space is a fairly simple idea, and historically it was not that important to people in their information planning or use.

It’s an architectural term for defining what a place is designed for. Thus a big room with a kitchen attached is designed to be a restaurant dining room. A small room with a toilet and hand basin is a restroom.

This obviousness is overlaid by maps and signs to make quite sure people know what the physical space is for.

With personal, mobile devices this provider-defined sense of space and its purpose breaks down.

A restroom is where a person can be working out how to dump their current Tinder hook-up and find another.

A restaurant can be where a person is doing a multitude of tasks. To be honest, they always were, but somehow doing those tasks on a smartphone seems improper.

The defined use of spaces by owners and designers is breaking down. From a user-centred aspect, this is not a problem. It is the personal intent in the space that matters, not the provider definition of what that space is.

The breakdown of particularisation of space affects not merely those who own and design the physical places but also the planners who license and authorise use.

It clearly affects how we design apps and devices that enable user intents within physical spaces. The invisible piling up of capacities and capabilities in a co-shared place could create awful stresses and misunderstandings for all of us.

Proxemics


Proxemics is a generalised concept of how people use the space around themselves in different cultural ways. I’m just interested in the information side for now.

The reason Proxemics is of interest to me is that it is the crossover point between two research areas that matter in my work on tactile design: Embodied Cognition and Perispace.

From a person-centred design viewpoint, the physicality of the person and the space just around them matter as they both hold knowledge and memory.

The idea that knowledge and memory are held in one part of the body, the brain, is wrong. We embody both our memories and our capacities.

For example, sitting in a particular way, like at a school desk, can recall memories that other poses cannot. This is because the memory is held in physicality as well as brain neurons.

This is one part of Embodied Cognition. Andrew Hinton’s Understanding Context is probably the most helpful book if you want to get some grounding in this area.

How we use things within arm’s reach is a more specific part of embodiment.

This is Perispace: the knowledge and memory space defined within our reach, within the grasp of our hands.

Humans are tool users and thus the tools we allow near us and the tools we place in our hands are extremely important.

The strong acceptance and strong rejection of smartphones by people shows how this space matters. What you let near you, what you let into your hands is part of your choice of who you are and who you want to be.

Proxemics is the broad concept that allows us to talk about how information enters and leaves the space near a user. We need to be mindful and respectful of that physical space as it has always been an information space.

Perception


Finally, for now, Perception.

I work a lot in sensory perception, emotion and meaning so it is an area I’m content to discuss.

However, in a world of intersecting information spaces that partially share physical place and partially extend beyond, it is becoming more difficult to talk of what space or place I’m describing.

Interaction17 had a great talk by Brenda Laurel on Virtual and Augmented Reality. I’d like to borrow some of her words as they help clarify information spaces for me.

She speaks of Prime space.

The physical world we live in and primarily experience.

Yet we need to talk of extended, intersecting and adjacent spaces too.

Perhaps, Prime Plus for Augmented Reality. Prime 1 for mobile digital spaces. Adjacent 1 for Virtual Reality.

I’m not too worried by the specific words. I’m more concerned that we all know that we need to have a shared vocabulary or else many design conversations will become hopelessly confused.

The End

I’ll probably need to think about Augmented Memory and Cognition too, but I have no words for those yet.

This post is merely laying out a space for us to explore and talk about together. I’m also thinking about how to structure an experiential workshop at a conference.


Walking and talking about how we sense information in multiple co-located spaces.

Update 27th March 2017

EuroIA workshop

Walking Through Information workshop

It’s going to be a walking, strolling and sitting around type of workshop to try and understand how the different rooms of the conference hotel work as co-located information spaces.

Tools

I was re-reading Kalbach’s book Mapping Experiences, which is very good.

The models of UX and Service Design within the book don’t quite work for me as I’m approaching from a human-centred viewpoint.

Adapting existing tools from sensory audits of museums, my approach is primarily about understanding the journey as a sequence of moments from the user’s view.

The secondary aspect is breaking the journey into Threshold moments rather than Touchpoints. The movement of a person through place and time is generally divided by the crossing of thresholds (or interstitials).

Mapping what can be seen at these moments seems to be worthwhile.

Sketch prototype

As noted in the original post, there are difficulties in trying to specify and organise co-located information spaces.

The language in this tool is attempting to embody the differences by talking of looking up, looking around and looking down. This is about trying to share metaphors that might make sense to people.

The idea of looking down at devices to look at information delivered through ebooks, apps and websites seems reasonable. Describing these spaces as Adjacent is more difficult, but describing them as Digital is just not helpful anymore.

The idea of looking around to see (and hear, smell, etc.) the physical environment seems reasonable. Describing it as Actual is again difficult, but that relates to the unhelpful real/digital divide that makes no sense.

Finally, looking up for Augmented information. To some extent, these are the not-quite-existing spaces that need planning for. The idea is of overlaid information spaces that enhance or extend a person’s perception and comprehension. As looking down and around are already taken, up is the only way to go.

More to do

If you’d like to chat or have some ideas of how this area of work can be made more accessible to more designers, please contact me.

Written by

Sensory Design Consultant, usability researcher and workshop facilitator. www.linkedin.com/in/alastair-somerville-b48b368 Twitter @acuity_design & @visceralUX
