Emotion in conversation – notes for EuroIA18
This post is the start of public prototyping for a workshop I’m running at EuroIA in Dublin.
The conference theme is Humanogy: the merging of humanity and technology.
The workshop is called Architecting Emotions. The conference description is as follows.
Emotion in the moment of perception affects how both human and artificial intelligences make meaning and take action. As Information Architects, we also need to offer clarity to designers and users so they can understand and communicate their emotional needs and wants. Our ability to enable connection and compassion is impossible without learning new skills in emotional clarity.
This workshop uses ideas from Non Violent Communication and NeuroErgonomics to enable participants to understand:
- how emotion in perception and meaning making works
- how to understand emotional clarity in interaction design for both human and artificial intelligences
- how important emotion is to the future of humanogic design
This workshop will mix theory and practice to provide new ways of both understanding and communicating emotion in design discussions and usability work. Rather than viewing emotion as a barrier to human/artificial intelligence relationships, we can apply tools to empower and enable. Understanding clarity of emotional communication is also a key issue in the new design areas of neuroergonomics and neuroplasticity. These areas underlie Humanogic Design – merging human and artificial intelligences and cognition.
Conversational design and voice user interfaces (VUI) are popular topics at the moment. I do not know how to programme them (though I have played with systems like DialogFlow), and neither this post nor the workshop is about programming VUI.
However, I am interested in exploring how we, as humans, and our technologies can communicate more clearly with emotion.
Non Violent Communication
I’ve been interested in a technique called Non Violent Communication (NVC) for a few years. I have had some training in it, but it is hard to practice (I’ll loop back to this later in the post).
The core idea of NVC is compassionate connection.
Communication that clearly links the needs of those speaking to the willingness of those listening to help, with emotion and empathy in between.
NVC is a highly structured (and successful) method for improving human communication. It is that structure which I was thinking of for emotion in conversational design. The structure is important because it helps highlight a few problems for both humans and technologies in talking to each other.
As I said, I like NVC but find it hard to practice.
NVC tries to make communication clearer by making needs easier to understand. It does this by building Requests upon Needs, which are founded in Feelings, which arise from Observations.
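That chain from Observation to Request is regular enough to sketch as a data structure. This is only my own illustration of the idea — the field names and example sentence template are my guesses, not official NVC material:

```python
from dataclasses import dataclass

# A rough sketch of the NVC chain: an Observation grounds a Feeling,
# which signals a Need, which motivates a Request.
# (Field names and the sentence template are illustrative only.)

@dataclass
class NVCStatement:
    observation: str  # what was seen or heard, without judgement
    feeling: str      # the emotion that arises from the observation
    need: str         # the underlying need the feeling points to
    request: str      # a concrete, doable ask

    def as_sentence(self) -> str:
        return (f"When I see {self.observation}, I feel {self.feeling}, "
                f"because I need {self.need}. Would you be willing to "
                f"{self.request}?")

statement = NVCStatement(
    observation="the meeting ran an hour over",
    feeling="frustrated",
    need="predictability",
    request="agree an end time before we start",
)
print(statement.as_sentence())
```

Writing it out this way makes the conflation problem visible: the structure forces you to separate the feeling from the need, which is exactly what we find hard to do in the moment.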
It is very easy to conflate needs and emotions because they are viscerally mixed within us, as humans. Understanding our feelings and how they underlie our needs is important.
People are terrible at identifying and naming their emotions. Most people have limited vocabularies for emotions.
NVC offers words.
Words for Feelings and words for Needs.
We cannot ask if we do not have the vocabulary to speak.
We cannot be clear in our request unless we align our needs with our emotions.
For the workshop, how do we help centre users around such clarity?
How do we do this in a way that respects the breadth of ways in which humans feel?
Using NVC with non-human intelligences highlights a lot of problems.
At this point I am also drawn to issues raised by Antonio Damasio in his latest book The Strange Order of Things. The book deals with a vast number of ideas around evolution, feelings and consciousness. What matters to Damasio is that our feelings are embodied in our viscerality as physical beings. Compassion for others comes out of our own ability to sense pain. Empathy is founded in self-awareness. He argues that feelings, not intelligence, define the success of humanity (which also links off to The End Of Average).
So, if viscerality is what makes us human, what does that mean for our technologies?
We expected robot servants like Robby The Robot.
We got disembodied voices like Alexa.
But if Damasio is right then these voices can never have feelings, compassion or empathy as they do not have bodies.
This is a crucial point for future design and it’s why I wanted to run a workshop for people to explore it.
NVC for technologies
NVC offers a number of ways of thinking about Listening and Acting upon requests.
Because they are highly structured, they appear usable by technologies. Perhaps that structure offers a route around the lack-of-viscerality problem?
What cannot be done, due to lack of physicality, could be done through structured language and character. VUI with structured listening could mimic compassion, not completely but maybe enough.
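To make "structured listening" concrete, here is a deliberately crude sketch: match an utterance against a tiny feelings lexicon and reflect it back with a guessed need. The word lists and the feeling-to-need mapping are my own illustrative inventions, not drawn from NVC training material or any real VUI toolkit:

```python
# A toy sketch of structured listening: spot a feeling word in an
# utterance and reflect it back with a guessed underlying need.
# The lexicon and mapping below are illustrative assumptions only.

FEELING_TO_NEED = {
    "frustrated": "things to go more smoothly",
    "lonely": "connection",
    "anxious": "reassurance",
    "tired": "rest",
}

def reflect(utterance: str) -> str:
    words = utterance.lower().split()
    for feeling, need in FEELING_TO_NEED.items():
        if feeling in words:
            return (f"It sounds like you're feeling {feeling}. "
                    f"Are you needing {need}?")
    # No feeling word recognised: invite the speaker to say more.
    return "Can you tell me more about how that feels?"

print(reflect("I'm so frustrated with this form"))
```

Even something this shallow shows the shape of mimicked compassion: the system does not feel anything, but the structured reflection ("you're feeling X, are you needing Y?") may be enough to help the human clarify their own need.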
How do we design for better listening? Not just language comprehension but actual listening and reflection. Actual conversation not just interaction.
Designing a workshop
I am only just starting the workshop design but I know it will involve a lot of role play, observation and conversation.
- Practicing and observing speaking emotionally with clarity of need and request.
- Practicing and observing listening empathetically with reflection and compassion.
Here are some cards I may use to help people talk and listen.
I will update as I go along.
Just to finish, I do need help with this workshop. I do not have deep knowledge of semantics, Natural Language Processing or VUI design, so any pointers to good work in those areas would help me. I have read books like Erika Hall’s Conversational Design and Sherry Turkle’s Reclaiming Conversation and messed around with chatbot construction systems, but that is not deep knowledge.
If you know anything else that seems relevant, let me know. I’m trying to share ideas and help people explore possibilities at the workshop so I’m open to things I haven’t encountered.