A recent New York Times Magazine feature, "What Google Learned From Its Quest to Build the Perfect Team," concludes that the best teams are those whose members listen to one another and show sensitivity to feelings and needs. What differentiated good teams from dysfunctional groups was how teammates treated one another. The researchers found that in good teams, members spoke in roughly equal proportion, and they were skilled at intuiting how others felt based on tone of voice, facial expressions, and other nonverbal cues.
In these teams, members engaged in emotional conversations, brought their personalities to the office, and felt free enough to share things that scared them. The bottom line is that listening and understanding created the psychological safety that enabled members to feel heard and seen. They were valued not as resources that get work done but as people who want to make a difference and accomplish meaningful things.
Just after reading that article, I happened to read a piece in the February edition of IBM Systems Magazine, Power Systems edition: "IBM Explores How Affective Computing Can Benefit Society." "Ideally, we would like the computer to reply in a natural way, as if it has emotions." Hmmm…just like a person on an effective team. "Most existing applications for affective computing are based on capturing what the emotion is, and recording and reporting that emotion." If we expect our computer systems to identify and name emotions, wouldn't we expect our people to do the same?
It makes me wonder whether we are paying enough attention to how well we understand and manage our own emotions (if you read Feeling It, you'll know I'm working on that one!) and whether we are tuned in to what's going on emotionally with those we engage with.
Researchers are analyzing real discussions and the interactions that take place, evaluating what went well, and using that information to teach the computer how to respond. How do we learn the best ways to respond? Are we tuned in to what's happening in our conversations, and are we seeking our own and others' feedback to learn and get better at engaging?
They are teaching the computer how to read emotions using signals in the voice, such as tone and pitch, as well as facial expressions. The computer is listening with its ears and eyes. Maybe we can take a cue from the computer.
My challenge: In my conversations, I'll see if I can give my whole attention to the other person, listen to the words they use, and watch their face and body for signals. Then I'll respond based on what I discover. I know it's inevitable that I'll get distracted, and when I do, I'll simply notice and bring my attention back to the person.
Want to give it a try? Let's "Listen and Look" as we engage with each other and discover a powerful secret to richer human relationships and more effective teams.
* The photo of that cute furry pup was taken in San Agustinillo, Oaxaca, Mexico.