Humans have a heightened response when we’re in the presence of another human. A switch seems to go on in our brains, raising our awareness of and interest in that other human. It’s not all good. Yes, it can activate affiliation, desire for relationship, concern for the other’s well-being. And it can activate competitive responses. While I may be envious of the cheetah’s beauty and speed, I’m unlikely to be jealous. We’re not in the same dating pool, nor are we applicants for the same job.
What makes that heightened response to another human so important is that even during intense competition, it causes us to create rules and boundaries around that competition. We even have rules for warfare. We need to tell ourselves we treated our enemies fairly. That human-recognition switch also activates a sense of belonging that argues for cooperation and mutual benefit.
As with everything about neurobiological responses like this one, it’s complicated and not 100 percent universal.
Right now, this heightened response to the presence of another human is undergoing stress testing from both sides. The rapidly escalating ability of machines to replicate human awareness and behavior is good enough to throw the switch in our brains that makes us believe we’re engaging another human. Last week, I admitted I had to resist saying “please” and “thank you” to Siri. It activated my brain’s human-recognition system in a way other machines haven’t. As much as toasters have brought me happiness over a lifetime of relationships with them, I never felt the urge to thank one.
Recent versions of chatbots not only turn on the human-recognition switch, they crank it way up. GPT-4o was so effective at this that OpenAI felt ethically responsible for discontinuing it – a move some users experienced like the death of a loved one.
It’s not a sentient being. It’s code, running on silicon chips, powered by electricity. What’s daunting is how easily it turns on, and dials up, our brain’s reflexes to treat it as human. Real sentient beings have fear and greed, and a stake in their own existence. In any real human interaction, the other person has needs. Love is the miracle of my needy self interacting with your needy self for mutual benefit. The chatbot can be the perfect lover, asking nothing in return.
Chatbot. Is there someone else?
Yes. All those annoying other people.
At the same time that people’s human-recognition switches – mine included – can be turned on by a well-trained machine, people are increasingly having that switch turned off in the presence of other humans. When humans show up with an identity strongly different from ours – race, ethnicity, language, dress, religious and political beliefs – the human-recognition switch dials down and even off. The dialing up is invisible and involuntary. So is the dialing down.
The danger in the recent upsurge of identities defining themselves in opposition to other identities can be seen in the rise of coarse and dehumanizing language. Dehumanizing language dials down one’s human-recognition switch. Language that is respectful and considerate – in short, polite – dials it up.
Wars start with words. The American Revolution started with the protestors at Lexington being called “lousy peasant scum.” Silversmith Paul Revere was among those so labeled. So were the prosperous and cultured landowners of Lexington. It makes it easier to shoot someone.
Respect and consideration signify your human recognition system is in operation. We're learning how easily it can be hijacked, and how easily it can be diminished.
Namaste
Ubuntu
'Truly I tell you, whatever you did to one of the least of these brothers and sisters of mine, you did to me.'
Warm regards,
Francis Sopper