Estimated reading time: 1 minute, 45 seconds
I came across this article the other day. About how machines will be using AI to understand human emotions better than (some) humans can.
I tweeted this:
The question this article raises for me is this: WHY ARE WE INVESTING MONEY IN HELPING AI understand human emotions, when we could maybe invest in helping HUMANS understand each other???? Why is that not the worthwhile investment???? Cc @14prinsp https://t.co/6cCGrOBfaB
— ℳąhą Bąℓi, PhD مها بالي 🏵 (@Bali_Maha) September 6, 2019
And privately, in a Slack team, I quoted this part of the article:
“With AI mediating our communication, we can look to a future of deeper communication that acknowledges human feelings and emotions”
And my reaction is: Why do we want AI to further mediate already mediated communication? Why aren’t we investing in helping people get better at this themselves????
This is also a bit like… hiring someone to pretend to be your fiancé. Seeking fake human care.
It’s not like we have a scarcity of humans or scarcity of research on psychology, sociology, etc.
Can you imagine depending constantly on a machine to tell you how another human feels, to do that interpretation for you, so much that you forget how to use your natural intuition yourself, you stop exercising it… like people who use calculators so early that they forget how to do basic arithmetic in their heads… or like birds raised in cages who forget how to fly.
Who in the world asked for this? In whose interests is it to train AI to replace (or honestly, even augment) humans in their affective abilities? Are the tech folks so socially insecure that they would rather teach a machine to understand human emotions, instead of teaching themselves how to do so? In whose interests is all this?
I have learned and taught online for years. I can almost always tell emotion from a distance. Sure, in huge situations where hundreds or thousands of people are present, it won't be the same, but that's a problem with the model of scale, which we should reconsider. Or add more humans into it to do mentoring roles. The solution isn't to give up on humanity in this weird way. I'm not valorizing what humans do; they have limitations. But we can invest time and resources into helping humans build relationships and understanding, instead of using machines to do it.