On the Automation of “Care”

Estimated reading time: 4 minutes, 5 seconds

Whenever someone says, “I used ChatGPT to…” I’m now irked. I’m like, “Really? You didn’t care enough to make the effort to…”

It started with the US senator who used ChatGPT to write a speech. Really? Really? You really don’t care enough to write actual words about an actual cause you care about? I know, I know, they have speech writers, but still. I assume those human speech writers actually understand the issues and the stance of the senator.

It happened again with two recent events. Someone used ChatGPT to write part of the commencement address they were giving. And I’m like, “Really? You couldn’t take a few minutes of your time to think of how you might want to inspire young people who are about to graduate and go out into the world? Is this how we inspire them?” And then again, even more so, when I read the news of how the Michigan State University DEI office [correction, I was told: it was not the Michigan State University DEI office that did this. It was the Peabody EDI Office at Vanderbilt University] wrote a message in response to the recent shootings using ChatGPT. Now that was low. Like really low. You’re the goddamn DEI office. You’re the folks who are supposed to care about diversity, not perform it. This is maybe the lowest level of performing equity and care.

Here’s what I’m thinking. When someone says they’ll let the machine do it because “they don’t have time,” that does not really mean they don’t have time, per se. It means that time is a limited resource, and “don’t have time” means this thing is “not a priority.” I’ll delegate to a machine because “I don’t care”: it does not deserve my personal attention, and it does not deserve the time it takes to stop doing other things in order to do this thing.

Whenever I hear of AI tools that will write an email for you, I’m thinking: I understand why we don’t care about a lot of our email, but I assume that behind any email we read/write are promises to do actual work beyond what is written. So don’t we need to stop and think about that first?

I think I totally get why I’m OK with automating something like images, because art is not my thing, and I sometimes just need something quick that doesn’t require an hour of photo editing. It’s not going to be perfect, and I “don’t care.” But it’s not going to be as meaningful as what a thoughtful graphic designer would give me if I were trying to get an image about something I deeply care about.

Whenever I hear a faculty member ask if they can use AI to grade student work (and that has been possible for a long time, btw; this is not a new development), I’m thinking, “Really? Is what you ask your students to do so unimportant to you that you don’t care to see it, give feedback on it, and assess it?” I think when someone asks me this, I want to ask, “Don’t you want to change your assessments to ones you would care to read/assess?” I know this is not always possible, so I rarely say it aloud.

And so, I truly do think that if we end up in a situation where students are constantly taking shortcuts using AI, it’s just an expression of “I don’t care enough about this to spend time on it” – and we should do better in our education, so that students care enough to want to do things that are hopefully authentic, meaningful, and relevant to THEM.

It’s like, if I’m traveling to a country for two days and I don’t have time to learn the language, yes, please, give me some translation AI! But if I’m considering marrying someone who has a different native language that I don’t speak and I want to connect with them deeply, I’m not going to automate that, I’m going to want to learn that language.

We don’t automate what we care about. So I want to ask you: what do you care about? What do we hope our students will care about? Whatever the answers are – don’t automate those. Before we start automating a thing, we should ask ourselves: do we care about that?

And on another note, I have been using ChatGPT less and less (only during workshops where I’m showing people like faculty what it’s capable of because they need to know, or to test something). Because every time I use it, I feel like it is a sign I care less about the people in the global South whose mental health was harmed by the process of making this AI appear more ethical to us as end users.

Feature image made by me on Canva.

8 thoughts on “On the Automation of ‘Care’”

  1. And because AI produces just a statistical probability of care, something that looks at first glance like care, who cares to want that?

    I use ChatGPT only to mock it!

  2. So much here that I feel, relate to, and consider! …and also wonder about… It reminds me of the cards we give one another around special events (holidays, milestones, loss, etc.). Many are (for now) created by real people (?) but then mass-produced so people can look through a few and say, “this one,” sign it, and give it to someone.

    As someone who enjoys writing, I’ve, at times, found this to be a pro forma act that can feel empty: “I spent $2–4 on this generic message, signed it, wrapped it in an envelope, and gave it to you.” I’d much rather get a blank card and write the message myself. I feel like that’s some of what you’re describing in the examples above.

    What about those who aren’t gifted/invested in writing? Or those who can do some types of writing and not others (e.g., feedback that is effective, meaningful, and strikes the right tone)? Can using something like this help in the long run? That is, could it mean not that they don’t have the time to care, but that they know the limits of their abilities?

    Or, to ask it another way: if one outsourced their grading to TAs (as some often do), or even paid someone on Amazon’s Mechanical Turk, is that the same thing? I know you say speechwriters are different in some ways, but is the issue the form of automation, or the failure to dedicate time? (And if it’s the latter, is it the fault of the individual, or of a system that assigns us all far more tasks than we can reasonably fit into the hours of a day?) Are we all taking shortcuts because the systems in play push us to do so?

    don’t mind my rambling–just a Sunday morning here, enjoying my coffee and thinking 🙂

    1. I think Leaton01 is on to something in the final paragraph. I wouldn’t say personally that we don’t automate what we care about (some things, like data collection on my daily running habits, are deeply meaningful but well-suited to automation), but rather that we should consider when our personal attention is most meaningful, for ourselves and for others. In other words, when is “humanizing” in order?

    2. Enjoyed your response, Lance, and I was also just discussing those ready-made messages on cards the other day!

      With those cards… I care to give you a card, but writing heartfelt messages is not something I am skilled at or have time to do (because I am so bad at it, it would take too much time, right?).
      I think because my lens is education, I was thinking of the card analogy in education. If my course is about writing poetry, then I want students to learn to write the poetry on the card themselves, or if they will select a ready-made but suitable one for an occasion, to select the poem among many they read. But if I am teaching advertising and someone wants to include a poem inside an ad, maybe how they came about the poem is secondary now? Something like that? (Not that my argument in a spontaneous post has to hold for every situation, mind you. My post itself is as much of a rambling as anything. I just posted it when the thought came)

  3. Maha, you have brought up an interesting correlation: automating something = not caring about it enough to do it manually/emotionally/thoughtfully ourselves.
    I think about this a lot and go back and forth on the truth in that. A lot of analogies from past tech advances come to mind: simple calculators, scientific calculators, computer programs that do complex calculations, coding manually versus using visual coding apps, etc.
    I also can’t help but think about the difference between “automating” something and “outsourcing” something. For example, my sister, who lives in India, is a great cook, but she is also a new mom and a Team Lead in an MNC, overseeing a team of over 35 people. She employs a cook to outsource daily meals, so her time can be better used juggling her 9-month-old baby and her very busy job. In this case, if she tried to do everything herself (without outsourcing), it would deplete her to the point of not being able to be a good/present mother and/or a great employee/team lead.
    Pivoting a bit, my main thought is this: at some point, we really need to think about how/why we moralize making things easier for ourselves and equate it with not caring, not doing our jobs well, plain laziness, or even feeling guilty about not doing everything ourselves. Why is all this tech not helping us free up more time and create more leisure for us? Us as educators, as students, as workers, as parents, as inventors of this same tech. After all, isn’t that the ultimate goal? To have more leisure time?
    In any case, thank you for penning down something that moved me to think more and even write about it. I love how much you care and how passionate you are about educating in the constantly changing tech-scape of our increasingly global world today.

    1. Thanks for your thoughtful comment. I outsourced (when my kid was younger, but even now) a lot of the home tasks (like cooking), apart from the direct care of my kid, because even within our home-care responsibilities, the one a mom can’t easily be “replaced” in is the emotional care, rather than the cleaning, cooking, etc.

      I hear your point, though, that the connection I made may be rhetorical and not applicable to every situation. Though I still think that even in the example you gave, the priorities are clear: being “present” for work and child meant outsourcing cooking. That seems to still fit. If your sis enjoyed cooking herself, she would outsource something different. For example, I enjoy baking with my child, so I have never outsourced it to my housekeeper, BUT of course sometimes I buy cookies and cakes ready-made! I don’t care about it to the extent that I would *always* do it myself. But I would never outsource kissing my kid in the morning… unless I am traveling, and then I outsource it to my husband or mom or someone I trust, right? Then she gets older and can travel on her own, and priorities shift again?

      1. Thanks for clarifying, Maha, makes sense. I do agree that when we say “I don’t have time,” we are really saying “this is not my priority,” because we always find time for our priorities. Again, thanks for your thought-provoking article.

  4. Here’s a thoughtful essay that parses the vapid message Vanderbilt U used ChatGPT to write, a “heartfelt” message to students about the MSU shooting – https://withoutbullshit.com/blog/about-that-vanderbilt-post-shooting-email-chatgpt-feigns-sympathy-poorly-but-so-do-humans. I’m with you – I use ChatGPT occasionally as someone to brainstorm with when creating workshop outlines and titles. But never to do the writing for me!
