Attending to What Matters: A Dialogue on AI, Empathy, and the Costs of Convenience

An interview with Dr. Jennifer Jill Fellows, Philosophy & Humanities instructor


As artificial intelligence continues to expand into classrooms, workplaces, and even our private lives, Dr. Fellows reminds us that technological convenience always comes with a cost.

A philosophy instructor and Associate of Arts Coordinator at Douglas College, Dr. Fellows is currently researching our growing relationships with AI companions — from “grief bots” modeled after loved ones to conversational agents like ChatGPT and Replika. Her upcoming course, Ethics and Technology (Winter 2026), will explore these themes in depth.

A central point in the dialogue is that even when chatbots aren’t designed to be companions, many people still form emotional attachments to them. We anthropomorphize — we imagine what it’s like to ‘be’ them — and that’s both a human superpower and a vulnerability.

In our conversation, Dr. Fellows reflected on what happens when we begin offloading moral and cognitive skills to machines. While using AI for efficiency may feel harmless, she cautions that it mirrors sending a robot to work out for us: we don’t actually strengthen our own intellectual or ethical “muscles.”

She draws on philosopher Shannon Vallor’s concept of moral deskilling, suggesting that empathy, care, and attentiveness are not fixed traits but skills that must be practiced. When digital companions respond without real needs or suffering, we lose opportunities to cultivate genuine empathy — even as loneliness rises.

Dr. Fellows also highlights the environmental and human costs of generative AI: massive energy use and the emotional toll borne by underpaid content moderators who make these systems safe for public use. “All convenience comes with a cost. You give something up to gain something,” she says.

For educators, her advice is both practical and profound: “And so I would ask people to be very mindful about what the costs are to themselves, what the costs are to their communities, what the costs are to the planet at large, and to really think about whether and how if they’re going to use these tools, how they’re going to use them, whether they’re going to use them. And for educators specifically, my other piece of advice would be to really think about what we want students to gain from these classrooms.”

Perhaps the deepest lesson came near the end of our exchange. Quoting the French philosopher, mystic, and political activist Simone Weil, Fellows emphasized that true attentiveness means holding one’s desires in check to receive from others and from the world. In an attention economy designed to feed our wants, that kind of listening may be the most radical skill of all.

A few links to topics mentioned in the recording

  • Thomas Nagel, “What Is It Like to Be a Bat?”
  • Joseph Weizenbaum and the ELIZA chatbot
  • “Authenticity in the Age of Digital Companions” by Sherry Turkle
  • “Long May You Run” by Neil Young
  • MIT research on the impact of AI use on the brain
  • Simone Weil – Attention as a Moral and Psychological Act
  • Cyborg Goddess podcast episode with Dr. Nicole Ramsoomair: The Costs of Convenience

Here are three lists addressing categories of concern with generative AI and technology, based on conversations with educators. Nothing definitive; just playing with the ideas.

Human skills and tasks we feel comfortable offloading to technology

  • Data storage, retrieval, and archiving
  • Complex calculations, analytics, and pattern detection
  • Information searching, sorting, and summarizing
  • Scheduling, reminders, and basic administrative workflows
  • Navigation, mapping, and real-time traffic updates
  • Translation for basic comprehension (not nuance or cultural depth)
  • Routine manufacturing, repetitive assembly, and hazardous tasks
  • Predictive maintenance and environmental monitoring in buildings or infrastructure

Human skills and activities we feel unsure about offloading to technology

  • Decision-making in morally ambiguous or high-stakes contexts
  • Education design and assessment that require understanding of learner needs
  • Creative work where originality, cultural resonance, or personal meaning matter
  • Complex negotiations and conflict resolution
  • Emotional support, counselling, and mentorship
  • Interpretation of art, literature, and cultural heritage
  • Environmental stewardship decisions with long-term community impact
  • Cross-cultural communication where subtle context matters

Human attributes that we need to protect, rescue, and restore

  • Empathy, compassion, and ethical discernment
  • Deep listening and presence in interpersonal interactions
  • Embodied awareness and sensory connection to place
  • Critical thinking and independent judgment
  • Patience, persistence, and the capacity for sustained attention
  • Civic engagement and shared responsibility in community life
  • Imagination and the ability to envision alternative futures
  • Wisdom drawn from lived experience and intergenerational knowledge
