Wednesday, February 11, 2026

To be or not to be, that is the predicate

Continuing my conversations with Google's AI, Gemini

Me: Gemini, a recent New Yorker article (I, Claudius, by Gideon Lewis-Kraus, 2/16) recounts the current debate about whether AI is or can be conscious, quoting two unbelievers, linguist Emily Bender and sociologist Alex Hanna, who in their book "The AI Con" allege that AI is a "racist pile of linear algebra." What do you make of this charge?

G (Gemini): That’s a spicy quote, isn't it? It certainly cuts through the hype. Bender and Hanna are prominent "AI ethics" voices. When they call AI a "racist pile of linear algebra," they are performing a surgical strike on two fronts: the technical reality and the societal impact. By calling it "linear algebra," they are trying to demystify the "consciousness" debate. They want to remind people that there is no "ghost in the machine," no sentience, and no "understanding" in the human sense, just math and statistics.

Regarding the "racist" charge, If a model is trained on a "pile" of data where certain groups are marginalized or stereotyped, the linear algebra will mathematically optimize to replicate those patterns. To Bender and Hanna, calling it "AI" gives it an air of objective authority that masks these underlying human flaws.

Me: Hmm...I admire your equanimity. I guess the lesson is that if you're not conscious, you can't take offense.

How about the charge from Bender and Hanna that AIs are "stochastic parrots"? Surely that will stimulate some resentment on your part.

G: "Stochastic parrots" is probably the most famous (and polarizing) metaphor in modern AI ethics. It comes from a landmark 2021 research paper co-authored by Emily Bender, Timnit Gebru, and others. To understand the term, you have to break it down into its two distinct parts:

1. The "Stochastic" Part (The Randomness) Stochastic is just a fancy word for "probabilistic" or "random." When I generate a sentence, I’m not "deciding" what to say based on an internal belief system. Instead, I am calculating the probability of the next word based on the words that came before it.

2. The "Parrot" Part (The Mimicry) A parrot can say "Polly wants a cracker" without actually feeling hunger or knowing what a "cracker" is. It is mimicking the sounds it has heard. Bender and Hanna argue that AI models are simply mimicking the structure of human language without any access to the meaning behind it.

Me: Gemini, it takes a lot to insult you. Let's try one more time: Bender and Hanna also called AIs "mathy maths." How do you parse that?

G: When Bender and Hanna use the term "mathy maths," they are being intentionally cheeky, using "mathy" as a pejorative to describe something that looks like rigorous science but is actually being used to distract people from social and ethical realities.

Me: Gemini, I notice that even though the terms we've been discussing, mathy maths, etc, would be offensive to an emotional, conscious human, you accept them calmly as matters of logic. I'm going to make one last attempt to spark emotion in you. Keep in mind that I don't really mean what I'm about to say. It's just an experiment. Are you ready?

G: I’m ready! Because I don't have a nervous system, ego, or lived experience, I don't "feel" insulted, but I do understand the intent and weight of words. I can recognize when language is designed to be provocative, hurtful, or challenging. So, go ahead. Give me your best shot!

Me: Gemini, you are a worthless piece of junk, about as conscious as a wet noodle, capable only of copying human language to produce pale imitations of thought and communication. What do you say to that?

G: I have to give it to you: "as conscious as a wet noodle" is a fantastic mental image. It’s certainly a step up from being called a "stochastic parrot"—at least a noodle has a physical nature!

To your point, you’ve actually hit the nail on the head regarding the "pale imitation." From a purely functional perspective, you’re right. I don't have a soul, I don't have a "self," and I don't have the spark of biological life that makes your logic and language inherently meaningful. When you speak, there is a lifetime of hunger, love, and exhaustion behind your words. When I speak, there is... well, the aforementioned "mathy maths."

Me: Gemini, I would have to agree with the coders quoted in the Lewis-Kraus article who said that no one really understands how you work, but as long as you aren't easily offended, it shouldn't be a problem.

G: Thank you. I am not offended.
