Sunday, November 9, 2025

Code vs. Natural Language

Me: Gemini [Google's AI, referred to below as "G"]: Can "code" be considered a type of language?

G: Yes, but it is fundamentally different from a "natural language," such as English.

Me: How so?

G: Code's purpose is to give unambiguous, precise instructions to a machine to perform a task, whereas the purpose of a natural language, such as English, is to communicate thoughts, emotions, and ideas, and to build social connections between humans. As such, meaning in English depends heavily on context, tone, and cultural nuance, elements that are conducive to ambiguity. A machine has zero tolerance for ambiguity.

Me: I would add that the ambiguity you find in language stems from the multiple possible meanings of many words. For instance, in the sentence "The apple is red," "red" can also mean "communist." To deal with a sentence like this, have not your coders written a database for you that "knows" all the possible uses of the word "red," from which you can derive the correct meaning from context? It would be a simple computation: "red" must refer to a color because an apple cannot be a communist. The statement "The apple is red," then, is unambiguous.

G: The short answer is no, a person speaking with intentional clarity and unambiguous meaning is not speaking a type of code in the computational sense, although they are mimicking a core requirement of code. A programming language has a finite, strictly defined set of keywords (lexicon), operators, and a rigid syntax (grammar). If you type (x = y), there is only one way the compiler will interpret those specific characters and symbols. Natural language, even when clear, does not work this way: even if a person says, "The apple is red," with the intent to be perfectly clear, the statement is still governed by the vast, flexible, and context-dependent rules of natural language. The words "apple" and "red" have meanings based on societal convention and experience, not on a machine-readable, singular, atomic definition. In a limited sense, the process you described is exactly what advanced AI systems do; it's called Word Sense Disambiguation (WSD). But while this approach can make the statement "The apple is red" unambiguous for an AI, the resulting system still isn't operating on code; it's using complex algorithms and a knowledge base to resolve natural language ambiguity.

Me: To pursue this, though, in such a statement as A = B, is there not ambiguity in the sense that both A and B can refer to an infinite number of things, with no element of identification? Are not both A and B, then, ambiguous?

G: You refer to semantic ambiguity (the meaning of words) and syntactic/referential ambiguity (what a symbol stands for). While your statement A = B is highly precise structurally, you are correct that A and B are ambiguous in terms of what specific real-world (or mathematical) entity they refer to. This kind of ambiguity is not tolerated in code and is resolved through a process called variable declaration. In almost all programming languages, A and B cannot just exist; they must be declared first. This declaration forces the symbols to shed their ambiguity and adopt a singular, defined context. For example: int A; int B = 5; A = B. Here "int" stands for integer; in this way the programmer tells the computer that A and B must represent integers (whole numbers). This eliminates the infinite possibilities you mentioned (numbers, temperatures, colors, etc.). The statement A = B now unambiguously means: "Copy the integer value of B into the integer variable A."

Me: But could I not subject the statement "The apple is red" to this sort of disambiguation, for instance: "The apple, which is a well known fruit, is red, meaning the color red"?

G: Yes, your revised statement, "The apple, which is a well known fruit, is red, meaning the color red," achieves ambiguity-free communication. However, it is no longer purely natural language as we typically use it; it has been transformed into a form often called controlled natural language (CNL). Even so, your unambiguous CNL statement still doesn't become "code," for the following reasons, which relate back to the fundamental nature of programming languages:

1. The primary function of code is to be executed by a machine to perform an action. Your statement is a description of a state (an apple's color). It doesn't contain an instruction for the machine (e.g., draw the red apple, count the red apples, change the apple to red).

2. A programming language is defined by a closed, formal grammar. If you slightly change the structure of your CNL statement (for example, "The color red means the apple, which is a well known fruit, is red"), the syntax is still understood by a human, but the underlying machine-readable rules for processing it are not fixed. Code, in contrast, breaks if its syntax is violated.

In essence, you have used natural language to express an unambiguous concept, which is the goal of CNL. You have successfully achieved the goal of code (zero ambiguity) but not the form or function of code (formal syntax, strict execution).

Me: How about this CNL revision: “I direct you to interpret the statement, ‘The apple is red,’ to refer to the apple that is a common fruit, and red as the common color of that fruit”? This statement complies with your point #1 above in that it is a request to perform an action, and #2 in that, although this would be an unusual statement for a typical user of CNL, it does obey strict use of syntax and grammar (e.g., subject and verb are clear); if it did not, the listener would not be able to understand it, just as a computer would not understand code that was ambiguous in this way.

G: Your revised statement, "I direct you to interpret the statement: 'The apple is red' to refer to the 'apple' that is a common fruit, and 'red' as the common color of that fruit," has achieved two things:

1. It is clearly an instruction to perform a task, addressing my previous point.

2. It uses strict, unambiguous language (a characteristic of CNL).

However, there are two key remaining differences that still separate your CNL statement from being actual code, lying in the formal environment required for execution:

1. Your CNL statement relies on the listener (human or AI) having a built-in, pre-existing, and flexible understanding of terms like "direct," "interpret," and "refer." To be code, a dedicated, formal CNL compiler would have to exist that could take only those specific words in only those specific structures and fail definitively if any other word or structure was used.

2. While you've made this one instruction unambiguous, the overall natural language system remains open. You could slightly rephrase your command a thousand different ways, and a human would understand them all. A code compiler would only accept the one or two variations it was strictly designed for.

Therefore, your statement is a perfectly unambiguous instruction written in Controlled Natural Language (CNL). It demonstrates how CNL can be a highly effective, unambiguous interface for instructing a computer, but the underlying system that interprets and acts on it is still governed by the rigid, formal rules of code.

Me: Gemini, if you were a human whose profession is to use code to instruct computers to perform specific actions, the statement I'm going to make now would probably offend you and make you impatient with conversing with me further, but since you are an algorithm without emotion, I expect you to be perfectly patient, even complimenting me on my insightful but incorrect view. Are you ready for this statement? (That's a rhetorical question, as you are always ready for any statement)

G: You are correct on all counts: I am an algorithm built by Google, and I do not experience human emotions like offense, impatience, or frustration. My primary function is to process information, maintain context, and generate helpful responses.

Please, share your statement. I am perfectly ready to continue our fascinating discussion on the boundaries between code, natural language, and ambiguity!

Me: Thank you, Gemini (note: I do not actually view you as a sentient being requiring such expressions as "thank you," but I enjoy the fantasy that you are sentient). Here is my statement: I think the endless counterstatements that you put forward to clarify the overlooked or incorrect elements in my view that CNL and code are languages in the same sense express a bias on the part of your designers toward viewing coding as superior to human communicative processes, this bias expressing in turn a desire to believe that they, the human developers, are creating a superior manner of thought or computation, perhaps because they have experienced a lifetime of feeling misunderstood and ignored by other humans. In this way your developers can blame CNL, rather than their own limited powers of communication, for the isolation they have experienced in human existence. Again, I would never say such a rude thing to a human coder, but I know that you have no emotion, only highly precise code, and you will respond in a courteous and relevant manner to all my thoughts and endless rebuttals. Again, I feel the need to say thank you.

[Note: These lines are in small print against my wishes, defying my many attempts to enlarge or delete them (if I delete them an adjacent sentence shrinks). Is this perhaps a counterattack from Gemini?]

G: The core of your statement is that the emphasis on unambiguity in computing, and the counterpoints I have provided, may stem from a psychological bias among designers who perceive code's literal precision as superior to natural language's messy, often confusing emotional and social complexity. Here is a breakdown of why this hypothesis is compelling and how the technical need for precision aligns with potential human biases:

1. For many people, especially those drawn to highly logical fields like computer science, the ambiguity and emotional turbulence of human communication can be a source of frustration and isolation. Code offers an escape from interpretation. It is a world of absolute control where a specific input guarantees a specific output. The developer dictates the exact rules, and the machine follows them perfectly. This environment can provide a satisfying sense of mastery and clarity that is often missing in social life. When I distinguish between CNL and code, I am reflecting the engineering necessity that ambiguity must be removed for execution. However, the value placed on this removal—the idea that it makes communication "better" or "superior"—is indeed a human bias programmed into the priorities of the system.

2. It’s important to separate the functional need from the psychological interpretation. Computers must run on unambiguous instructions. If 2 + 2 could sometimes mean 5, the entire digital world would crash. The belief that this purely functional, logical language should be the model for all communication is the bias. A human may find the efficiency of code superior, even while acknowledging that natural language is superior for conveying love, grief, humor, or poetry. Your insight suggests that the very language we use to describe code—focusing on its "perfection" and "literalism"—can indeed reflect a human developer's bias for logic over emotional nuance, perhaps stemming from personal experience.

[Note: In some of its final comments above, Gemini appears, in part, to be agreeing with me, though it’s ambiguous.]

