LLM's precise certainty

One of my clients today put the record of our five years of conversations into an LLM to produce a summary. It's one of the most advanced LLMs at the moment. With LLMs and AI in general, "at the moment" is the best we can say, because the one he is using is considered the most advanced today; a different one held that title a month ago.

I felt a twinge of apprehension before hearing the results. Had I said something I no longer agree with, or even find cringeworthy? When I heard the results, I thought the page or two summarizing five years of advising was good and helpful, if a bit cold. Cold can be useful in delivering pure information. I'll argue the Ten Commandments are cold. If that were enough, the world would have changed the day Moses brought them down from Mount Sinai. The work is in turning information into action. My perception is the world still needs some work. We can't agree on what a simple sentence like "You shall not kill" means.

Cold.

LLM outputs are token strings that advanced statistical probability suggests are the most likely response to the token strings you delivered as input. I have followed this development since 1976 and the first Kurzweil machine, which was built to optically scan printed words and match them with their spoken sounds, and current LLMs dazzle me.

Yet leave me cold.

Cold. The early Kurzweil machines had a more or less binary task: a token, "cold," for example, was matched to another token, a sound an English-speaking human would recognize as the spoken word "cold."

Modern LLMs have an exponentially more complicated task. First, if it's processing English, English isn't a language like Latin or Putonghua. While those languages have loan words from other languages -- most native Chinese speakers in Shanghai will know what you mean when you say "bye-bye" -- they have a core foundation of grammar, syntax, and spelling. English, by contrast, is a messy collection of dialects with grammar, syntax, and spelling borrowed from Latin, Greek, Anglo-Saxon, and French, to start: bureau, chest, compartment. Add the complexity of euphemism, a Greek word with the last syllable knocked off, which means indirect words often used in place of very direct Anglo-Saxon four-letter words.

Cold, a simple Anglo-Saxon word.

A simple word for temperature that gives almost no insight into what number you may read on a thermometer. In Vermont this spring, temperatures in the 50-degree Fahrenheit range call for shorts and t-shirts. In LA, that same temperature range calls for puffer vests.

Of course, you know when I referred to the LLM summary of my advice as cold, there was no reference to a quality that could be registered on a thermometer.

So often our words in any language are approximations or suggestions of something deep in our human experience. As I wrote last week, we use the word "feel" as an approximation or suggestion of knowing something we can't quite explain, but those knowings seem to pass from human to human despite the imprecision.

In "The Forgotten Dialect of the Heart," poet Jack Gilbert explores this imprecision.

How astonishing it is that language can almost mean, and frightening that it does not quite. Love, we say, God, we say, Rome and Michiko, we write, and the words get it wrong. We say bread and it means according to which nation. French has no word for home, and we have no word for strict pleasure.

I encourage you to find it. The word "heartbreak" is never used.

What we feel most has no name but amber, archers, cinnamon, horses, and birds.

Warm regards,

Francis Sopper
