That’s how much it’s taking to power the composition of this column. The LLM – Large Language Model – being used is in my own head. I think I’m safe in saying that no amount of Artificial Intelligence – AI – could write what I’m going to write sitting here at this very instant.
Let’s take the above paragraph, for example. I used the word “composition” in the first sentence. I could have said “construction” or “the writing of,” or I could have gone way out there and used the word “creation.” I could have said “writing this column” rather than “sitting here writing at this very instant.”
But then I would have lost the little scene-setting in what I wrote – describing myself as sitting, and putting you in the action with “at this very instant.”
Those were choices I made that comprise what you’d call my style, I guess. I don’t think of myself as having a “style” of writing, but people have told me that I write conversationally, so I guess I’ll take that, because when I sit down to write, I think of it as talking to another person. That the other person happens to be a reader is what makes writing what it is: the transmission of ideas by arranging letters into words and sentences and putting them down on paper or, these days, on the screen of a cellphone or computer. It’s the same thing, in the end. Writing uses language to convey information, description, emotions, anger, delight, confusion – you name it, writing does it.
See that word “language” in there? I was curious about why artificial intelligence was described as coming from, or amounting to, or even being a “large language model” until I realized that it is really just a system that has gathered information in the form of language and organized it in such a way that it can be regurgitated as the second word in AI, intelligence.
It’s not intelligence, however. It’s other people’s words, lifted from the documents and information channels where they first appeared and repositioned into a new form by an algorithm that uses previously gathered, human-produced information to reach a desired output or outcome. Looked at another way, it’s a fancy search engine that doesn’t just take you to the source documents where information resides but “scrapes” – an AI word that perfectly describes the process – information from those sources and puts it together for you. In other words, it’s a second step in search: it arranges information as well as presents it.
I’ll give you a good example: One of the big AI systems, OpenAI’s ChatGPT, could probably have produced some version of what I have written in this column so far, but it couldn’t have come up with the idea for this column.
I spend most of the hours of every day doing that part of the job – coming up with the subject of the column I’m going to write. I don’t even really think about what goes into that process, but I do know what doesn’t. I don’t write what I think people will want to read, because I don’t know the answer to that question.
Which raises a possibility: If I asked an AI system in the middle of the afternoon, about the time I usually come up with the idea for my column, to write a column for me, what would it do? From what I know of AI, it would scrape my previous columns for patterns of subjects, probably coming up with “politics” or something similar. Then it would scrape my Substack data to see what went over best in previous columns by measuring numbers of views, and comments, and replies to comments. To the extent that AI could, I think it would put a number value on comments to measure what was popular and what wasn’t. Then AI might have enough data to produce a subject for a column and write it.
But I can guarantee this: No AI system would have put a headline on a column on the subject of artificial intelligence using the words “grilled cheese sandwich.”
The truly extraordinary difference between ChatGPT and yours truly is in the fact that the AI companies I’ve already mentioned, along with Meta and Amazon, are currently building facilities around the country so they can do some version of what I have just described. Have a look at what it is taking OpenAI to build just one portion of what it calls its flagship Stargate AI data center in Abilene, Texas. The company has leveled 1,200 acres of land on which the complex will sit when it’s finished. There will be eight buildings, each holding 72 racks of servers containing a total of 60,000 Nvidia GB200 chips. The eight buildings, when they’re finished, will consume about a gigawatt of electricity. That is more electricity than is consumed by the city of El Paso, with a population of about 680,000 people. The Abilene AI data center will be powered by five – that’s right, five – new gas-fired power plants that will be built all around Abilene and will burn natural gas from fracking wells in the Permian Basin of Texas.
That’s just one of the OpenAI Stargate projects. The company will be building two more similar facilities in Texas, one in New Mexico, one in Ohio, and another in a Midwest location that hasn’t been announced. OpenAI CEO Sam Altman announced last month that the entire Stargate project will cost $500 billion and will consume 10 gigawatts of power. That’s enough to power New York City – the entire city of 8 million people that sprawls over 475 square miles. The think tank RAND estimates that AI data center demand will be around 70 gigawatts of electricity by 2027, and 327 gigawatts by 2030.
As you can imagine, new demands on the electrical grid by AI data centers are sending electricity costs ever upward. Nationally, electric utility prices are up six percent. Prices near AI data centers have increased 15 percent. A typical AI data center uses as much electricity as a city of 100,000 residents. Even Donald Trump’s Department of Energy estimates that by 2028, AI data centers will consume 12 percent of the electrical power generated in this country. This at a time when Trump is closing every renewable energy project that has been touched by federal funding, including wind, solar, and geothermal.
There are things artificial intelligence can do way, way better than human beings. AI systems can crunch through zillions of numbers to solve problems conventional computing never could – looking for variations in genes that cause cancer, say, and tracking them down to single genes in single types of cells – and do it far faster than such problems have ever been solved before.
But while we’re marveling at the genius of AI, we should remember that NASA built the rockets and landers and everything else that got us to the moon using slide rules held in human hands and measuring tapes. The calculations necessary to build the St. Louis Arch from both bases at once, so the two legs would meet in the middle within a tiny fraction of an inch, hundreds of feet above the ground, were done with slide rules. Thomas Pynchon wrote “Gravity’s Rainbow” on a typewriter. James Joyce wrote “Ulysses” in longhand on large sheets of paper because of his failing eyesight. Neither wrote his masterpiece in a building that covered a thousand acres of ground and used enough electricity to power an entire city.
Tracy just brought me my nightly cup of coffee made with an electric coffee maker, and I boil the water for my iced tea with an electric kettle, and we use a gas stove to cook our grilled cheese sandwiches, and a refrigerator kept the cheese cold, and I’m writing this on a computer that is plugged into an electrical outlet in the wall. So, power from the electrical grid was used in the production of this column.
Google just told me that there are 100 trillion synapses in a human brain. That’s a piece of information that I didn’t find using my own 100 trillion, because it wasn’t there. But now it is in my brain…at least temporarily…and it’s written down in this column. Later tonight, you’ll be able to Google my byline and Substack, and you’ll find a link to this column, so the information about the human synapses is up there in a data cloud somewhere, and from the descriptions I read about the way AI works, it will be “scraped” and added to the banks of Nvidia chips in Abilene or in Meta’s new data center in Richland Parish, Louisiana – only 92 acres were necessary for Meta’s air-conditioned and water-cooled banks of server synapses.
It looks, from the expenditure of hundreds of billions of dollars and the consumption of dozens of gigawatts of electrical power, as if artificial intelligence will be with us for a while. I find myself asking Google’s AI function questions that yield answers that lead me in turn to sources of information that I use to write my column. Tracy reminded me this afternoon that the capability of the computer I write on every day, similar to the ones I’ve written on for several decades, now exists in the palm of my hand. It is possible today to access the product of all those AI data centers using a cellphone. It could be that within the same span of time I have walked the earth, all those acres of data centers will be torn down, and artificial intelligence will come directly from a hand-held device that will make our phones as obsolete as a slide rule.
Here’s hoping that we won’t lose wisdom at the same time machines gain processing power, because the whole thing, from earth movers leveling ground to steel beams and aluminum siding to steel racks and servers and computer chips and the electricity to power it all comes from one place: our human brains, backed up by whatever of our souls is left after we have delivered ourselves upon this earth.