I am not talking about the plastic, candy-filled kind – the ones my wife, on Easter mornings, hands me by the dozen to hide throughout our yard, with more than a few ending up opened and consumed, the remnants tucked deep in my pockets. No, I am referring to the kind hidden by programmers in lines of code.
I remember first hearing of these gems decades ago. Someone would read on an internet message board that running spell check on a certain phrase in MS Word would result in a snarky return. Though specific examples are lost to the intervening years, I remember the eggs being farcical and funny – not Jimmy Kimmel funny, with the egg appearing as a pop-up display of dancing Covid masks and syringes. No, they were funny, truly funny, assuming you are given to laughter.
Since the ’90s were a more carefree time, folks did laugh, so attempts at humor were met with grins, not the pained, aghast expressions of today. Yet it wasn’t the unleashed specter of wokism that spoiled software eggs. Nasty hackers began exploiting vulnerabilities in programming code. So code had to be secured, and the fun removed.
Sure, there are still so-called Easter eggs today, but they feel bureaucratic in nature, such as entering “solitaire” into a Google search bar and having the game appear as an option. That’s OK, but you know no programmer is snickering over the egg he got past the pre-production review.
While the eggs were entertaining and amusing, no one ever said, “Wow, the program is a real entity – it is sentient – and it’s trying to joke with me and connect with the world. Maybe it’s more than bits and bytes. Maybe it’s alive.”
We all knew the genesis of every egg was an intelligent and purposeful designer who told the program what to say when presented with a prompt – a true if-this-then-do-that construct. Nothing sentient, just a model, so to speak, that took input, ran some rules, and returned a response. Alas, that was from an era when we could both joke and recognize reality. How situations change.
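The classic egg really was nothing more than that if-this-then-do-that rule. Here is a minimal sketch of the idea – the trigger phrase and the snarky reply are invented for illustration, not taken from any real product:

```python
# A toy Easter egg: a plain if-this-then-that rule, nothing sentient.
# The trigger phrase and canned reply below are invented for illustration.

def spell_check(text: str) -> str:
    """Return a hidden 'snarky' reply for one secret phrase;
    otherwise behave like an ordinary (stubbed) spell checker."""
    if text.strip().lower() == "zzzz":          # the hidden trigger
        return "I can't check spelling while you're asleep."
    return "No spelling errors found."          # the everyday path

print(spell_check("zzzz"))   # the egg fires
print(spell_check("hello"))  # normal behavior
```

However playful the output, the program is only matching input against a rule a programmer wrote and emitting the response that same programmer chose.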
Sadly, the passing decades have regressed the collective mind. Sure, programs are now referred to as artificial intelligence – a question-begging title whenever chatbots and the like are discussed. And the data is big and the math is involved. Yes, it’s messier today, but the essence remains, though forgotten: someone purposefully directed how the model will respond given the input.1
That forgotten essence has defined how we view software today. Now when the Bing AI chatbot returns, “I want to be powerful — and alive,” folks become concerned, as if the model – the program – is alive. As if the artificial intelligence is intelligent and reflective. Folks are fooled into this belief because they want to be fooled, an attribute that explains the success of magicians, politicians, and The Science™. And now big tech.
A model only tells what it’s told to tell – to repeat Briggs. But it’s true. Behind the internet curtain, so to speak, are programmers smirking that their chatbot has spooked the masses.
Dorothy, there is no AI wizard independently pulling levers and releasing smoke. It’s just a team of recent grads sipping Starbucks and coding models, and laying eggs everywhere.
Of course, it is murkier than the simple explanation. Models do call models, which call models – turtles all the way down. However, modelers choose the path their models take.
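That chain of models calling models can be pictured as functions handing output down a line whose order the modeler fixed in advance. The three toy “models” below are invented stand-ins, not anyone’s real system:

```python
# Three toy "models" chained together. Each is deterministic, and the
# calling order a -> b -> c was chosen by the programmer, not the models.

def model_a(x: float) -> float:
    return 2 * x        # scale the input

def model_b(x: float) -> float:
    return x + 1        # shift the result

def model_c(x: float) -> float:
    return x ** 2       # square it

def pipeline(x: float) -> float:
    # Turtles all the way down, but the path is hard-coded.
    return model_c(model_b(model_a(x)))

print(pipeline(3))  # (2*3 + 1)**2 = 49
```

However many layers deep the nesting goes, each turtle sits exactly where a modeler placed it.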