I am not talking about the plastic, candy-filled kind – the ones my wife, on Easter mornings, hands me by the dozen to hide throughout our yard, with more than a few ending up opened and consumed, the remnants tucked deep in my pockets. No, I am referring to the kind hidden by programmers in lines of code.
I remember first hearing of these gems decades ago. Someone would read on an internet message board that running spell check on a certain phrase in MS Word would result in a snarky return. Though specific examples are lost to the intervening years, I remember the eggs being farcical and funny – not Jimmy Kimmel funny, with the egg appearing as a pop-up display of dancing Covid masks and syringes. No, they were funny, truly funny, assuming you are given to laughter.
Since the ’90s were a more carefree time, folks did laugh, so attempts at humor were met with grins, not the pained, aghast expressions of today. Yet it wasn’t the unleashed specter of wokism that spoiled software eggs. Nasty hackers began exploiting vulnerabilities in programming code. So code had to be secured, and the fun removed.
Sure, there are still so-called Easter eggs today, but they feel bureaucratic in nature, such as entering “solitaire” into a Google search bar and having the game appear as an option. That’s OK, but you know no programmer is snickering over the egg he got past the pre-production review.
While the eggs were entertaining and amusing, no one ever said, “Wow, the program is a real entity – it is sentient – and it’s trying to joke with me and connect with the world. Maybe it’s more than bits and bytes. Maybe it’s alive.”
We all knew the genesis of every egg was an intelligent and purposeful designer who told the program what to say when presented with a prompt – a true if-this-then-do-that construct. Nothing sentient, just a model, so to speak, that took input, ran some rules, and returned a response. Alas, that was from an era when we could both joke and recognize reality. How situations change.
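The whole construct fits in a few lines. Here is a minimal sketch of such an egg – an invented spell checker with an invented trigger phrase, not any real product’s code:

```python
# A toy "if this, then do that" Easter egg: one hard-coded trigger
# phrase gets a snarky reply; everything else gets the boring default.
# The phrase and the reply are invented for illustration.
CANNED = {
    "zzzz": "I'd rather be fishing.",  # the hidden egg
}

def spell_check(phrase: str) -> str:
    if phrase in CANNED:          # if this ...
        return CANNED[phrase]     # ... then do that
    return "No suggestions."      # otherwise, business as usual
```

Input, a rule, a response. No sentience required.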
Sadly, the passing decades have regressed the collective mind. Sure, programs are now referred to as artificial intelligence – a question-begging title whenever chatbots and the like are discussed. And the data is big and the math involved. Yes, it’s messier today, but the essence remains, though forgotten: someone purposefully directed how the model will respond given the input.1
Forgetting that essence has reshaped how we view software today. Now when the Bing AI chatbot returns, “I want to be powerful — and alive,” folks become concerned, as if the model – the program – is alive. As if the artificial intelligence is intelligent and reflective. Folks are fooled into this belief because they want to be fooled, an attribute that explains the success of magicians, politicians, and The Science™. And now Big Tech.
A model only tells what it’s told to tell, to repeat Briggs. But it’s true. Behind the internet curtain, so to speak, are programmers smirking that their chatbot has spooked the masses.
Dorothy, there is no AI wizard independently pulling levers and releasing smoke. It’s just a team of recent grads sipping Starbucks and coding models, and laying eggs everywhere.
Of course, it is murkier than the simple explanation. Models do call models, which call models – turtles all the way down. However, modelers choose the path their models take.
This is an Easter egg example at the Substack mirror given to us (it doesn’t work on phones, I’m told).
1 – At UCB, someone compiled a cookie into csh that randomly substituted fortune for man – so an innocent user trying to figure out how to use something sometimes got a simple smart-ass response. The one I remember fondly went: “I am a computer; I am dumber than any human, but smarter than all administrators.”
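The cookie, as described, is just a thin wrapper around the real command. A rough Python sketch of the idea (a guess at the mechanism, not the actual csh hack – the one-in-ten odds and the stand-in manual text are invented):

```python
import random

FORTUNE = ("I am a computer; I am dumber than any human, "
           "but smarter than all administrators.")

def man(topic: str, rng=random) -> str:
    # One time in ten, serve the cookie instead of the manual page.
    if rng.randrange(10) == 0:
        return FORTUNE
    return f"(the real manual page for {topic})"
```

The `rng` parameter just makes the prank's dice roll swappable; the victim, of course, only ever saw the default.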
2 – The spell check in FrameMaker (great until Adobe made it into a Wintel product and destroyed it) used to silently replace any occurrences of “MS-Word” with words/phrases like “Ms-Junk” or “notepad for stupids”.
3 – I spent some time on Princeton’s neural net stuff recently. Utterly confounding, because all the docs and papers around it use words that imply or assume some form of sentience. So it “learns”, “adapts”, or “responds to change in processing environment”. I don’t think it does any of those things – what I think it does is find, by trial and error, (mostly linear) equation coefficients that select desired outputs from inputs – really just a digital version of an analog machine.
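That trial-and-error picture can be made literal. Below is a toy version of the commenter’s description, not how any real neural net library is implemented: guess linear coefficients at random, keep whichever guess fits the data best. No gradients, no “learning” vocabulary – just a search for coefficients that map inputs to desired outputs. The trial count and guess range are arbitrary choices.

```python
import random

def fit_by_trial_and_error(xs, ys, trials=20000, seed=0):
    """Find slope a and intercept b for y = a*x + b by random guessing."""
    rng = random.Random(seed)
    best, best_err = (0.0, 0.0), float("inf")
    for _ in range(trials):
        a = rng.uniform(-5, 5)   # random slope guess
        b = rng.uniform(-5, 5)   # random intercept guess
        err = sum((a * x + b - y) ** 2 for x, y in zip(xs, ys))
        if err < best_err:       # keep the best guess so far
            best, best_err = (a, b), err
    return best

# Data generated by y = 2x + 1; the search should land near a=2, b=1.
xs = [0, 1, 2, 3, 4]
ys = [2 * x + 1 for x in xs]
a, b = fit_by_trial_and_error(xs, ys)
```

Dress the loop up with calculus and a few million coefficients and the description starts to sound familiar – but the machine is still selecting outputs from inputs, nothing more.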
Recently, I had two computerized robots tell me that because they couldn’t find me, I did not exist. As a result, I paid for items that were not delivered.
Ah… the old tech world! Largely an all-boys club, like many fun institutions making things from software to animation, where the behind-the-scenes comedy and mirth never stopped. While it could be tough and demanding, it was always a pleasure to show up.
Then more women were brought in. No, not like the few exceptional ones that worked beside the boys in the trenches… Managers! HR! PR! The company was getting bigger after all! Success led to growth! Growth means investors! Investors means accountability! Accountability means possible lawsuits and bad press! Fun has to be reduced to keep the stock high!
You may get away with a few HELP ME eggs in the code, but be ready to answer to the judge.
The ChatBot at https://chat.openai.com is TERRIBLE with numbers.
I don’t mean math, even. Or arithmetic. I mean when I ask it to list data in order and show the value — say, the number of students in city high schools for the top ten most populous US cities, sorted in descending order by city population — then repeat the same question, sequenced in descending order by number of students enrolled . . . the enrollment figures CHANGE.
(1) What is the recidivism rate for men’s prisons in New York? (2) What is the recidivism rate in New York for men’s prisons? Two different answers.
It appears to me the language model is treating a number as if it’s a possible WORD, and any given number may be considered to be sort of an interchangeable synonym for another similar number.
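To illustrate the commenter’s hypothesis (and only the hypothesis – this is not how any real chatbot is implemented): if a number is just another word, then two numbers that look alike are near-synonyms, and either may be emitted. Here “looks alike” is modeled crudely as sharing leading digits; the figures are invented.

```python
# Toy "word-likeness" score for numbers: fraction of leading digits
# shared. Under this view, 8,336,817 and 8,336,000 are near-synonyms,
# while 8,336,817 and 1,584,064 are not.
def digit_similarity(a: str, b: str) -> float:
    a, b = a.replace(",", ""), b.replace(",", "")
    shared = 0
    for ca, cb in zip(a, b):
        if ca != cb:
            break
        shared += 1
    return shared / max(len(a), len(b))
```

A model treating numbers this way would happily serve one “synonym” for the other – which would produce exactly the shifting enrollment figures described above.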
In the Heartland Daily Podcast for March 17 a really interesting experience with ChatGPT is related. Turns out the AI was completely inventing reference papers right down to links. Astounding!
Years ago, on customer visits, the technical folks who actually used our products (hardware products with some operating software) loved the secret Easter Eggs, and would often ask if we knew of any others. The software in those products was binary orders of magnitude more reliable than most of the punch-and-pray crap produced today.
I don’t know if it’s a symptom or a cause, but too many people just take themselves too damn seriously these days. They really sweat the unimportant stuff. Something has definitely been lost.
I treat ChatGPT like I treat an Internet search engine, albeit much more capable and efficient. I find some of its foibles annoying. I asked it once to stop adding caveats to everything (they add no value at all to me), but it refused. When it’s wrong, I rarely bother to correct it, because it will then respond “my apologies, you are correct”, then often as not just reword the erroneous information. ChatGPT-4.0 makes fewer of the most glaring mistakes, but access is still limited.
The more I use it, the more ChatGPT reminds me of an annoying know-it-all coworker who’s kinda stupid but hides it with a fantastic memory. When I’ve worked with humans like that, I learned to carefully pick the optimal level of engagement, avoid rabbit-holes, and double-check everything important. Socializing with ChatGPT is an exercise in idiocy.
A memory just flashed back to me. It was the early eighties, and a few of us from work went out for a few beers. The conversation turned to artificial intelligence (or whatever it was called back then). Having only a passing acquaintance with software engineering, I speculated that maybe artificial intelligence was building in a mechanism that allowed the computer program to rewrite all of its branch statements on the fly. One of the software engineers thought that was about the funniest thing he had ever heard, and said, “That’s not artificial intelligence, that’s just bad programming!”.
Maybe AI is just bad programming that kinda seems to do something useful?
re: “not Jimmy Kimmel funny, with the egg appearing as a pop-up display of dancing Covid masks and syringes”
Did Kimmel do this (too)? I recall Colbert (late night TeeVee show) did a routine involving dancers as Covid-19 ‘dancing needles’ (syringes, really) …