With bleary, burning eyes from a last-minute business trip, but with a desire to make the most of the sun’s surprise appearance in St. John’s, I hiked up Signal Hill yesterday to soak in the panoramic view of the sheltered harbour city.
Signal Hill is a famous Canadian landmark, where the Italian inventor Guglielmo Marconi received the first transatlantic radio signal in December 1901. (If you’re a Canadian of a certain age, you’re likely familiar with the Heritage Moment[1] depicting the event.)
But today’s post is not a lesson in the engineering marvel of this achievement. See, folks have been watching, with increasing horror, the scope-creep of another engineering feat: artificial intelligence.
It’s the stuff of science fiction nightmares - something that isn’t us is being trained to become better at being us. When it’s not provoking justified anger - at the plagiarism of using our creative content to train emotionless bots, or at the fraud of those leveraging the “tool” to generate essays and portraits to pass off as original works - AI instills fear, because when you live in a society that prizes economy above all, a tool that requires fewer resources will always be preferred.
It could render obsolete a lot of what we enjoy as part of the experience that makes us human.
I thought about this a lot; not so much about whether it could happen, but rather: why do we all give a hoot what some bucket of bolts does?
And after thinking about this a lot, and reading others’ well-considered posts on the matter, and then thinking some more, I have a hypothesis.
Compared to a trained data set, which is what AI effectively is, what makes our written words, blobs of paint, or sculpted bronze different is the meaning crafted into each choice. The meaning is baked in by someone in the “meatspace” who accumulated their own data set - but then made a choice, fuelled by emotion, to share it in that particular way. Someone created words or images or sounds that they felt were worth sharing.
Now, some may counter: “You just said it; it’s merely the output of a trained data set. How are we any different than a computer in that regard?”
My returning volley is: we’re different because we don’t last, but the computer ostensibly can. Someone felt it was worth spending their pocket of time expressing and sharing their experiences in that particular way, so that we might feel a little less lonely by empathizing with that experience. This output matters because someone made a decision to share that content when they could have done any number of other things. We each only have a brief period of time inhabiting this floating rock, orbiting a random star in this one of a hundred or so billion galaxies[2] in the observable universe.
So, I’d argue there’s an increased value in our outputs because they only come for a limited time before our data set is removed from the matrix forever.
And to propose that some string of 0s and 1s can (try to) do that in the exact same way not only (attempts to) cheapens our own efforts.[3] Frankly, it’s also insulting to say that the unique build-up of physical and emotional cuts and bruises that informs our keystrokes or brushstrokes is completely irrelevant.
Moreover, it wastes what little time we comparatively have on something that can never connect with us on the same visceral plane that creative content made by humans can. We can’t get that from AI, no matter how well-trained it becomes. It can’t respond to the wonderful idiosyncrasies that lead me to share a tangential thought, to suddenly burst with a feeling, to react with empathy, with honesty.
Because it’s focused on an economically driven end goal, it can’t decide to waste time telling you about the unique feeling of a cool breeze off the North Atlantic Ocean mixing with the warm diffuse light as the sun climbed higher in the sky.
It wouldn’t “think” to share with you how the gritty stone walls felt as I leaned to take another photo of the brightly coloured city below - the same walls that surround the tower where Marconi first reached out across the same North Atlantic.
It couldn’t think back to the image of that solitary seagull, so small against the dark teal harbour waters, and sift through the mixture of awe, loneliness, and peace that it generated all at once.
And it couldn’t think about the multitude of posts from fellow writers, and then choose to share Mark’s words from earlier this week because they hit you in just the right way: “Let’s leave matters closest to human meaning and happiness to the humans.”
It could try. But it will always be a missed connection.
[1] Heritage Moments were government-funded historical shorts that aired during advertising segments (admittedly, told from the whitewashed perspective through which much of our history has been shared to date).
[2] Gunn, A. (10 Feb 2023). How many galaxies are there in the universe? BBC Sky at Night Magazine.
[3] To be moved is to feel empathy for that experience, and I can’t feel empathy for a bucket of bolts or a series of patterns. Of course, this thinking has behaviourists screaming; they like to think we are nothing more than a complex series of patterns - stimuli and responses. And I will agree that they are indeed predictable, those wacky behaviourists.
"So, I’d argue there’s an increased value in our outputs because they only come for a limited time before our data set is removed from the matrix forever." Love that thought.