People often ask me for my definition of the technological Singularity.
More specifically, they want me to offer some predictions as to what it will actually look like and what it might mean to them and the human species.
More often than not they don't like my answer, and it's probably because I re-frame the discussion and take the conversation elsewhere.
What people are really asking me to do is predict the outcome of the Singularity. And because I don't, they get frustrated with me.
But that's the problem. That's the whole point of this 'thing' we call the Singularity.
As has been noted elsewhere, virtually everyone has their own definition of the Singularity and it's become a very polluted term, one that's been stripped of all meaning.
So, before I tell you my own 'definition' of the Singularity, let me first tell you what it's not.
It's not any particular outcome or prognostication.
It's not any kind of definable event or transformational process.
Nor is it a term that can be used to describe a futuristic state of existence or the nature of advanced artificial intelligence.
But it's often used to describe these very things -- as if the term can be used as a synonym for what are essentially predictions. When people talk about the Singularity they can't help but inject their own anticipated outcome -- be it positive or negative.
I can be guilty of this at times. But to avoid getting myself into too much futurological trouble, I tend to refer to things as being in a post-Singularity state. That's my clever way of avoiding any in-depth discussion of how we'll actually get there.
Alright, so what's the technological Singularity?
Simply put, it's an unanswered question.
Vernor Vinge used the term Singularity for a very good reason. It's an event horizon in the truest sense.
But instead of the event horizon created by a black hole's gravitational pull, it's a social event horizon caused by our inability to extrapolate the trajectory of human civilization beyond a certain point of technological sophistication.
The Singularity, therefore, describes a futurological problem -- a blind spot in our predictive thinking.
That's it. There's no more to it than that.
Anything beyond this strict and limited definition is a discussion of something else -- an attempt to solve the conundrum and make predictions about 1) the actual characteristics and manifestation of the Singularity and 2) its aftermath.
So, if I say that the Singularity will involve a hard takeoff of SAI (artificial superintelligence), I'm actually presenting a hypothesis that attempts to retire the term 'Singularity' and see it replaced by the term, uh, 'SAI hard takeoff event' (we'll clearly have to come up with something better).
Or, if I say it will be a decades long process that sees humanity transition into a postbiological condition, I am likewise trying to put the term to rest.
Why does our predictive modeling break down?
Two reasons: 1) accelerating change, and 2) the theoretical potential for the rise of recursively self-modifying artificial superintelligence.
Essentially, because disruptive change will be coming so fast and furiously, humanity's future remains largely unpredictable; there are too many variables and wildcards. And the rise of SAI, given its potential to be thousands upon thousands of times more powerful than the human mind, is simply beyond our prognosticative sensibilities.
Sure, we can make wild-ass guesses. And one or two of them may actually turn out to be correct. But we won't know for sure until we get there.
Or at least until we get really close.
Consequently, the Singularity is a relative term.
People of the future won't use the word. That's a term reserved for us in our ignorance.
But as we get closer to the Singularity we will in all likelihood gain an increased appreciation of what will happen at the point when machine intelligence exceeds the capacity of humans.
And keeps on going.
At that point, once the fog that is the Singularity begins to lift, we will cease to call it the Singularity and replace it with a more descriptive term.
So, as we journey forward, what was once concealed over the horizon will finally be revealed.
In the meantime, just remember to frame the Singularity as a social event horizon, particularly as it pertains to accelerating change and the seemingly imminent rise of SAI.
Thank you for this. It's something I've struggled to put into words for others without much success. There's no "there" there, as they say.
I actually think we can look to our recent technological past for an analogy. Futuristic predictions of the 21st century abounded in the mid-to-late 20th century, and they almost always landed far from the mark, imbued as they were with extrapolations of the technologies and cultural mindsets of the time the predictions were made. So lots of flying cars and video phones, but almost nothing about social media.
So suppose we had a generic term in the 1950s that described the vague point in the future when communication technologies would become utterly democratized, along with the unpredictable social and political consequences of that process -- let's say mid-century futurists called it "The Pulsar." They still would not have predicted the rise of blogs and texting and Twitter and YouTube and citizen journalism and the netroots -- that's what we would now know "The Pulsar" was referring to, but at the time, it was just this word that described the point beyond which we couldn't accurately say anything useful about communications technologies and human uses of them.
The singularity is kind of like that... times a thousand.
I guess I'm even more conservative in the use of the word. I wouldn't go so far as to associate the Singularity with AI, though I recognize it as a possible (and probable) association. The simple idea is that as change accelerates, and linear extrapolations become ever worse, our capacity for prediction becomes worse too, leading us to suspect an asymptote somewhere in the near future.
I'm not sure if such an asymptote will ever be reached: exponentials do not have asymptotes, and change may not be supra-exponential; in that case, we will just speed ahead with increasing vertigo, and a slowdown may eventually be caused by our own biological limitations.
On the other hand, finite-size effects may interfere, which can be interpreted as a "phase transition": a violent change, but not necessarily a runaway growth of any current trend; in that case, unpredictability would come to a stop, but the outcome could very well be a profound spiritual transformation and, say, a return to rural life. I wouldn't like that, but heck, who knows?
Singularity means for me just the point in the future where the fog gets too dense for our eyes, the curtain beyond which any SF sounds naive. It is different from the unpredictability of "the 90's seen from the 50's" (even if you multiply it by 1000 or a billion), in that now we conjecture the existence of a climax, a breaking point.
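To make the asymptote point concrete (the notation below is my own illustration, not the commenter's): a pure exponential grows without bound but never diverges at any finite time, whereas supra-exponential growth can hit a genuine vertical asymptote. Compare

\[ x(t) = x_0 e^{rt}, \quad \text{finite for every finite } t, \]

with the hyperbolic case

\[ \frac{dx}{dt} = k x^2 \;\Longrightarrow\; x(t) = \frac{x_0}{1 - k x_0 t}, \]

which blows up at \( t^* = 1/(k x_0) \): a true mathematical singularity in finite time. So an "asymptote somewhere in the near future" only follows if change is supra-exponential, which is exactly the commenter's caveat.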
Brilliant, thank you.
George, very well said.
I would agree with dileffante that the "asymptote will never be reached" and that "exponentials do not have asymptotes".
My suggestion would be to graph it using logistic growth, which is composed of two terms: one term reflects growth (exponential) and the other limits it (resources). The graph would therefore be an almost exponential curve up to some point, after which it would asymptotically approach "the singularity" -- the point where the Universe (the visible Universe!) is "filled up" with intelligence.
The growth curve would be purely exponential given unlimited resources. In that case there is no real singularity (mathematically speaking) in terms of growth -- only limitless accelerating growth.
Can we have a discussion on this topic at one of the TTA meetings?
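As a rough sketch of the logistic model described above (the symbols are my own labels, not the commenter's: r for the growth rate, K for the resource ceiling, N for the quantity that's growing):

\[ \frac{dN}{dt} = r N \left(1 - \frac{N}{K}\right), \qquad N(t) = \frac{K}{1 + \frac{K - N_0}{N_0} e^{-rt}}. \]

While N is far below K the limiting term is negligible and growth looks purely exponential; as N approaches K the limiting term dominates and N(t) flattens toward the horizontal asymptote K -- the "filled up" visible Universe in the commenter's framing. Note that this yields an asymptote in value, not a finite-time blow-up.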
Throughout past eras, there has been a blanket term synonymous with the Singularity: Apocalypse (or, for more extreme believers, Rapture). The Messianic flavor of the Singularity comes directly from those ancestors. The rest (the term as gestalt rather than as specific predictions, etc.) is par for the course for most kinds of prophecy.
ReplyDelete"But instead of a cosmological event horizon caused by a black hole's gravitational pull,"
ReplyDelete- I don't think you mean cosmological event horizon here. A black hole event horizon and a cosmological event horizon are distinct concepts in physics. The former occurs at a certain distance away from the centre of a black hole, the latter is caused by the expansion of the universe.
Thanks for this article. I'm only familiar with the singularity on a basic level, so I don't have as broad a perspective on it as you do. I plan to peruse your blog in more depth when I have more time. I did want to note, however, that your Twitter link in the sidebar is broken; you left out the ".com." Just a heads up!