October 06 2008 / by Alvis Brigis / In association with Future Blogger.net
Built on a faulty definition of intelligence, the Singularity meme is an informal fallacy with limited utility that constricts our view of the future if we rely on it too heavily. As we continue to refine our collective model of a rapidly accelerating future dominated by convergence, we should look to more comprehensive scientific models to take its place.
Let me start off by saying that Ray Kurzweil’s The Age of Spiritual Machines is one of the most important books I have ever read. It ably makes the case for accelerating change and a resulting Singularity, so I highly recommend it to those interested in exploring the possible futures ahead of us.
Similarly, Vernor Vinge’s 1993 paper The Coming Technological Singularity, which argues that the appearance of superhuman intelligence could mark the end of the human era and create unimaginable conditions, and I. J. Good’s statement on ultraintelligent machines are must-reads for anyone interested in the future.
Each definition contains valuable nuggets about how the future may unfold. Yet I have come to believe all three are fundamentally flawed because they rely on the vague term “intelligence.”
Intelligence Remains Undefined: There is no objective, comprehensive, scientifically valid description of the term. Though it’s easy to believe we understand what intelligence is and how it works, we humans have not yet achieved consensus on an overarching definition nor its constituent properties. There are many theories, but an objective law has yet to emerge.
According to an APA report titled Intelligence: Knowns and Unknowns, “when two dozen prominent theorists were recently asked to define intelligence, they gave two dozen somewhat different definitions.”
The Wikipedia definition reflects this vagueness:
Intelligence (also called intellect) is an umbrella term used to describe a property of the mind that encompasses many related abilities, such as the capacities to reason, to plan, to solve problems, to think abstractly, to comprehend ideas, to use language, and to learn. There are several ways to define intelligence. In some cases, intelligence may include traits such as creativity, personality, character, knowledge, or wisdom. However, most psychologists prefer not to include these traits in the definition of intelligence.
At the same time, the bulk of the AI theorists working to create Strong AI/AGI that matches or exceeds human intelligence are either 1) applying a very narrow definition of intelligence that equates one human brain or personality to a discrete unit of intelligence, or 2) building logical or neural processes step-by-step and refraining from venturing a concrete definition.
Definitions of the Singularity Rely on Vague Definitions of Intelligence that Don’t Hold Up: Singularity proponents and detractors alike go about making their arguments without questioning the underlying assumption that human intelligence is composed of discrete units. By and large, they either overtly or tacitly equate intelligence to the functions of an individual brain or system. This is not surprising considering how the brain likes to simplify subject and object so that we can go about living our lives. But that fundamental assumption appears to be wrong, and at the very least is far from verifiable.
Recent research by cognitive historian James Flynn, who incidentally discovered the fascinating Flynn Effect, suggests that intelligence may well be non-static and cannot effectively be defined without placing the subject in environmental context.
Similarly, general observations about social cognition, wisdom of the crowds, an emerging global brain and global body, and machine augmented intelligence reinforce the argument that intelligence is an elusive network property that is very difficult to quantify, much less effectively incorporate into theories about our future.
A clear trend toward permeating, system-wide intelligence growth appears to be revealing itself. If this proves to be the rule and it replaces the commonly held belief that neuron clusters (brains) alone constitute intelligence, then a Singularity will be possible only if the intelligence we create, facilitate, or discover suddenly overtakes the total body of intelligence inherent in the entire system. The likelihood of a Singularity as currently defined is thus tremendously diminished.
And even if it does come down to a single human brain or small network, I find Future Blogger Will’s argument very compelling: “As our ability to understand the technological processes that could lead to a singularity increase, the point in time regarded as being [The Singularity] onset must be pushed further off into the future.” Depending on the extent to which we co-evolve with our technology, this postponement could go on for quite a while.
But whichever way you slice it, until we learn to accurately measure pockets of intelligence and determine their relationship to the broader system, it will remain impossible to empirically define a Singularity. This realization greatly reduces the usefulness of the term in making future projections, especially considering how it tends to steal the spotlight from multi-variable future(s) in favor of a more singular, pardon the pun, vision.
So where then does that leave us?
Objective Topsight: Convergent Accelerating Change, Systems Theory, Information, Knowledge, Intelligence and Related Sciences: Though the Singularity may look more and more like the ultimate red herring, this doesn’t mean that things aren’t changing at a steadily accelerating rate. Therefore it’s incumbent upon us to quickly get better at identifying the myriad possible futures ahead of us by developing the skills, tools and knowledge base to do so.
We would do well to place greater emphasis on advancing comprehensive Evo Devo systems research, utilizing social media to more quickly generate wisdom, incorporating holistic thinking into the quest to create AI, continuing to expand our definitions of intelligence, life, and humanity, pushing forward related sciences, and ultimately developing an overarching Info/Knowledge/Intel theory that jibes with the rest of our knowledge base.
As the Singularity meme continues to heat up and the world reacts accordingly (check out Kevin Keck’s piece on the topic), it’s important that we frame the debate logically and scientifically. We’re presented with a great opportunity to advance the puck by fostering productive dialogue, developing new theories, and amplifying our coping abilities – a goal being advanced by important forward-looking organizations like The Singularity Institute for Artificial Intelligence and The Acceleration Studies Foundation, both of which support powerful communities of rational thinkers.
Conclusion: If objective topsight is the ultimate destination, then it’s likely the inherently subjective notion of a Singularity will become less useful over the coming years. Still, it’s a scenario set that’s diffusing rapidly and can quickly open up minds to the reality of multi-faceted accelerating change.