10 Questions with SIAI President Michael Vassar

June 10 2009 / by Alvis Brigis / In association with Future Blogger.net

[Photo: Michael Vassar]

Recently appointed Singularity Institute for Artificial Intelligence (SIAI) President Michael Vassar, a hardcore proponent of science and reason, emphasizes the importance of "human rationality" when discussing the future, making clear that SIAI is an "analytical think tank and research organization, not an advocacy group." Vassar says he's apprehensive about "a possible decrease in the quality of debate as the [Singularity] becomes more mainstream" and that he would find a public backlash against intelligent debate of the Singularity "odd."

Enjoy the candid and insightful interview.

FB: What are your main near-term goals at SIAI?

MV:
  • Put on a 2009 Summit and establish a regular schedule of Summits on alternating coasts, with a consistent format.
  • Develop a body of technical and popular position papers and analysis that reflect our current views.
  • Develop software to help interested people explore the forecasting consequences of a range of assumptions about the future.
  • Organize, probably with the Future of Humanity Institute, an essay contest to identify novel global catastrophic risks deserving of more serious analysis and to draw attention to the idea of rational treatment of catastrophic possibilities.
  • Reinvent Enlightenment values by building a better forum than currently exists for rational deliberation and cooperative analysis and decision making.
  • Most critically, as always, identify and train potential Friendly AI researchers.

FB: Has the organization undergone any significant strategic or tactical shifts since you assumed the presidency?

MV: Our efforts to develop a rigorous theory of Friendly Artificial Intelligence will continue, but our public outreach efforts will focus less narrowly on AI, addressing the Singularity more generally and promoting human rationality.

FB: How do you go about increasing awareness about Strong AI and the Singularity on a regular basis?

MV: The Summits, academic papers, interviews, our media director Michael Anissimov's blog, Bloggingheads discussions by Eliezer Yudkowsky, and personal contact with interested parties.

FB: What are the most common obstacles that you encounter?

MV: Our largest single obstacle is probably fatalism. People today feel very powerless and don't expect anything they do to affect the larger world. Even the nuclear disarmament movement has largely lost the public's interest. In the 90s, the US and Russia were on pretty good terms, but the associated opportunity for disarmament was squandered. Nuclear war remains a very real and very concrete threat to civilization. AI is even more dangerous but is abstract by comparison. Motivating people to take action to avoid UnFriendly AI is thus that much harder.

FB: What are the most common misconceptions related to the Singularity?

MV: That it is all about AI. In reality, any technology that creates positive feedback loops via intelligence increasing intelligence will change everything fairly quickly unless there are adequate corresponding negative feedback loops. Another misconception is that the Singularity requires a long-term acceleration in the rate of change. I personally think that scientific progress has been slowing for generations, but many technologies or institutions could change that very suddenly.
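To make the feedback-loop point concrete, here is a minimal toy model (purely illustrative, not anything from SIAI; the gain and damping parameters are invented for this sketch) in which an intelligence level feeds back into its own growth. Without a negative feedback term the loop runs away; with one, growth settles at a ceiling.

```python
# Toy model of "intelligence increasing intelligence" (illustrative only;
# gain and damping are made-up parameters, not SIAI's model of anything).

def simulate(steps=50, gain=0.1, damping=0.0, start=1.0):
    """Iterate I <- I + gain*I - damping*I**2 and return the final value."""
    i = start
    for _ in range(steps):
        i += gain * i - damping * i * i
    return i

# Pure positive feedback: growth compounds and runs away (~117.4 here).
print(f"no damping:   {simulate(damping=0.0):.1f}")
# Adequate negative feedback: growth settles near gain/damping = 5.0.
print(f"with damping: {simulate(damping=0.02):.1f}")
```

The qualitative outcome depends entirely on whether the negative feedback loops are adequate, which is exactly the condition Vassar names.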

The most important way in which people are confused about the future is not specific to the Singularity. People assume that there exists a coherent, well-developed common-sense view of the probable long-term future. The real common sense when speaking of one's own future is to assume that in 40 years you will be 40 years older and it will still be 2009. When speaking of the world's future, common sense is to take a single trend from the last 20 years and extrapolate it linearly, throwing in one random cliche taken from science fiction movies. When deciding what to do (nominally) about the future, common sense is to imitate the nominally future-oriented behavior of those whom you see as similar to yourself but slightly more successful with the opposite sex.

FB: What's your personal take on the Singularity?

MV: Americans are taught that everyone is entitled to their own opinion. If my take on global warming is that carbon dioxide traps heat near the Earth, and your take is that it doesn't, we each have the right to think what we want to. Legally, that's true, but ethically, my personal take is that people are not doing what's right when they have a 'personal take' on a matter of fact.

I think that certain rules for reasoning are fairly generally accepted in American culture, at least in theory.  I think that following those rules reliably leads people to agree fairly closely with one another regarding facts and that doing so produces more agreement regarding values than most people expect.  In particular, I think that following the standard rules of reasoning reliably leads people to the conclusion that if civilization survives, a fairly hard take-off Singularity is more likely than not within the 21st century.  When following the standard rules of reasoning, people also agree that in the absence of extremely careful planning a hard take-off Singularity will eliminate everything that humans would consider to be valuable.

Unfortunately, it is extremely uncommon for people to follow the generally accepted rules of reasoning many steps to a surprising conclusion and then to take that conclusion seriously. It is much more typical for people to fudge some step in the supposed reasoning process in order to choose a conclusion. They can then say whatever they expect to be most rewarded for saying.  Still more unfortunately, this is not usually considered to count as lying.  

FB: How do blockbuster movies like Terminator and indie films such as Transcendent Man affect public perception of the Singularity?

MV: They insert invalid connotations into most discourse about their subjects. This makes it difficult to discuss related subjects intelligently with most people.

On the other hand, blockbuster movies make the subject less credible but significantly less uncool. Indie films make it more credible and slightly less uncool. Both make it easier to discuss related subjects unintelligently. Physicists resent the movie "What the Bleep Do We Know", but such movies do slightly increase the status of physicists.

FB: What's your degree of involvement with the new Singularity University?  Any thoughts on the new endeavor?

MV: I am thrilled with the amount of interest in the Singularity that the Singularity University's success implies, especially given the economic crisis. I know most of the people involved with the Singularity University, but my personal involvement has been limited to occasional discussions with them.

FB: How does SIAI's definition of the Singularity line up with the one offered by Ray Kurzweil?

MV: SIAI's definition focuses on positive feedback loops between intelligence and processes that could increase intelligence. Kurzweil emphasizes trend extrapolation, Moore's Law, and quantitative performance improvements in information technologies. Vernor Vinge's definition, the origin of the term, emphasized the incomprehensibility and irresistible power that qualitatively superhuman intelligence implies.

FB: What do you see as the ongoing role of SIAI as Singularity awareness and debate of the topic increases?  How will the organization react to the inevitable popular backlash?

MV: We are an analytical think tank and research organization, not an advocacy group or corporate interest group. We exist to promote intelligent debate of the topic. We advocate reason, not a particular policy. Backlash against debate would be... odd...  A possible decrease in the quality of debate as the topic becomes more mainstream is a concern though.

FB: What should I tell my non-scientific, non-futurist friends about the Singularity?

MV: If you can stay alive for another 30-60 years you probably won't die from old age. The main risk of death for the currently young comes from failures of civilization such as nuclear war.

FB: Any word on when the next Singularity Summit will take place and who you've got lined up as speakers?  We can't wait to be there again.

MV: The speaker lineup will be announced in a week or two, but I can tell you that the Summit will be held on October 3rd and 4th in Manhattan.
