in summary: technological progress is bootstrapped by prior technological progress -- hence the curve of development captured in moore's law. we stand on the shoulders of giants to create the next generation of giants. the contention is that we are approaching a state where the combined processing power of the hardware and software we create reaches human capacity. at that point, we will be able to use these human-level tools to create machine intelligence that exceeds that of humanity -- literally, superhuman intelligence. from then on, all bets are off. we cannot imagine, as humans, what a world with superhumans will be like. this is the singularity.
vinge and others who believe in different flavours of singularity (teilhard de chardin's omega point, various ideas of the human attainment of godhead, etc) think that it is fast approaching and inevitable. vinge's SALT lecture considered alternative scenarios in which the singularity does not take place, and he gave several pathways by which this could happen. the most compelling was the one governed by incentives. the exponential growth in processing power that moore's law describes is driven largely by economic and strategic incentives and limited by physical constraints. if those economic or strategic drivers were countered by sufficiently large disincentives, growth in processing power would taper off. one example of how this might happen: in one of the first truly large-scale deployments of automation, a major bug causes an enormous catastrophe (say, an n-way collision due to a fault in an air traffic control system).
several key thoughts emerging from this:
the singularity conceived as a purely technological event is disturbing. tools change and evolve on an accelerated timescale; tool generations are orders of magnitude shorter than human generations, so tools can (and do) change much faster than humans do. marshall mcluhan's conception of the extensions of man begins to break down in earnest around now, when tools are changing and growing in capacity at a phenomenal rate, far beyond the ability of most people to understand. bewilderment is the usual response, and one we see all the time. virtualization hides this problem in part without solving it: we conceal tremendously powerful tools behind simple user interfaces (cars, search, etc), but the tools that are part of our daily lives are no longer generally within our ability to understand. a good statistic to have: over the course of history, the percentage of tools that could be taken apart and put back together by the average individual. software is, of course, a tool that remains an impenetrable mystery to most people; computer hardware became almost impenetrably mysterious when we began to miniaturise transistors to the point where they became invisible. at the technological singularity, the extensions of man truly transcend man's comprehension and control. actually, this is what mcluhan was saying all along, for his meta-message is that we are slowly becoming controlled by our tools. i've never understood why mcluhan is generally thought of only as a communications theorist rather than as a sociologist of the interaction between technology and society.
as more and more of our species being is dictated by technology rather than genotype, the importance of mutation and recombination in evolution declines dramatically. the main driver of change in the species becomes technological rather than genetic. in the long term, it's unclear what will happen as a result, since this subverts the traditional mechanism by which the genetic health of a species is preserved and enhanced. technology is progressively making previously non-viable phenotypes viable, and thus keeping them in circulation, so to speak -- developing medication for cystic fibrosis dooms hundreds, thousands, millions of unborn children to cystic fibrosis. without a countervailing force (gene therapy), we can expect to see vastly more of these genetic diseases in circulation as the centuries wear on.
the idea of long-term thinking also brings up a number of questions and thoughts:
we feel now, clearly, that we are on the cusp of great change. did it always feel like this? the renaissance must have had a similar sense of ferment.
playing a game repeatedly optimizes performance. large-scale trials are games repeated within the same timeframe, and so are models. is modeling capacity a substitute for experience?
interdisciplinarity yields orthogonal robustness.
my favourite quote from the night, and one whose theme i've explored before:
you don't understand the destructive impulse of people who like to break things which are beautiful.