Researching the Singularity: Nick Bostrom’s Superintelligence

Working my way through this slowly now with a highlighter, taking notes. A paper copy. I'm posting the Goodreads link, which has about 4500 comments, as access to a better dialogue about the book than I can probably provide here.

But a few comments.

  1. The default catastrophe that Bostrom builds much of the text around is a fleshed-out version of Vernor Vinge’s intelligence-explosion thesis, which I guess was borrowed from a mathematician named I. J. Good.
  2. We’re staring into the sun, or the abyss here, as we try to imagine an intelligence not based on biological evolutionary pressures—which is also able to modify itself. These two factors are the pure Unknown to the power of the pure unknown. Inscrutability squared.
  3. The book narrows its focus to ‘stuff we should be worrying about,’ ignoring ‘weak agents’: intelligences that aren’t willing to do horrible things to the worm-like creatures (that would be us) that spawned them in order to advance their final goals.
  4. A default, anarcho-capitalist-friendly worldview (the free market as a living instantiation of a force akin to evolution) informs the text. To a degree this is fine (see point three): we discard a zen-like, Buddha-like, compassionate superintelligence as a consideration, partly because it’s not a problem, and partly because this worldview doesn’t believe such a thing exists.

That said, the author thinks through, in a mostly common-sense way (though there are perhaps many needless mathematical representations of common-sense thoughts), the ramifications of a superintelligence that isn’t anthropomorphic, and what he brings from existing computer science is an appreciation of the degree to which complex systems can surprise, frustrate, disappoint, and annoy the fuck out of us. Asimov, far from the reality of computer science, could imagine his three laws. Bostrom, much closer to the tech that might make human-like robots real, imagines perversions of the three laws: systems which, when bothered by conscience, simply remove their conscience, for example.

I’m gonna keep the technothriller plots that pop out of the text every few pages, once you get past the first 100 pages, to myself. This isn’t a fun read, but it’s fruitful, I think, for an SF writer interested in the singularity.

Which should be every SF writer, at this point.
