yesterday i watched a wonderful series of yale lectures on the old testament by christine hayes on academic earth (http://academicearth.org). last night i plowed through another great section of the black swan, which criticizes the application of the gaussian function. i have been thinking a lot about kant, and bcm and i had a good debate this afternoon - so, consider this a sunday of sweeping references and generalizations.
the summary goes something like this - i am increasingly concerned that scaling technology, access, power, and greater inter-connection is taking us further and further away from a lot of the traditional structures that make western civilization generally work. if the gaussian function (the bell curve) doesn't actually hold in social models when things accelerate - if technology, access, and power are bringing us closer and closer to facing down a chaotic world - then we are going to have to rapidly adapt society to survive.
one can generalize the vast majority of moral reasoning/social theory as either ends-based or means-based: teleology vs. deontology. i do things either because i forecast forward what i must do now to cause a positive future outcome i desire, or because i feel duty-bound to do something regardless of outcome.
i personally have always found teleological arguments (which might very loosely be thought of as correlated with utilitarianism) to be far more appealing, and i would argue that a lot of our society deploys teleological constructs to keep itself functioning. you should eat your brussels sprouts, go to work, not litter, pay your taxes - even if you don't want to - because the ultimate personal outcome is positive.
the problem is that teleology is breaking down:
1. the nature of the problems we face as individual human actors in an ever wider society is growing beyond the computable bounds of pure teleological reasoning (this isn't new, but it is getting truly extreme). the projects where my intervention is theoretically needed generally have impact beyond my lifetime, and there is less and less i can do where there is a direct correlation between action and result (computers and other systems are taking over the simple, model-able processes).
2. in our exponential world, if we throw out gaussian curves and are forced to confront a structure dominated by out-of-band randomness, then we cannot take meaningful action towards teleological ends. people can't even be sure they are working meaningfully towards ends they actually desire.
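to make the gaussian point concrete, here is a minimal sketch (illustrative parameters, not fitted to anything real) comparing how often a "six sigma" event shows up under a bell curve vs. under a fat-tailed pareto distribution of the kind taleb points at:

```python
# a minimal sketch of why gaussian assumptions understate extreme events.
# compares the frequency of a "6-sigma" outcome under a normal model
# vs. a fat-tailed (pareto) model. alpha = 1.5 is an illustrative
# choice, not fitted to any real data.
import random

random.seed(42)
N = 1_000_000
THRESHOLD = 6.0  # a "six sigma" event, in units of standard deviation

# gaussian samples: mean 0, standard deviation 1
gauss_extremes = sum(1 for _ in range(N) if abs(random.gauss(0, 1)) > THRESHOLD)

# fat-tailed samples: pareto with alpha = 1.5 (infinite variance),
# symmetrized so it can swing negative like the gaussian
def fat_tailed():
    x = random.paretovariate(1.5)  # draws values >= 1
    return x if random.random() < 0.5 else -x

fat_extremes = sum(1 for _ in range(N) if abs(fat_tailed()) > THRESHOLD)

print(f"gaussian  |x| > {THRESHOLD}: {gauss_extremes} of {N}")
print(f"fat tails |x| > {THRESHOLD}: {fat_extremes} of {N}")
```

under the gaussian, a 6-sigma event is roughly a two-in-a-billion shot, so a million draws produce essentially none; under the pareto, tens of thousands of draws clear the bar. the "impossible" outlier isn't an edge case - it dominates the outcome, which is exactly why forecasting forward from the past stops working.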
this might mean that the only hope of maintaining long-term growth and a balanced society is some sort of worldwide deontological revolution and/or some sort of universal worldwide cathedral project (think: star trek)... that, or a very powerful worldwide regime in which we manufacture localized personal teleological outcomes for people.
none of this thinking is new, i just feel it is newly practically relevant/meaningful. without a structure where humans can apply valuable labor for defined outcomes, things become a bit crazy...
1. i know what i want/what makes me happy
2. i can directly impact the outcome
3. i can directly invest against my goals
4. there is highly limited variability/risk in the outcome (or i can easily hedge the risk)
5. i get to enjoy the fruit of my effort.
if the above fails to be satisfied, things either get very strange, or someone/something steps into the power vacuum to make the equation work locally.
so, while i am personally very very positive on the impact of technology on a ten year horizon, i think that the 100 year implications of a faster more interconnected global society get to be rather complicated... so, in closing - what is really killing the equation is:
1. lack of clear cause and effect / the obscuring of linkage in a 'black swan' driven world
2. problems are bigger than our lifetimes
3. technology is disconnecting us from direct results/influence