Scott Aaronson has misunderstood continental philosophy

It was first with delight, and then with a growing feeling of sadness, that I read Luke Muehlhauser’s interview with the computer scientist Scott Aaronson at the Machine Intelligence Research Institute. As a computer scientist, Aaronson has contributed much to our understanding of complexity theory and other areas, and he has even written popular science books on the field. I am happy to read that he seems to feel strongly about the links between computer science and philosophy, and I agree with him about a lot of things. Certainly, computer science and philosophy are fields that cross-fertilise each other a great deal, and my feeling is that this process is only getting started; much more can be done. Perhaps this mating of the two fields is even severely lagging behind what today’s world needs. Without a doubt, the study of formal models of rewriting and interpretation is extremely interesting and sheds light on questions about the nature of language, complexity, knowledge, understanding, communication, equipment, and the abilities of the human mind.

But then, just as I am about to call Aaronson one of my intellectual heroes, he stumbles:

By far the most important disease, I’d say, is the obsession with interpreting and reinterpreting the old masters, rather than moving beyond them.

And then he stumbles severely:

One final note: none of the positive or hopeful things that I said about philosophy apply to the postmodern or Continental kinds. As far as I can tell, the latter aren’t really “philosophy” at all, but more like pretentious brands of performance art that fancy themselves politically subversive, even as they cultivate deliberate obscurity and draw mostly on the insights of Hitler and Stalin apologists. I suspect I won’t ruffle too many feathers here at MIRI by saying this.

The unfortunate continental-analytic pseudo-divide

Who are the “Hitler and Stalin apologists”? I hope that this embarrassing epithet is not meant to refer to Nietzsche and Marx, for example, since even a very casual reader of Nietzsche will quickly discover that he despised nationalism and anti-Semitism; rather, his thinking was twisted and selectively misused by Nazi ideologists. It is true that thinkers like Heidegger and Foucault for a time supported Nazism and the Khomeini revolution, respectively, and there are other examples of controversial association. But using this as an excuse not to read these thinkers, let alone to dismiss continental thinking wholesale, is very superficial.

A comment of this kind is not normally worth a serious reply, and it seems Aaronson threw it out a bit carelessly, expecting an audience with views similar to his own. But since it comes from someone who is clearly very intelligent and who clearly wants to bridge philosophy and computer science (which I also want to do), I felt that I should counter the position I imagine he is coming from. In doing so I will not be responding to Aaronson’s interview as a whole, which is for the most part an excellent read, full of interesting viewpoints. Instead, I will focus on these two unfortunate remarks only, and on the misguided viewpoint that I believe generated them.

The artificial 20th-century split between “continental” (French, German, etc.) philosophy and “analytic” (mostly Anglo-Saxon) philosophy is extremely unfortunate, and one hopes that it can be bridged one day. Aaronson exemplifies a general theme: he is Anglo-Saxon, a scientist and logician, has a limited interest in the humanities, and is thoroughly modern in that he has lost sight of the unitary origin of the scattered, fragmented array of academic fields and disciplines that we have today. The writers he likes are great ones, but they stand on one side of the continental-analytic divide only. He is doing great work, but he could potentially be doing so much more.

On solvent abuse

I believe that I understand Aaronson’s intellectual background to some degree. I studied for my undergraduate degree at Imperial College London, which, like MIT, is a place full of very technically oriented people. For me that was a great education in many ways, but it would not be an overstatement to say that very little attention was (and probably still is) given to the humanities there. This was by design: a certain deep but ultimately restricted kind of vision was cultivated. The pure rationalist perspective functions exactly like bleach: it disinfects, killing harmful bacteria, but applied too liberally it kills healthy tissue too. It also removes colour. For reasons unclear to me – perhaps partly as a reaction – my interest in the humanities flickered to life during my final year there, and it intensified when I began my graduate studies here in Tokyo. I became very interested in the viewpoints that philosophy could offer me, and especially in continental writers. It is as someone who has made a difficult migration from a very restrictive logical/scientific viewpoint to a more inclusive one that I write these comments. My hope is that Aaronson will also make this leap and expand the range of his work to include the truly useful – if his funding agencies will let him, that is.

The unified root of knowledge

As Aaronson says, Einstein, Bohr, Gödel and Turing held views outside of the scientific fields they are remembered for. It even seems that they may have been so successful in part because of their breadth. Blaise Pascal is remembered in some circles as a mathematician, but we could equally well call him a philosopher who did some mathematics on the side. Francis Bacon thought not only scientifically but also meta-scientifically, imagining the limits of science and of scientific method. The Pythagoreans approached mathematics not as something to be contemplated as a formal exercise at a desk with pen and paper, but as part of something esoteric and mystical. In ancient Greece, education emphasised an integrated, well-balanced body and mind, and training in a wide range of theoretical and practical fields was important for one’s stature. The Greeks preferred this kind of multiplicity, and would have been horrified at the suggestion that the focus on specialties and separate disciplines that we have today is somehow better. But today we have thoroughly rejected the idea that all knowledge and understanding are connected and stem from a single source.

In the first of the two remarks that I singled out above, Aaronson complains that academic philosophy continually falls into the “hermeneutic trap” of reinterpreting the same passages by dead writers again and again. What is decisive here, as in so many things, is the attitude with which one carries out the interpretation. If the exercise is carried out for the sake of getting a grade at a modern university, passing a class, or winning academic promotion, then the result can be nothing but junk: artificial, forced thinking and writing, and a bad reputation for the activity as a whole. The attitude that gives this activity its true value is grounded in a desire to return to the origin – the root of the tree, which many applied scientists mistake their own branch for – and then to use the insights found there to bring society forward. The suggestion that this activity has no value is ridiculous. Would Aaronson also say that we don’t need to study history, that we should let every generation invent society anew? Maybe he’d recommend burning books older than fifty years? I am very far from making a blanket endorsement of conservatism, but I would certainly endorse a selective conservatism that critiques the past in order to learn from its experience and create a better future. Only the earnest interpretation of old texts can renew our connection with the origin of our thinking. (This is not to say that what goes on in humanities departments today is such an earnest interpretation, but that discussion belongs elsewhere.)

“Utility” and what is truly useful

Many of us moderns are obsessed with a particular notion of utility, which comes to dictate what is worth doing. Everybody understands that it is easy to fund computer science, because it leads to applications – commercial, scientific or military – that can immediately be exchanged for money. (It is through luck that academics doing good work of true value are sometimes able to dress up their work as “useful” to the markets and funders. If this didn’t happen, institutional thinking would be even more diseased and withered than it already is.) It is difficult to fund a study of hermeneutics or existentialism, because the markets don’t care and consumers are not interested. But just as democracies struggle to make long-term decisions, preferring those that please voters today, what is “useful” from computer science in the short term – for fighting battles in Afghanistan, say, or for making a new iPad – is not necessarily what is needed in the long term: the furthering and evolution of culture; new, inspiring and vital visions for society and the future; spiritual height. The suggestion by Clark Glymour that Aaronson refers to (but thankfully doesn’t endorse), that philosophy departments should be defunded unless they contribute something applicable to other disciplines, might be the single worst idea I have ever encountered.

Poetry, prose and contradiction; style as a conduit of meaning

Heidegger’s Being and Time is a very difficult text to read. Is it, to use Aaronson’s words, a pretentious brand of performance art? Is the difficulty there only for the sake of being difficult? To put it another way, is the difficulty accidental and contrived or is it essential?

Accidental difficulty should obviously be removed as much as possible from any work, so that it can be made more accessible. “As simple as possible, but no simpler”. But I contend that the difficulty in this and other, similar texts is an essential one. There is no simpler way of phrasing the argument. The arguments in mathematics, and to a large extent in computer science, can be phrased in a formal calculus and can be expressed with (apparent) elegance and simplicity. But philosophy would be severely limited if reduced to a formal calculus. The arguments made by Heidegger, for example, are in some way deeply bound up with language itself. In order to receive his teaching, it is necessary to feel and engage with his words and his phrasing. To reduce the arguments to simpler but apparently similar sentences would be to remove some of their essence. In other words, when reading this kind of work we should not insist on trying to separate “form” and “content”. This is to some degree true for all continental philosophers I’ve read, but especially clear with Heidegger – as anyone who has seriously tried to get through Being and Time would probably agree. There is also no doubt that this kind of writing sheds light on something. And who would dispute that illumination of the world and our conditions of existence is one essential aim of philosophy?

If one finds it difficult to read texts that do not present strictly logical arguments, but communicate meaning in other ways, then the only way around this difficulty would be to invest time, effort and patience into the reading process, just as one does when trying to understand or formulate a mathematical proof.

Reaching towards the extralogical

A typical reaction from someone who has spent too much time exclusively with “analytic” thinking and then encounters a continental thinker would be something like: “This makes no sense. I do not understand what facts are being stated or what propositions are being proven. The writer is even contradicting himself. How can anyone take this seriously?”

In order to move beyond this kind of hasty judgment, it is necessary to step outside the realm of the mathematical. The following points may serve to indicate where this realm lies (here it is very much the case that it lies just before our eyes — actually, in our eyes, in our nerves, in our very being — and we do not see it).

1. There are things that cannot be expressed in logic but are worth studying. The way that we approach ethics and “utility” is for the most part extralogical. One’s identity and sense of direction in life are extralogical. A logical system is not worth much without axioms or applications, i.e. without bridges into and out of it. Art is one of the most important sources of such bridges. Insisting on a fundamental separation of the artistic and the useful/valuable, in the way that Aaronson seems to do, is ridiculous.

2. Mathematics and even computer science depend vitally on artistic elements, however contrived, personal and inexpressible they might be, to receive their salience, their sense of height and gravity.

3. Do politics, world history, human society and biology move according to the rules of logic? Dubious. Should these things be enslaved to logic in an ideal world? Highly dubious!

4. Poetry can express meaning that cannot be captured in logical arguments. Poetry can circumscribe and indicate. Contradiction is one particular poetic element, and as such it can carry meaning. This is one reason why self-contradiction is not in itself an argument against a philosophical text.

5. Attitude, grasping, understanding, and vision that gives a particular kind of access to the world — these are complementary to and as important as facts that can be expressed as propositions. Questioning, having the ability to persist in uncertainty, is sometimes more valuable than definite propositions about something.

Conclusion

Computer science is now a rapidly growing scientific and cultural force, and computer scientists must be critical of their roots, their style of thinking, and their methods, to avoid making serious mistakes. Computer scientists should reach deeply into the humanities, just as the humanities should reach into computer science. One hopes that the Machine Intelligence Research Institute understands that machinery (and logic) is not an infinite space that encompasses everything intelligible. It is necessary to understand the boundaries of that space in order to work inside it and build good bridges to its exterior.

Having said all this, I feel somewhat guilty for having singled out Aaronson as a representative of a larger group of technologists who thumb their noses at the humanities (French and German humanities in particular). He is far from the worst in this category. My only excuse is that the sense of wasted potential is especially great here – it would be sad if Aaronson went through the rest of his career never reaching into continental thinking. I would recommend that he read Nietzsche’s writings on appearance, masks, becoming and truth, and then reflect on complexity in that light; read Heidegger’s writings on being, in order to get a new idea of what meaning is, and reflect on artificial intelligence in that light; and read Foucault’s writings on power, visibility and control, and reflect on the overall social role of computers in that light. As a bridge between mathematical and continental thinking, I recommend Manuel DeLanda, whose books truly touch both of the “continents”.

Thinking does not stop where logic ends, if indeed it has begun at that point.

 

Japan’s imitation of the West

Memes often travel between neighbouring countries and cultures like genetic material travels between bacteria in a colony. The imitation of one culture by another is rarely pure copying, though, but usually a kind of creative act: a selection, curation, editing, emphasising, painting over. But the distance between some cultures is greater than between others. Edward Seidensticker’s wonderful book Tokyo: From Edo to Showa brought home to me how hard Japan worked as a nation to become Western during the 20th century: to adopt Western values, ways of thinking and attitudes, though not indiscriminately. Underappreciated labour, perhaps. Western societies should feel flattered by this thorough imitation. And if we find fault with Japan, we should perhaps also look at home, at our own societies, and ask whether the root of the problem does not lie in something that we exported. It may indeed be that the problems we point out most readily in Japan are the ones that remind us of our own problems in some way. And today, when the sustainability and long-term vision of Western societies are increasingly in question, one hopes, for Japan’s sake, that the imitation did not go too far.

We should also look closely at China, which seems to be evolving into a different kind of hybrid of Western and Chinese thinking, perhaps a less wholehearted imitation. Quantitatively at least, for example in terms of speed, the change in China today appears to match anything Japan has gone through. But qualitatively it may be different, and what direction China will ultimately take must today be anybody’s guess.

Concert review: free jazz at Nanahari, Sep 19

The performers: Kevin McHugh from the US on piano, Hugues Vincent from France on cello, as well as an Australian clarinet player and, from Japan, cello and flute players and a drummer.

The venue: Nanahari – “seven needles”, 七針 – a small basement in Hachobori, east Tokyo, in an authentically Showa-era building. We are partly transported into another time – the bubble era, echoes of it.

The audience: 10-15 people. Each of us brings something there: hopes and fears, personal histories, wine.

The context: up north, Fukushima seems to be having problems as deep as ever. Governments and companies around the world are embroiled in surveillance scandals and financial troubles that seem to pile higher year by year. The economic outlook in many places, certainly in Japan, is highly uncertain. The Japanese population is ageing. For foreigners whose occupation is something unorthodox like music, getting a visa to stay in Japan is not trivial. In the Middle East, conflicts are raging with no end in sight.

And yet. Here and now, these musicians from four countries manage to synthesise something that could have been done nowhere else and at no other time. The format is free jazz, one of the least restricted forms of music. McHugh and Vincent probe their instruments – piano and cello – deeply. McHugh dissects the piano and begins adjusting and interfering with its strings and other innards while playing, as he is wont to do. Vincent displays an intimacy and energy with his cello that is almost frightening. One fears what he might be capable of. Through the unorthodox playing, their instruments receive lobotomies, massages and savagery that allow them to produce soundscapes one must suspect they were never designed for. Yet in the hands of capable musicians like these, the result is plausible, amazing and profoundly unique.

The evening goes on for a few hours; various combinations of the present musicians (drawn by lottery) improvise together. The result is sometimes theatre, sometimes pure aesthetics, sometimes metaphysical. There are confrontations and compromises. Something is unconcealed; veils are lifted off. Intensely true and genuine narratives blend to form new and unique stories. The evening’s performances are one justification, one redemption of the mad state of the world today. And on some level, perhaps proof that the madness is not complete, that something healthy and vital is still alive, expressing itself.

Teilhard de Chardin, Nietzsche and individuation

On a friend’s recommendation I started reading Pierre Teilhard de Chardin’s The Phenomenon of Man. De Chardin was a Jesuit and a paleontologist who in this work attempted to reconcile his Christian beliefs with evolution and natural selection. The result is an intense work of great ambition, rich with vivid metaphors.

By chance I was leafing through Spinoza’s Ethics and Bergson’s Creative Evolution at the same time. These books seem to be dealing with similar themes but perhaps in very different ways and it seems a comparative reading might be fruitful. More on this later, I hope.

I’ve read about half of The Phenomenon of Man but wanted to write down some initial reactions. The reasoning, imagination and ambition here are truly great, and the book is a thoroughly positive endeavour. I see no traces of hostility or reactive thinking here. De Chardin wants to build something new, not attack the old.

What makes the work Christian (it appears at this point) is a fundamental belief in meaning and progress towards a goal point (“the Omega point”). From this notion of progress, every part of history and the universe is supposed to be imbued with meaning teleologically. De Chardin goes so far as to posit a force that pulls beings towards greater complexity, a fundamental force of progress/nature. I’m not sure I’m convinced about this part. A more Nietzschean thinker would perhaps say that the Omega point De Chardin sees is only one of many possible points that would appear randomly in succession, with no control or method. Given enough time one would perhaps observe additional such points. (Again, this is based on my having read only half the book and I haven’t read the main thrust of the Omega point argument yet.)

De Chardin views evolution, rightly it seems, as a succession of phases that repeat themselves: first a given life form saturates itself, then delineations and nervures appear and the life form splits up, fragments, species/individuals appear. Eventually species vanish, but one member of a family may survive to form the root of the next fragmentation phase. This is the pattern that happens on the level of species and also, it seems, on the levels of national, organizational and individual history, behaviour and thought. This is how antifragility manifests itself.

Against this backdrop Nietzsche seems concerned mainly with the individuation part. His great fear is that society or mankind will collapse into a formless mass where everybody “loves their neighbor’s warmth”. This is why he emphasises hardness and distance between individuals, rather than fluidity and the amorphous unity of a crowd. He returns to this theme in practically all of his books. A society where no individuals venture out onto chilly mountaintops to expand the occupied space would be past its peak and at the beginning of its decline. Nietzsche is in some sense a philosopher of expansion, as is De Chardin, but they approach the theme differently.

It remains for me to see precisely how De Chardin will engage with Nietzsche. Hopefully more on this later.

Equipmental visibility and barriers to understanding

The following is an excerpt from a text I am currently writing, which may or may not be published in this form. The text is concerned with the role of software in the scientific research process, with what happens when researchers must interact with software instead of hardware equipment, and finally with the constraints that this places on the software development process.

Technological development since the industrial revolution has made equipment more intricate. Where we originally had gears, levers and pistons, we progressed via tape, vacuum tubes and punch cards to solid-state memory, CPUs and wireless networks. The process of the elaboration of technology has also been the process of its hiding from public view. An increasing amount of complexity is packed into compact volumes and literally sealed into “black boxes”. This does not render the equipment inaccessible, but it does make it harder to understand and manipulate as soon as one wants to go outside the operating constraints that the designers foresaw. As we have already noted, this poses problems for the scientific method. Scientists are human, and they engage with their equipment through the use of their five senses. Let us suggest a simple rule of thumb: the more difficult equipment is to see, touch, hear and so on, the more difficult it becomes to understand it and modify its function. The evolution of technology has happened at the expense of its visibility.

The user-friendly interface that provides a simple means of interacting with a complex piece of machinery, while initially very valuable, can often become a local maximum that is difficult to escape when one wants to put the equipment to new and unforeseen uses. We may note two distinct kinds of user-friendly interfaces: interfaces where the simplified view closely approximates the genuine internals of the machinery, and interfaces where the simplified view uses concepts and metaphors that bear no similarity to those internals. The former kind of interface we will call an authentic simplification, the latter an inauthentic simplification.
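The distinction can be made concrete with a minimal sketch. All names here (RingBuffer, AuthenticView, InauthenticView) are invented for illustration; the point is only the contrast between a simplified view that keeps the machinery’s own concepts and one that substitutes a foreign metaphor.

```python
from collections import deque

class RingBuffer:
    """The 'machinery': a bounded buffer that silently discards its oldest entries."""
    def __init__(self, capacity):
        self._items = deque(maxlen=capacity)

    def push(self, x):
        self._items.append(x)

    def contents(self):
        return list(self._items)

class AuthenticView:
    """Authentic simplification: fewer operations, but the same concept
    (a bounded buffer) shows through, so the internals remain imaginable."""
    def __init__(self, capacity=3):
        self._buf = RingBuffer(capacity)

    def record(self, x):
        self._buf.push(x)          # still recognisably 'push into a bounded buffer'

    def recent(self):
        return self._buf.contents()

class InauthenticView:
    """Inauthentic simplification: a 'notebook' metaphor that hides the bound.
    Users are surprised when old 'pages' silently vanish, because the metaphor
    bears no similarity to the ring buffer underneath."""
    def __init__(self):
        self._buf = RingBuffer(3)  # the capacity is invisible in this view

    def write_page(self, text):
        self._buf.push(text)

    def read_notebook(self):
        return "\n".join(str(x) for x in self._buf.contents())
```

A user of AuthenticView who later needs to inspect or extend the machinery can descend to RingBuffer without relearning anything; a user of InauthenticView must first discard the notebook metaphor entirely.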

Of course, software represents a very late stage in the progression from simple and visible to complex and hidden machinery. Again we see how software can both accelerate and retard scientific studies. Software can perform complex information processing, but it is much harder to interrogate than physical equipment: its workings are hidden, unseen. The inner workings of software, which reside in source code, are notoriously hard to communicate. A programmer watching another programmer at work for hours may not be fully able to understand what kind of work is being done, even if both are highly skilled, unless a disciplined coding style and development methodology is being used. Software is by its very nature something hidden away from human eyes: from the very beginning it is written in artificial languages, which are then gradually compiled into even more artificial languages for the benefit of the processor that is to interpret them. Irreversible, one-way transformations are essential to the process of developing and executing software. This leads to what might be called a nonlinearity when software equipment is used as part of an experimental setup. Whereas visible, tangible equipment generally yields more information about itself when inspected, and whereas investigators generally have a clear idea of how hard it is to inspect or modify such equipment, software equipment often requires an unknown expenditure of effort to inspect or modify – unknown to all except those programmers who have experience with the relevant source code, and even they will sometimes have a limited ability to judge how hard a certain change would be (software projects often finish over time and over budget, but almost never under time or under budget). This becomes a severe handicap for investigators. A linear amount of time, effort and resources spent understanding or modifying ordinary equipment will generally have clear payoffs, but the inspection and modification of software equipment will remain a dark area that investigators, unless they are able to collaborate well with programmers, will instinctively avoid.

To some degree these problems are inescapable, but as a remedy we suggest the maximal use of authentic simplification in interfaces. In addition, it is desirable to have access to multiple levels of detail in the interface, so that each level is an authentic simplification of the level below. In such interface strata, the layers share the same structure and differ only in their level of detail. Investigators are thus given, as far as possible, a smooth progression from minimal understanding to full understanding of the software. The bottom-level interface should in its conceptual structure be very close to the source code itself.
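Such interface strata might be sketched as follows. This is a hypothetical toy (the names Counter, MidLevel and count_events are invented): three views of the same event-counting routine, where each layer merely narrows the layer below rather than introducing a new metaphor, so an investigator can descend from the one-call top level to the explicit loop at the bottom without changing concepts.

```python
class Counter:
    """Bottom stratum: conceptually close to the source code itself."""
    def __init__(self, threshold):
        self.threshold = threshold

    def count(self, samples):
        # Full detail: iterate and apply the threshold explicitly.
        hits = 0
        for s in samples:
            if s >= self.threshold:
                hits += 1
        return hits

class MidLevel:
    """Middle stratum: same structure, but a sensible default hides one knob."""
    def __init__(self, threshold=0.5):
        self._counter = Counter(threshold)   # delegates to the stratum below

    def count(self, samples):
        return self._counter.count(samples)

def count_events(samples):
    """Top stratum: a single call with all defaults chosen for the common case."""
    return MidLevel().count(samples)
```

The design choice is that every layer is expressible in terms of the one beneath it; nothing at the top would mislead someone who later reads the bottom.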