Sunday, July 8, 2012

How we ought to think

'One of the great pleasures of the philosopher's life,' wrote Jim Hankinson in The Bluffer's Guide to Philosophy, 'is being able to tell everyone (and not just children and dogs) what they ought to do. This is Ethics.'

On this reckoning, logic should afford even greater pleasure to its practitioners than ethics does insofar as it purports - at least on some accounts - to tell everyone how they ought to think. For example, consider this (from a textbook for undergraduates): 'Logic is sometimes said to be the science of reasoning, but that assertion is somewhat misleading. Logic is not the empirical investigation of people's reasoning processes or the products of such processes. If it can be called a science at all, it is a normative science - it tells us what we ought to do, not what we do do.'

Or, as Gottlob Frege put it: 'the laws of logic are ... the most general laws, which prescribe universally the way in which one ought to think if one is to think at all.' (The Basic Laws of Arithmetic)

Frege, in fact, was something of a proto-fascist, and the above statement could be interpreted as having an authoritarian, even totalitarian, tenor. It could also be interpreted simply as an honest statement of the constraints of thought, reflecting Frege's noble goal of defining the bedrock of human reasoning.

It's no surprise that most attempts to articulate logic's normative role run into trouble. For what authority can the logician appeal to?

Formal logical systems are often seen as part of an attempt to systematize thinking, to improve (as it were) on ordinary thinking and the ordinary language on which it depends. And it is certainly true that ordinary language often deceives us and obscures the underlying logic (or structure) of an argument. Translating an argument into a formal language can reduce ambiguity, but those who have sought through the study of formal logical systems to illuminate the laws of thought or their foundations have been disappointed. Doubts surround not only the putative authority of a logical system but the very meaning of its symbols.
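To illustrate the point about ambiguity with a stock example (mine, not drawn from any of the authors discussed): the English sentence 'Everyone loves someone' admits two readings which a formal notation keeps apart. Writing L(x, y) for 'x loves y':

\forall x \, \exists y \, L(x, y) \quad \text{(for each person there is someone or other whom they love)}

\exists y \, \forall x \, L(x, y) \quad \text{(there is some one person whom everyone loves)}

The formalism forces a choice between the two; it does not, of course, tell us which reading a speaker intended.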

Technically, the meaning of what Rudolf Carnap called the fundamental mathematico-logical symbols (now usually called logical constants) derives from the explicit rules we lay down for their use, but in fact the question of their meaning remains obscure. One thing is clear: the whole exercise is paradoxically dependent on a prior understanding of the basic logical operations. Ordinary language use is also predicated on such an understanding: anyone lacking it would not be able to use language in anything like a normal way.
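By way of illustration (a textbook-style example, not Carnap's own formulation): the 'explicit rule' for the conjunction sign \land can be laid down as a truth table:

\begin{array}{cc|c}
P & Q & P \land Q \\
\hline
T & T & T \\
T & F & F \\
F & T & F \\
F & F & F
\end{array}

But to read the table as intended one must already grasp that each row represents one case and that the rule covers all and only these cases - which is just the sort of prior understanding of conjunction the rule was supposed to supply.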

The work of Frege and his successors led, of course, to the development of digital computers in the mid-twentieth century, and in this sense it was spectacularly fruitful and successful. But it has not really led to a new understanding of human reasoning or established clear guidelines - as Frege hoped - for how we ought to think.

In fact, the attempt to create formal systems which can do what natural language can do has led to a renewed appreciation of the complexity, power, elegance and logical depth of the latter. Wittgenstein was right to warn against thinking of our everyday language as only approximating to something better, to some ideal language or calculus.

We need formal systems for dealing with mathematics and science and technology, but, as far as the fundamentals of logic are concerned, it's all there - implicitly at least - in the language of a five-year-old child.

7 comments:

  1. I do agree that "the attempt to create formal systems which can do what natural language can do has led to a renewed appreciation of the complexity, power, elegance and logical depth of the latter".

    As I see it, there are three parties here: Frege (pure formal logic), Wittgenstein (ordinary language and leaving the world as it is), and Dewey (critical thinking). Dewey wrote a book called "How We Think", which put CT into the frame. Mill also preached a CT message.

    I suspect that CT needs to be taught. It doesn't come naturally, even if kids have the basic concepts built in. This is the theme of Kahneman's recent work, judging from reviews I've seen.

    The Bluffer's Guide -- there's a book I would like to have written.

    Replies
    1. Wittgenstein is certainly at one extreme, and then you have various kinds and degrees of linguistic revisionism. But someone committed to formal logic need not have a negative or 'reformist' attitude to ordinary language.

      My focus here was on the fundamental logical operations (like conjunction, disjunction, etc.) and what justifies them and what they mean. Critical thinking is about complex, real-world reasoning, and so takes us beyond the fundamental building-blocks.

      Take the 'conjunction fallacy' (Daniel Kahneman's 'Linda problem'). It seems to me that the problem arises because an 'and' in an English sentence is not a logical '&'. There's a lot going on in ordinary language which formal logic blocks out. (I am thinking of what is studied by linguists under the name pragmatics and Grice's notion of 'implicature'.) The formal point about the Linda problem is spelled out below.

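      To spell out the formal point (a standard rendering of the Linda problem, not a quotation from Kahneman): for any statements A and B the probability calculus requires

      \Pr(A \land B) \le \Pr(A)

      so 'Linda is a bank teller and is active in the feminist movement' cannot be more probable than 'Linda is a bank teller'. Most respondents nonetheless rank the conjunction as more probable - which suggests they are not hearing the English 'and' as a bare logical '&' in the first place.
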
  2. I agree. Simple words, even logical operators, do often have multiple meanings. And there's no easy way around this.

    I'm curious about conditionals. Every five-year-old can follow modus ponens, and we all use it automatically. But it's very hard to teach modus tollens persuasively. Why so? I imagine we also use MT almost automatically. (The two schemas are set out below.)

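    For reference, the two schemas in standard notation (nothing here is specific to Kahneman or to any of the authors discussed):

    \frac{P \to Q \qquad P}{Q} \;(\text{modus ponens}) \qquad\qquad \frac{P \to Q \qquad \neg Q}{\neg P} \;(\text{modus tollens})
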
    Replies
    1. It's as though these patterns of thought are hard-wired into the brain and work well enough on actual, concrete problems, but when the problem is made abstract, intuitions fail.

  3. Implicature (!!!) I love that (!!!)
    Ignites my fascinature (!!!)

  4. Some interesting stuff here:

    https://sites.google.com/site/hugomercier/theargumentativetheoryofreasoning

    Replies
    1. I'd certainly go along with a lot of what Mercier says here, though I think he might be over-generalizing. And the way he presents that example of the big roach-shaped chocolate versus the small heart-shaped one doesn't make any sense to me. Why wouldn't you prefer the large, expensive roach-shaped one??
