On Knowing

I’m no longer a huge fan of making a distinction between thinking and feeling. That’s a change for me. Not very long ago I considered rational thought (as I think we’re all socialised to) some higher, purer mode of thinking, and emotional thought a kind of untrustworthy, reactionary force.

Emotions, in other words, were something that got in the way of thought.

But that’s clearly not the case. It’s certainly not how our brains work. If it were the case, that brand of clear-headed, rational thought would tend more toward some kind of truth. At least in theory.

Of course, it doesn’t. Unless you’re very, very invested in ruminative metacognition (that is to say, unless you spend a lot of time thinking about how you think), you’re probably finding and making arguments that reinforce your preexisting commitments.

To put it another way, your rational thought is a collection of stories you tell to confirm the beliefs you already feel good about.

This is why, to give just one example, it’s so hard to argue committed anti-vaxxers or flat-earthers out of their (obviously, to you) insane positions. They seek arguments as much as you do. They seek arguments that confirm their preconceptions, as much as you do. Their arguments feel true to them, just like your arguments feel true to you.

That’s not to say there isn’t a sort of formal logic that tends toward truth. There is. It’s just that almost nobody uses it. It’s too much work. And I don’t mean that in the traditional Puritan mode of “too much work”; you’re not lazy because you don’t spend all day engaging in formally proving all your positions. You can’t. Your brain just doesn’t work that way. It works on shortcuts and heuristics, because the amount of data it receives is massive, but its processing power, though immense, is not unlimited.

And that’s because your brain exists to help you survive. If you wish to survive a tiger springing out of the bush (to trot out a particularly threadbare example), you survive by reacting, not by formally proving that there is, indeed, a tiger. The heuristic is to assume there is a tiger and act accordingly.

The stakes are not always so high in our day-to-day, but we just tend to operate that way regardless.

Not to mention that we are creatures of memory. Our perceptions, our conclusions, all the thoughts we care to hang on to, are recorded and saved for later. But not perfectly. We remember memories of memories. Memories can be twisted, manufactured, corrupted, and completely forgotten in that process.

Take something that you know. Something simple, something foundational, something we all learn very early in school:

1 + 1 = 2

Is this true? Of course. But here’s the thing: How do you know? Have you ever proved it? Do you have access to that proof right now?

Of course not. This is an axiomatic mathematical expression. You can prove it, reasonably well, with some sticks or something (or, if you’re brave, from formal logic alone, though it will apparently take about 100 pages of proof to do so).
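(As an aside: the formal route is far less daunting with modern tools than it was in the days of hundred-page proofs. In Lean 4, a proof assistant, the claim follows by pure computation; the snippet below is a sketch, assuming Lean’s standard definition of the natural numbers.)

```lean
-- Lean builds the naturals from the Peano axioms: zero and a
-- successor function. "2" is defined as the successor of 1, and
-- addition unfolds by computation, so the whole proof is just
-- "both sides reduce to the same thing" (reflexivity, `rfl`).
example : 1 + 1 = 2 := rfl
```

The interesting part is that none of this changes the essay’s point: even if you ran this proof once, what you’d carry around afterwards is a memory of having been convinced.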

But you don’t do this. Unless you’re teaching it to someone who doesn’t know it, you’ll probably never do it.

And yet you are supremely confident that this claim is true. You realise, on some level, that if it isn’t true, your entire mental construction of the universe needs to be redone. You’ve lived your life thus far assuming that it’s true, and things have turned out fine so far.

When you access a truth claim, even a simple, axiomatic truth claim, you don’t have access to the truth of that claim in the moment, or even access to whether or not that truth claim is warranted. Instead, you’re accessing a memory of your own confidence about that truth claim. And what is confidence, in the end, but an emotion? You don’t know that 1 + 1 = 2. You feel that 1 + 1 = 2.

In this case, you’re correct. (Breathe a sigh of relief.) 1 plus 1 does indeed equal 2. But think about all the times you’ve had that same reaction of confidence in your opinion on something, or your confidence that something happened the way you remember it happening, or your confidence that that bit of knowledge you squirrelled away 20 years ago is still valid…

Confidence can absolutely be unwarranted. Memories are fragile. Things that seem axiomatic can be socialised conventions.

And knowing all of this doesn’t help you at all. You can do some metacognitive work to inspect and attempt to correct your confidence, but that’s the sort of ex post facto exercise that doesn’t really help you in the moment. It might help in some other moment. It’s also not guaranteed to make your life any easier; believing things that are capital-t True isn’t some secret shortcut to a fulfilling existence. Some True things are incredibly bothersome, even agonising.

Anyway. You should interrogate your rationality the same way you interrogate your emotions. You don’t get to throw one away and keep the other. Thinking is thinking.
