Tagged with rationality


This made me laugh:

If your track coach tells you to run faster, and you answer with something about e=mc^2 and the light speed barrier, you're making a pretty strong claim about your current abilities.

It's from today's ACX piece which argues that we can and should think better, and that "everyone is biased" is a bad argument against trying.

This is close to the reason why I don't like relativism or postmodernism. Pointing out that someone or some argument is not perfectly right is very much not the same as saying it is as bad as anything else. There are gradations, and they matter.


Calibrating Beliefs

I mentioned Scout Mindset, the new book by Julia Galef, just the other day. Now I am two thirds through the audiobook, read by Galef herself, and can wholeheartedly recommend it once more. It is quite close to what I expected.

One chapter is about calibrating beliefs, that is, not only being aware of what you believe to be right, but also of how confident you are in it, by assigning a probability to each belief being correct. Then, in her own words:

Being perfectly calibrated would mean that your “50% sure” claims are in fact correct 50 percent of the time, your “60% sure” claims are correct 60 percent of the time, your “70% sure” claims are correct 70 percent of the time, and so on. Perfect calibration is an abstract ideal, not something that’s possible to achieve in reality. Still, it’s a useful benchmark against which to compare yourself.

She then provides an opportunity to practice by answering trivia questions. This was more fun than I expected, and you should click through right now and spend a few minutes on it!

I say this not only because it just so happened that I did well this time: my guesses about which I was 55%, 65%, 75%, 85% and 95% certain turned out to be 50%, 67%, 80%, 100% and 100% correct, respectively. Probably a fluke. Full score sheet.
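Checking your own calibration is simple arithmetic: group your guesses by stated confidence and compare each group's stated confidence with the fraction you actually got right. Here is a minimal sketch of that bookkeeping; the function name and the sample data are my own invention, not anything from the book or the quiz.

```python
from collections import defaultdict

def calibration(guesses):
    """guesses: list of (stated_confidence, was_correct) pairs.
    Returns a dict mapping each stated confidence level to the
    observed fraction of correct answers at that level."""
    totals = defaultdict(int)
    hits = defaultdict(int)
    for confidence, correct in guesses:
        totals[confidence] += 1
        hits[confidence] += correct  # True counts as 1, False as 0
    return {c: hits[c] / totals[c] for c in totals}

# Toy data: two guesses at "55% sure" (one right, one wrong)
# and one guess at "95% sure" (right).
print(calibration([(0.55, True), (0.55, False), (0.95, True)]))
# {0.55: 0.5, 0.95: 1.0}
```

Perfect calibration would mean every key equals its value; in the toy data above, the "55% sure" guesses came out 50% correct, which is close, while a single guess per bucket (as at 95%) tells you almost nothing.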


Scout Julia

I had seen her TED talk before and had noticed her name pop up in my Twitter timeline from time to time. But I admit I did not really have Julia Galef's work on my radar until recently, when I added her podcast to my rotation. So far it has been very good, and I intend to check out the large backlog eventually.

But first, I'll dive into the book that she just released: The Scout Mindset. See her Twitter for teasers, or listen to her being interviewed about it here.

I did not mention it in my Star Trek rewatch, but Julia is certainly right that Spock is not very logical, and quite annoying.

P.S.: See her explain Scout Mindset through activist examples in this video.


Sympathetic Reading

Whenever we encounter some statement or proposition, we cannot help but react to it. This reaction is often automatic, sometimes even subconscious. The kind of reaction we have depends to some extent on what is being said, certainly, but not only on that; our own mental state is often more crucial.

How well does the statement fit into our current world model? How unexpected is it? How flattering or insulting is it? What is the intention of the speaker?

There is considerable freedom in interpretation, but it is a kind of mental freedom that is easy to overlook. The difference between adopting a positive and a negative reading of someone's argument is huge. It is the difference between strawmanning and steelmanning, between getting offended and getting curious.

I find that, with a little practice, it becomes possible to take almost nothing personally and to notice my own reactions a bit more clearly, which lets me choose their direction, to some extent. The most sympathetic reading of what is being said is a good default to strive for, I believe. Not because I want to appear "nice", although that might be a welcome side effect, but because it actually takes less mental effort and frees me from ruminating about some possible negative subtext that my brain manages to notice, or invent.


Virtue Signalling

See what I just did there? I was telling you (and myself) what a good boy I am to not participate in consumerism! In other words, I was "signalling my virtue".

This is one of those concepts that, once you have learned about it, pops up everywhere. It is deeply intertwined with prestige and with how much we care about what others think of us. And with mating behaviour, for example when males (of any species, including ours) try to convince females that they would make a good mate, by whatever criterion is relevant in the situation.

This podcast with Geoffrey Miller, who literally wrote the book called Virtue Signalling, is quite good as an overview, if I remember correctly - it has been a while since I listened to it. One of the points he stresses is that virtue signalling is generally a good thing! It is a way to build trust and enable cooperation.

Recently, however, the term has mostly been used in a derogatory sense, as in accusing someone of "just virtue signalling" instead of being sincere in their proclamation. For example, a male calling himself a "feminist" can be suspected of ulterior motives, that is, of saying whatever "gets the girl"; he might even be a sneaky fucker.

Similarly, social justice activists, while certainly convinced of their own noble motives, can easily appear repulsive to outsiders when attitudes within the group get polarized by everyone trying to gain prestige through having the "purest" opinions on the subject matter at hand.

All this, just to say: I bought a new coffee machine after all, thereby negating the virtue I was signalling before.

It works well, makes good coffee, and I hope it lasts for as many years as the old one.


You Are Dreaming

While chopping wood earlier today, I listened to this FLI podcast with Joscha Bach in which he says:

We obviously live in a dream universe. And the dream is dreamed by a mind on a higher plane of existence and that is implemented in the skull of a primate. In the brain of some primate that is walking around in a physical universe. This is our best hypothesis that we have. And so we can explain all the magic that you’re experiencing by the fact that indeed we live in a dream generated in that skull.

And now the question is, how does consciousness come about? How is it possible that the physical system can be experiencing things? And the answer is no, it can’t. A physical system cannot experience anything. Experience is only possible in a dream. It’s a virtual property. Our existence as experiencing beings is entirely virtual, it’s not physical. Which means we are only experiencing things inside of the model. It’s part of the model that we experience something.

For the neurons, it doesn’t feel like anything to do this. For the brain, it doesn’t feel like anything. But it would be very useful for the brain [to know], what it would be like to be a person. So it generates a story about that person. About the feelings of that person, the relationship that this person has with the environment, and it acts on that model. And the model gets updated, as part of that behavior.

This is quite dense and almost incomprehensible without a lot of prior knowledge that is not yet obvious to "normies" like myself. I had to turn the listening speed down to 1x, which I rarely need to do, but I think I got the gist of it.

It reminds me of Anil Seth's TED talk, where he argues that conscious awareness is a hallucination constrained by reality, whereas a dream has no such constraints.

Mind-blowing stuff to think about, maybe literally so.


Change Your Mind

Is it possible to change your mind too much? I generally try to be actively aware of confirmation bias, that is, fitting every new bit of information into one's existing set of opinions. Plus, I like the feeling of novel thoughts and how they tickle the brain.

This sometimes manifests in an instinct to run in the opposite direction when everyone agrees on something. Everyone thinks climate change will be catastrophic soon? Sure, but what about solutions other than admonishing people to behave differently? And aren't some of the activists' claims exaggerated? (Apocalypse Never by Michael Shellenberger is a book that tries to distinguish the science from the hyperbole; I should finish it some day.)

That same gut reaction seems to be my default in many areas. If I wanted to flatter myself, I would call it a "scientific mindset", always questioning common wisdom, but I am not sure it really is that. It can also become socially "dangerous", in the sense that one can easily come across as obnoxious and unnecessarily contrarian. And in the wrong context it can send the wrong signals as to how one gets sorted into the bins labelled good guy or bad guy.

Thus I am genuinely unsure whether I tend to overcompensate when trying to correct for my confirmation bias. After all, there often are good reasons that a view is widely accepted, and quickly throwing out a strong prior is bad Bayesian thinking. In the extreme, it would make me gullible, accepting new arguments or framings without weighing them properly against what I thought before.
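To see why "quickly throwing out a strong prior" is bad Bayesian thinking, it helps to run the numbers. Here is a toy sketch of Bayes' rule in odds form; the function and the example numbers are mine, chosen only for illustration.

```python
def posterior(prior, likelihood_ratio):
    """Bayes' rule in odds form.
    prior: probability of the hypothesis before seeing evidence.
    likelihood_ratio: P(evidence | hypothesis) / P(evidence | not hypothesis).
    Returns the posterior probability."""
    odds = prior / (1 - prior) * likelihood_ratio  # posterior odds
    return odds / (1 + odds)                       # odds back to probability

# A 95% prior, hit with evidence four times more likely under the
# alternative (likelihood ratio 0.25), only drops to about 83%.
print(round(posterior(0.95, 0.25), 2))
# 0.83
```

A single piece of contrary evidence moves a strong prior a little, not all the way; abandoning the prior outright after one counterargument would be overcorrecting, which is exactly the failure mode I worry about above.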

Coincidentally, Scott Alexander wrote about the exact opposite today: trapped priors. Quite likely I have some of these too, and I see no immediate reason why one cannot have too-weak priors on one subject matter and too-strong ones on another.

Added 2021-03-12: Also closely related, I just heard Rob Wiblin say the following in his podcast, which gave me a chuckle:

I feel like I can notice a perverse aspect of this in myself when [...] you’ve kind of settled on what is kind of probably the true boring thing, the unexpected boring real conclusion just to some issue that has been controversial to you in the past. Then you’re like, “I’m bored of this. I have to find new takes, something new to say about this issue. [...] But what’s left?” What’s left is just bad takes, like dumb, unexpected contrarian takes.
