The Scout Mindset, by Galef

Sunday May 2, 2021

The metaphor is soldier vs. scout. The advice is broadly good. With no real discussion of epistemology, just an implicit (roughly positivist?) worldview, I felt interesting foundational discussion was elided. There was also little directly on “hard” (not yet known or unknowable) questions. My critique is principally that I wanted more.



Quotes and notes


"our judgment isn’t limited by knowledge nearly as much as it’s limited by attitude." (page x)


"In “Persuasion,” we saw that law students who are randomly assigned to one side of a moot court case become confident, after reading the case materials, that their side is morally and legally in the right. But that confidence doesn’t help them persuade the judge. On the contrary, law students who are more confident in the merits of their own side are significantly less likely to win the case—perhaps because they fail to consider and prepare for the rebuttals to their arguments." (page 27)

This cites Eigen and Listokin, and I don’t have access or inclination to read the paper just now, but it seems like there’s a claim here like “confidence causes bad performance” and I wonder whether possible confounds have been considered. To me, “lower quality lawyer causes both confidence and bad performance” seems plausible.
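
To make the worry concrete, here’s a toy simulation (made-up numbers, nothing to do with the actual Eigen and Listokin data) in which weaker students are both more confident and less likely to win, so confidence and winning correlate negatively even though confidence has no causal effect at all:

    # Toy confound simulation (made-up numbers, not from Eigen and Listokin):
    # a latent "quality" drives confidence down and winning up, so the observed
    # confidence/winning correlation is negative with no causal link between them.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    quality = rng.normal(size=n)                      # latent skill, unobserved
    confidence = -0.5 * quality + rng.normal(size=n)  # weaker students are more confident
    won = rng.random(n) < 1 / (1 + np.exp(-quality))  # better students win more often

    print(np.corrcoef(confidence, won)[0, 1])         # negative, despite no causal effect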


"Having an accurate map doesn’t help you very much when you’re allowed to travel only one path." (page 40)


Page 45 starts an exploration of Kahan’s famous paper on scientific polarization increasing with education. Perhaps in an effort to avoid alienating any readers, the tools of the scout are not applied to settle the question of whether global warming is real. The opportunity to engage one way or another with the idea of naive realism is not taken.

In Harford’s treatment of Kahan, I really appreciated the emphasis on curiosity being essential for “scout-like” thinking.


I don’t think I had seen the idea of Blind Data Analysis before; it’s mentioned on page 55, citing Nuzzo. Nice! Will keep this in mind.
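
As I understand the idea from the Nuzzo piece, you develop your whole analysis against deliberately scrambled (or noise-added) data, freeze the pipeline, and only then run it once on the real data. A rough sketch of what that could look like (the column names and shuffling scheme are my own illustration, not from the book):

    # Rough sketch of blind data analysis: build and debug the analysis against
    # data whose outcome column has been shuffled, then unblind exactly once.
    # (Column names and the blinding scheme here are illustrative.)
    import pandas as pd

    def blind(df, outcome_col, seed=0):
        """Return a copy with the outcome column randomly shuffled."""
        blinded = df.copy()
        blinded[outcome_col] = (
            blinded[outcome_col].sample(frac=1, random_state=seed).to_numpy()
        )
        return blinded

    def analysis(df):
        """Whatever analysis you settle on, written against blinded data."""
        return df.groupby("treatment")["outcome"].mean()

    # df = pd.read_csv("trial.csv")            # real data
    # print(analysis(blind(df, "outcome")))    # iterate freely on this
    # print(analysis(df))                      # run once, at the end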


"It [being critical of a study with undesirable results] prompted me to go back through the studies I had been planning to cite in my favor, and scrutinize their methodology for flaws, just as I had done with the pro-soldier mindset study. (Sadly, this ended up disqualifying most of them.)" (page 68)

I’m not sure to what extent this is a joke; I thought it was funny as I read it. But seriously, I wish people generally would say more about determinations such as this.


I might be a soldier on this, but I don’t love quantifying uncertainty in the manner of the calibration game introduced starting on page 75. I thought a little bit about why.

I’m not sure I have any really coherent argument here. I agree with the general idea of being aware of how sure you are. Somehow I don’t like the exercise of writing down numbers for it.
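
For what it’s worth, the exercise itself is easy to sketch: answer a pile of questions, attach a confidence to each, then check whether your “80%” answers were in fact right about 80% of the time. A toy version (not Galef’s actual quiz):

    # Toy version of the calibration exercise: bucket answers by stated
    # confidence and compare stated confidence to the actual hit rate.
    from collections import defaultdict

    # (confidence you stated, whether you turned out to be right) -- made-up entries
    answers = [(0.6, True), (0.6, False), (0.8, True), (0.8, True),
               (0.8, False), (0.9, True), (0.9, True), (0.9, True)]

    buckets = defaultdict(list)
    for confidence, correct in answers:
        buckets[confidence].append(correct)

    for confidence in sorted(buckets):
        results = buckets[confidence]
        print(f"said {confidence:.0%}, was right {sum(results) / len(results):.0%} "
              f"({len(results)} answers)")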

There is an interesting topic of decision-making in the face of low confidence. What do you do when you know you’re not sure? (Ramble, seems to be my answer.) Maybe out of scope for the book.


"The reality is that there’s no clear divide between the “decision-making” and “execution” stages of pursuing a goal. Over time, your situation will change, or you’ll learn new information, and you’ll need to revise your estimate of the odds." (page 110)

I really agree with this. Planning can be valuable, but following the plan to the letter is often not.


Galef discusses low (10%, 30%) early estimates of “success” from Musk and Bezos (starting page 111). Exploring why they would take such chances, she mentions both expected value (10% of huge is still big) and the idea that even “failure” would be fairly positive. I think expected value is almost always the wrong way to think about significant choices (especially one-shot choices with unclear odds), and I don’t really believe it’s how people tend to think (or should). I think the question of whether something is worth doing even if it fails is the right one. So I think the balance of emphasis is off here. Expected value is a simple tool, a hammer that people reach for too often, simplifying problems too far. I wouldn’t even mention it in this setting.
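
To spell out the contrast with some made-up numbers: the expected-value framing multiplies a small chance by a huge payoff and reports one number, while the question I find more useful is whether the likely failure branch is itself acceptable.

    # Made-up numbers contrasting the two framings discussed above.
    p_success = 0.10
    payoff_if_success = 1_000   # "huge", in arbitrary units
    payoff_if_failure = 5       # e.g., experience, reputation, salvageable assets

    # Expected-value framing: one number, mostly driven by a tail you will
    # probably never see.
    ev = p_success * payoff_if_success + (1 - p_success) * payoff_if_failure
    print(f"expected value: {ev}")   # 104.5

    # The framing I find more useful for one-shot choices: is the likely
    # (failure) branch itself worth living through?
    print(f"worth doing even if it fails? {payoff_if_failure > 0}")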


"You might think these principles sound obvious and that you know them already. But “knowing” a principle, in the sense that you read it and say, “Yes, I know that,” is different from having internalized it in a way that actually changes how you think." (page 144)


"In his book Sources of Power, decision researcher Gary Klein cites this [explaining away signs of a problem] as one of the top three causes of bad decisions. He calls it a “de minimus error,” an attempt to minimize the inconsistency between observations and theory. Each new piece of evidence that doesn’t fit a doctor’s medical diagnosis can be explained away or dismissed as a fluke, so the doctor never realizes her initial diagnosis was wrong." (page 165)


“Exposure to opposing views on social media can increase political polarization” is cited in chapter 12.


“Keep your identity small” is cited in chapter 14.


"They [a group of citizen scientists] also dove into the politics of government research, familiarizing themselves with how funding was structured and how the drug trials were conducted. The disorganization they discovered alarmed them. “It sort of felt like reaching the Wizard of Oz,” said one activist named Mark Harrington. “You’ve gotten to the center of the whole system and there’s just this schmuck behind a curtain.”" (page 212)

Cites How to Survive a Plague.


Slightly expanded table of contents