I had to explain Daniel Kahneman's System 1 & System 2 to my nephew to help him process some advice. Here’s the analogy I used.
We use our brain to navigate the world and get what we want. Our brain is great at solving problems, but it's slow and gets tired easily. We can't afford to dilly-dally when facing down a lion in the savanna. That's where System 1 comes in. System 1 is like a VR headset, say the Apple Vision Pro, which has an inner display and a set of low-res cameras that look out at the world.
Your headset's job is to stitch together what it sees and come up with a story that explains the world. System 2 is your brain, which uses this explanation to act in the world and get what it wants. (In reality, both System 1 & System 2 are different parts of the same brain. It's like you're wearing a headset that you can't take off.)
There's a lot of world out there, way too much for the headset's low-res cameras to take in. It needs to produce an explanation quickly so that the brain can make a decision and take action without delay. So the headset samples reality and fills in the gaps using shortcuts to generate a simple story that feels right and makes sense. It autocompletes the gaps with whatever is recent, available and familiar. If you've been paying attention to irrelevant or bad data, your headset is going to hallucinate, just like an LLM.
Your headset's display is sharp enough that your brain thinks it's just looking through a pair of clear glasses at the "real" world rather than a rendered image projected onto a screen. There's no reason for your brain to suspect that it's being fed a simple explanation about a world that isn't simple at all. You go about your day confident that what you see is all there is.
This might have been ok in ancestral environments. But a brain that doesn't realize it's getting fed false certainty from its hallucinating headset is going to get tripped up in a modern world that's increasingly complex and filled with hidden variables.
“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” — attributed to Mark Twain
In a complex world, there's a common shortcut that headsets take: just follow other headsets. There's an abundance of people sharing their explanations of the world with you. Very few of them, though, are aware of their own headset. When you see advice from someone who hasn't tested their stories against reality, you don't know if you're getting a useful explanation or just a hallucination. Content spreads fastest when it's tuned for headset stickiness. If your advisor isn't accounting for their headset, and yours, they might just be trying to convince you of a sticky hallucination they came across, rather than helping your brain get what it wants.
So when you come across a new explanation, test it against reality, not against how it makes you feel. If you must go about collecting explanations that never get tested, do it as someone who's looking to tell better stories. There is value in effective storytelling. A sticky story is the carrier that gets a useful explanation past the headset and deep into the brain. Good stories build coherence and consensus.
At the end of the day, you just need a handful of useful explanations to get what you want. I hope this story about your headset will be one of them.