Facts don't change minds, structure does
CS Peirce has a famous essay "The Fixation of Belief" where he describes various processes by which we form beliefs and what it takes to surprise/upset/unsettle them.
The essay: https://www.peirce.org/writings/p107.html
This blog post gestures at that idea while being an example of what Peirce calls the "a priori method". A certain framework is first settled upon for (largely) aesthetic reasons and then experience is analyzed in light of that framework. This yields comfortable conclusions (for those who buy the framework, anyhow).
For Peirce, all inquiry begins with surprise, sometimes because we've gone looking for it but usually not. About the a priori method, he says:
“[The a priori] method is far more intellectual and respectable from the point of view of reason than either of the others which we have noticed. But its failure has been the most manifest. It makes of inquiry something similar to the development of taste; but taste, unfortunately, is always more or less a matter of fashion, and accordingly metaphysicians have never come to any fixed agreement, but the pendulum has swung backward and forward between a more material and a more spiritual philosophy, from the earliest times to the latest. And so from this, which has been called the a priori method, we are driven, in Lord Bacon's phrase, to a true induction.”
Self-interest and identity are more important than what the article touches on. People form beliefs based on what serves their self-interest, with made-up lies as a front.
This is why you can accurately predict people's political beliefs by simply knowing their demographics. It is downstream of self-interest.
You can't change people's minds if doing so undermines their identity or their self-interest.
My understanding (which is definitely not exhaustive!) is that the case between Galileo and the church was way more nuanced than is popularly retold, and had nothing whatsoever to do with Biblical literalism like the passage in Joshua about making the sun stand still.
Paul Feyerabend has a book called Against Method in which he essentially argues that it was the Catholic Church who was following the classical "scientific method" of weighing evidence between theories, and Galileo's hypothesis was rationally judged to be inferior to the existing models. Very fun read.
Very engaging look at a very difficult topic to approach analytically.
I'm reminded of something I learned about the founder of Stormfront, the internet's first white supremacist forum. His child went on to attend college away from home, her first time away from her family, and over a period of roughly two years she attended dinners with a group of Jewish students who challenged her beliefs one at a time. Each time she accepted the evidence her friends presented about a particular belief, she would nonetheless integrate the new information into what remained of her racist worldview. This continued piece by piece until there was nothing left of the racist worldview at all.
It's both heartening and disheartening at the same time, because if this person can change her mind after almost two decades of constant indoctrination during her formative years, then surely anyone can change their mind. That's the heartening part: the disheartening part is, of course, that the effort it took is far from scalable at present and much more difficult to apply to someone who remains plugged into whatever information sources they are getting their current fix of nonsense from.
Good article, but I found it a little too abstract to derive much actionable information. IMO the most useful point is this quote:
> So when you encounter someone whose worldview seems impenetrable, remember: you’re not just arguing with a person, you’re engaging with a living, self-stabilizing information pattern—one that is enacted and protected by the very architecture of human cognition.
To the author: I love this idea, but your blog has two problems that made it less enjoyable for me to read. The first is the pull quotes; I find them confusing and unnecessary, especially when they repeat sentences from the preceding paragraph. The second is that I got stuck on the moving graphs while scrolling on my phone. I suggest making them smaller with a different background color, or simply making them static images.
Some of the core ideas here seem good, but the node/edge distinction feels too fuzzy. The node "Climate Change Threat" is a claim. Is the node "Efficiency" a claim? Can one challenge the existence of Efficiency? If one instead challenges the benefit of Efficiency, isn't that an edge attack?
I could give a bunch of other examples where the nodes in the article don't feel like apples-to-apples things. I feel less motivated to try to internalize the article due to this.
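To make the fuzziness concrete, here is one hypothetical encoding of the post's figures (the names and labels are my guesses, not the author's):

```python
# Hypothetical encoding of the post's node/edge examples; the names and
# labels are my guesses, not the author's.

# Claim-nodes are propositions, so a "node attack" is well-defined:
claim_nodes = {
    "Climate Change Threat": "the climate is changing dangerously",  # deniable
}

# "Efficiency" is a concept, not a proposition; there is nothing to deny.
# Its attackable content only appears once the *edge* is read as a claim:
value_nodes = {"Efficiency"}
edges = {
    ("Efficiency", "Capitalism Is Good"): "efficiency makes capitalism good",
}

# Challenging "the benefit of Efficiency" disputes the edge label above,
# which is an edge attack; so the two kinds of nodes aren't apples to apples.
```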
In 'Thus Spoke Zarathustra' the argument is made that the most important cultural changes happen outside the debate, where new structures of thought are built without being noticed, since without a competing thought structure we are unable to even perceive the new one. It is the dissonances and the debates that let us introspect on our own ideas. Without the dissonance we do not notice new ideas taking hold of us and changing us, and it is only unnoticed that truly radical changes can take place.
I'm wary of making an "arguments are soldiers" assumption where facts are mostly useful for making arguments, in an attempt to change people's minds.
We should be curious about what's going on in the world regardless of what ideologies we might find appealing. Knowing what's going on in the world is an end in itself. An article with some interesting evidence in it is useful even if you disagree with the main argument.
Facts may not change minds, but we should still support people who do the reporting that brings us the facts.
In practice I think people often don't see the full structure of their own belief graph. Parts of it are clear, but for 99% of important issues it's fuzzier than portrayed in the figures here. I still think this is an illuminating way of looking at it!
Another major factor is that while the graph may be fuzzy, the people we trust are clear. Only those people are allowed to "fill in" the missing pieces, and I think it takes a lot of work to do that, so it totally makes sense.
If the takeaway is "don't expect conflicting facts to convince your audience" I agree with that, but the reason is they don't trust you, not the conflicting graphs, and the trust is not really a consequence of the graph structure.
(Also, I was writing about similar stuff recently here: https://blog.griffens.net/blog/no-one-reads-page-28/)
Feelings aren't facts, but they are important for persuasion. The methods most able to create radical change are the gentlest.
https://en.wikipedia.org/wiki/Rogerian_argument
I disagree with Rapoport's taxonomy, not least because "Chinese brainwashing" in the Korean War was not Pavlovian and was rather closer to the T-group method developed in Bethel, ME.
If you found this interesting, I highly recommend reading "The Righteous Mind" by Jonathan Haidt. It's deeply impacted how I think of morality and politics from a societal and psychological point of view.
Some ideas in the book:
- Humans are tribal, validation-seeking animals. We make emotional snap judgments first and gather reasons to support those snap judgments second.
- The reason the political right is so cohesive (vs the left) is that they have a very consistent, shared understanding of what Haidt calls the 5 "moral taste receptors": care, fairness, loyalty, authority, sanctity. The left, meanwhile, trades off that cohesive understanding for diversity.
In the "Climate Change Threat" example, one vector of attack is when the policy changes do not lead to renewable energy adoption or to reduced emissions.
That justifies questioning whether climate change was really motivating the policy change or just being used as a pretext.
I have come to believe that there is no such thing as 'true rationality' in the universe. There are true events and true facts, but rationality is a shared framework for communication. Rationality exists between people.
People always have a framing story or perspective or viewpoint or system prompt for how they understand facts and events.
If you want to influence beliefs you have to understand the framing story that a person is using - even when that framing story is invalid or untrue.
Also, if you want to influence beliefs, you have to provide some emotional validation. You can't remove a load-bearing core belief from someone's story; you can only replace it.
---
Another partial explanation is trauma: you can think about 'conspiracy theories' in a number of ways, but these low-information, high-satisfaction theories often arise after traumatic experiences. You can't properly address the facts of a situation while a person is hurting.
We should expect to see more conspiracy theories after natural and unnatural disasters. Think wildfires caused by space lasers, floods caused by cloud seeding, storms caused by radar installations, melting of steel beams by various means. The people who believe these things are generally not having a good time in life.
---
BONUS Link: Tim Minchin - Confirmation Bias
I was on board until I realized there can be an infinite number of "nodes" (as defined in the post) between any one idea and another. The idea of destabilizing/destroying ideas works in the macro examples the post lays out, but may not be effective in practice, where the number of major nodes between one idea and the next is opaque.
Any mapping of the destruction of an idea node done in hindsight will suffer from survivorship bias, making the mapping seem sublimely simple. Hindsight is, as they say, always 20/20.
It’s worth asking ourselves “When was the last time I changed my mind?” It’s hard to really recall because the belief rewiring required seems to play havoc with our memory.
"Belief" doesn't actually mean "to believe", as in "I think A is true and B is false". "Belief" is the faith, trust, or alignment with in an idea planted in one's head. It has nothing to do with factual or true information. You can simultaneously know something is untrue, and have belief in it.
You can't change minds because "the mind" (in this context) is the personal identity and ego of an individual, of which their "tribe" is a huge part. Any information that conflicts with the narrative of their identity or tribe will be rejected, because it threatens their identity or tribe. To question one's identity causes a crisis which most people are not capable of dealing with. The more you attack those things, the stronger they will defend them.
The "culture war" is literally just that: one culture attacking another culture on its fundamental nature. This is like Christians vs Muslims. The only way to "win" that war is complete destruction. If you want the war to end without that, you're gonna have to stop fighting and come to some kind of truce.
Is a fair summary of this that, in a belief system, attacking any individual component can compromise the system itself? I would not find this surprising; it's actually rather intuitive. The insights I would find really interesting are the ones unexpected/unassessed on my end, e.g. how much harder it is to attack individual components by their attributes, or whether there is a type of component that is easier to compromise (e.g. edges vs nodes). Or how different systems compose over time (e.g. the Venn diagram between flat earthers and Christians has changed significantly since Galileo's time).
The Galileo example is messy. I don't think they cared deeply about the issue as implied here. There's obvious power in being the only ones allowed to say what God thinks about an issue. They wanted to maintain that monopoly.
In the seminaries of the world they don't teach how to make up clever stories to entrap people. For that you go to Marketing, PR, or Sales school. And of course the people who come out of these schools think they are very clever because they sold some widgets or politicians to the masses by some deadline.
But have you heard of a sales org or a marketing dept that has been running for a thousand years? They barely ever survive a few decades as a coherent unit, if ever.
For the curious: go check what the neighborhood seminary teaches.
The Church (and all other religious systems) haven't stood for thousands of years through the fall of empires and nations, civil wars, revolutions, plagues, famines, collapses of economic systems, internal schisms, the Enlightenment, progress in science/tech, etc. because of the stories they tell.
In fact the stories have been rewritten, branched, mutated, and merged with other stories thousands of times, to the point that we have thousands of different versions of them. There is no "narrative domination".
The Church has survived because, when people suffer through the fall of empires/nations/banks/economies, war, plagues, famine, disasters, etc., where else do they go?
Do they all head to the house of the local systems analyst/graph theorist?
This seems way too logical. Humans are not, for the most part, rational and logical creatures.
> Please don't use Hacker News for political or ideological battle. It tramples curiosity.
TFA is about the meta level of what persuasive arguments look like.
I see several examples in the comments here of people appearing to share their favourite object examples of how such and such nefarious force is causing people to believe bad things with propaganda — according to them and the sources they trust. If you do this, you are missing the point completely.
Instead, consider privately examining the opposed memeplex to understand why someone else might find it convincing — how their values might be understood, charitably. Re-evaluate how you know what you know; recognize the basis of your own position, and assess the soundness of that "structure" (as the author terms it). Recognize who you need to implicitly trust, and how much, in order to accept that reasoning. Consider why other people might not trust the same authorities you do. (Consider the possibility that other people might be able to trace direct harm done to themselves, to those authorities.) Recognize that reasoning from entirely absurd premises is still reasoning; consider that others do reason. This is why your own (sane, to you) premise does not resonate: it does not fit in that framework.
> So when you encounter someone whose worldview seems impenetrable, remember: you’re not just arguing with a person, you’re engaging with a living, self-stabilizing information pattern—one that is enacted and protected by the very architecture of human cognition.
> Truth matters—but it survives and spreads only when it is woven into a structure that people can inhabit.
Time spent on the Internet complaining about others' structures is not time spent weaving truth into them. On the contrary, should those others see you, you will only activate their defense mechanisms.
I hesitate to bring up politics here, but it's hard not to view the divide in the US in terms of this framework.
I'm very much a progressive. But it seems that focusing on lots of different (and divisive) issues affecting small parts of the population exposes a lot of weaknesses for the other side to exploit. On the other hand, liberal politicians have ignored basic things like jobs, wages, and housing (yes, the Uniparty is in the pockets of the oligarchs). Those are things which affect a lot of people on both sides and are worth much more effort. The results of the NYC mayoral primary show this clearly. Yet the Democrat leaders are all in hiding, and even worse, working to undermine the NYC primary. Together with the so-called liberal newspaper. Sad!
Lovely article, and he hints at something that's been on my mind lately, about how the internet enables collisions between groups that cannot (and often should not) mix. For example, the quiet, thoughtful academic giving insightful analysis of Plato's Symposium getting shouted down and called rude names. Or a rowdy bunch of young gamer kids being scolded by a priggish group of college kids for being politically incorrect. The loss of friction, the loss of gate-keeping, sounds good but feels really bad. It's like how we value biodiversity and so lament and control "invasive species" to keep these unique and interesting pockets of the biosphere alive. As a society, we benefit from having quieter, softer, kinder places where sensitive, smart people can do intense work, and yet we are ALSO served by the louder, harder, harsher places where the fighters go. But if we allow these two spaces to mix, the former is quickly eradicated, the latter loses not just its purpose, but eventually the former can no longer offer better fighting tools to the latter. Perhaps this effect has a name, or has been talked about by a more articulate author?
For an overview of the psychology of how people understand things (and don't!) I highly recommend this paper. It highlights a lot of ways our brains take shortcuts in terms of actually understanding things. And that facts play only one particular role amongst many other factors.
Keil, F. C. (2006). Explanation and understanding. Annu. Rev. Psychol., 57(1), 227-254. https://pmc.ncbi.nlm.nih.gov/articles/PMC3034737/pdf/nihms26...
Minds can’t be changed when a decision was based on emotion.
One day I told a friend how I could make no sense of my girlfriend’s behavior in some situation.
He said to me “you still think people make decisions on logic. Many people make their decisions on how they feel emotionally. Logic and facts have nothing to do with it.”
Suddenly a light switched on and I realized that I’m a typical computer person who thinks that everything is based on logic: that if you can just explain clearly enough, lay out the facts, then the other person will change their mind when they see them. It doesn’t work that way.
Computer people have real trouble getting their head around this concept.
Aside: for the Mermaid graphs, what library is being used, or how are they being rendered like this?
Structure for facts and information is just communication.
You can have the facts but not be persuasive due to poor communication skills.
For some reason the article seems to really like the — symbol, even going so far as to replace most of its commas with it.
The core of the problem lies not in facts failing to persuade, but in our obsession with trying to change minds.
We've developed systems to facilitate this. Parliamentary debate, for instance, was meant to force parties to justify their positions through public reasons, not private convictions. Religious institutions, too, have long shaped minds with varying degrees of success.
But attempts to reshape humanity, especially on a grand scale, have consistently produced devastating and unintended consequences.
We now live in an age where political expedience trumps truth; what matters is not whether something is right, but whether it plays well. The public is expected to absorb politicized half-truths while being shielded from the real issues, because complexity isn’t expedient. The current obsession with labeling ideas as “misinformation” or “disinformation” is a desperate, often incoherent attempt to control discourse, and it breeds more cynicism than clarity.
In the end, good ideas tend to survive, but not on any schedule we can manage. Trying to micromanage thought or the flow of information is not only futile, it’s unworthy of the very rationality we claim to protect.
Galileo and the church were both correct.
This accurately describes how my brain works. My thought process is like a bunch of graph nodes, and when new information doesn't "fit", it puts tension on the links, and I want to resolve that tension. I can... feel it happening inside my mind when I think, more or less? It's hard to describe.
Resolving that tension may occur in several ways, in order of increasing significance:
- Rejecting the new information
- Refining the graph (splitting a node representing a concept into multiple sub-nodes representing sub-concepts with their own relationships)
- Making local modifications to the graph
- Making sweeping architectural changes to the graph as a whole
The author seems to imply that cognitive biases are an inherent qualitative problem that is fundamentally forced to arise from this graph structure. I respectfully disagree. In my view, cognitive biases are a quantitative problem: incorrectly setting the threshold at which a large reorganization should occur. "Extraordinary claims require extraordinary proof" is qualitatively a sound epistemological principle -- but to correctly apply it, you must quantitatively set a reasonable threshold for "extraordinary."
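As a minimal sketch of what I mean, purely illustrative (every name and threshold below is hypothetical, not from the article): the four resolutions fall out of comparing tension against evidence strength and a reorganization threshold.

```python
# Purely illustrative sketch of the tension-resolution loop above.
# All names and thresholds are hypothetical; the point is that "bias"
# lives in the threshold calibration, not in the graph algorithm itself.

def tension(edges: dict[str, set[str]], contradicted: set[str]) -> int:
    """Count how many links are strained by information contradicting beliefs."""
    return sum(
        1
        for src, dsts in edges.items()
        if src in contradicted or contradicted & dsts
    )

def integrate(edges: dict[str, set[str]], contradicted: set[str],
              evidence_strength: int, reorg_threshold: int = 3) -> str:
    t = tension(edges, contradicted)
    if t == 0:
        return "accept as-is"             # fits the graph, no tension
    if evidence_strength < t:
        return "reject new information"   # cheaper than rewiring the graph
    if t < reorg_threshold:
        # includes refining, i.e. splitting a node into sub-nodes
        return "local modification"
    # "Extraordinary claims require extraordinary proof": sweeping changes
    # trigger only past the threshold. Set it too high and you get dogmatism;
    # too low, credulity. The principle is sound; the calibration is where
    # bias enters.
    return "sweeping architectural change"

# belief -> beliefs it supports
g = {"temps are rising": {"climate is changing"},
     "CO2 traps heat": {"climate is changing"}}
print(integrate(g, contradicted={"temps are rising"}, evidence_strength=2))
# -> local modification
```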
I feel like we need to get better at understanding the graph structure of people we disagree with. The best example I can think of is the abortion debate [1]: If you accept the premise "Life begins at conception" [2], the pro-life camp has an enormously strong case; the rest of the graph between that premise and "Abortion should be illegal" is very strong (it's mostly tremendously well-reinforced nodes in near-universal moral foundations, like "Do unto others" or "Murder should be illegal").
Arguments for the pro-choice position are frequently just bad when looked at from the graph point of view: they often don't directly confront the premise "Life begins at conception," nor do they attack the graph between the premise and the conclusion. [3]
[1] I'm personally in the pro-choice camp; I do not accept the premise that a human fetus has the same moral status as a fully grown human.
[2] "Life" here is not in the technical biological sense, but something more akin to "The ethical standing of human-equivalent sentience." (Bacteria and protozoa and so on are biologically alive, but nobody moralizes about killing them en masse by, e.g., cooking your food.)
[3] If you're curious about my own views on this specific subject, I've talked about them here before: https://news.ycombinator.com/item?id=36255493#36270990
I've had this mental model for a while now, but this post lays it out better than I could've. I think the most important part of the post is this part of the conclusion:
> For years, our main defense against misinformation and manipulation has been to double down on “truth”—to fact-check, debunk, and moderate. These efforts are important, but they rest on the assumption that truth is the main determinant of what people believe. The evidence, and the argument of this post, suggest otherwise: structure, coherence, and emotional resonance are far more important for the persistence and spread of beliefs.
I'm still friends with one or two people who are hat-wearing MAGA supporters. We stopped talking politics after 2018 or so, but between 2016 and 2018, and still occasionally since, I get a glimpse into their belief graph. Sometimes their facts are incorrect, but that's less common than them simply interpreting the same facts in a different light. Occasionally they'll have an interpretation of a fact pattern that I find more compelling than the interpretation I find in more liberal spaces. (The Democratic party is, after all, hardly free of hypocrisy.) These patterns are where the point of the blog post comes out most clearly: most people aren't motivated by facts and logic; they're motivated by a vast network of feelings and emotions where each point reinforces all the others, and an individual fact matters less for its truth than for its reinforcement of the overall belief graph.
The most interesting thing about the MAGA belief graph, though, is its overall structure and maintenance. There is approximately a third of the US that simply has an entirely different basis of belief about the world than the other two thirds. How is it maintained? How does normal everyday contact between the two groups not reconcile the foundations of the two belief systems? It's not a difference in facts, although that does come up occasionally. For example, the sudden change in the truth of the Epstein client list and the effort of the MAGA belief system maintainers (news orgs, influencers, etc.) to excise it from the belief graph have had some interesting effects.
But the interesting part is the methods used, the way the belief system reacts to influencers and others who shape it, and how particular facts and opinions are used to reinforce both new and existing parts of the belief graph. Looking at my MAGA acquaintances and seeing their belief system from the outside has made those methods and reactions more legible, and has let me notice some of the times those same methods and reactions pop up in other communities. For example, I dislike the focus on fact-checking, because too often the facts are the same on both sides and it's only a difference in interpretation. Then people who agree with the fact-checkers prove to themselves that the other side is unable to see truth, while people who disagree with the fact-checkers prove to themselves that the other side twists truth into lies. Yet people still push for fact-checking despite the fact that it only reinforces both sides' opinions of themselves, rather than having any chance of changing the mind of anyone on the other side.
Unfortunately I am lazy, or else I would've taken notes on examples of the methods and reactions used to reinforce a belief system, rather than just the vague, half-recollected memories that form my own belief graph. Regardless, I think it's important for people to look at their own belief system and, when presented with new facts or arguments, examine how those fit into it, and ask whether an argument is relying less on pure facts and more on emotional ties to the rest of the belief graph.
I'm struggling to understand this article. I think it's for a couple of reasons:
1. The capitalism graph seems OK but the climate change graph doesn't look right. I've never heard anyone argue that "resilient communities" automatically lead to "policy changes". What does that mean? If you have a resilient community already, why would you need to change anything? It seems to suggest that people with this belief system would end up in an infinite loop of wanting to change policies even when the original motivating problem is solved, which sounds like a very uncharitable view of climate activists.
2. After setting up this very abstract argument, the author ends by claiming, "The evidence, and the argument of this post, suggest [truth doesn't determine what people believe]: structure, coherence, and emotional resonance are far more important for the persistence and spread of beliefs". But he hasn't supplied any arguments. He outlined an abstract theoretical model, but it makes no testable predictions and he doesn't try to prove it's correct. Then he claims there are no real debates in the west about climate change, vaccines, or race, it's all driven by the evil Ruskies "creating social chaos". This claim isn't linked in any way to the first part with the graphs.
I've written about this belief twice in the past.
https://blog.plan99.net/fake-science-part-ii-bots-that-are-n...
https://blog.plan99.net/did-russian-bots-impact-brexit-ad66f...
It's all based on a bunch of academic papers that don't replicate and which use pseudo-scientific methodologies. They misuse ML in ways that generate noise, identify random people as "Russian bots", conclude that "Russian bots" support every possible opinion simultaneously and from there assume there must be some nefarious psychological strategy behind it. In reality they're just doing bad social science and casting the results through the prism of their ideological biases. It works because social science is full of people who are easily impressed by maths they don't understand, and who are surrounded by people with identical ideologies to themselves (often extreme ones). So there's nobody to give them a reality check. Eventually people who understand computer science come along and write a rebuttal, but academia is a closed system so they just ignore it and keep pumping journalists/politicians full of conspiracy theories and misinformation.
Given that, it's kind of ironic that the author is writing about the difficulty of changing people's minds with truth.
Yet another skimmable post from someone very aligned to one corporate/US-party-centric set of views who believes everyone else is "simply manipulated" and will therefore just try to "reach them" via more manipulation.
Or maybe you're just wrong in peddling government/corporate bullshit as is? Maybe others have lived through things and don't align themselves to corporate or political power A or B and therefore will seem contrarian against the drowning flow of "good propaganda"?
Many of us still remember how "believe in science" was used to shut people up about simple corruption and hygiene theater.
If you weren't on the right side of history (anti-authoritarian) then, your complaints ring hollow.
The whole "<X country> bad" is also such a tired narrative from the Anglo countries that can't stop invading countries for oil and supporting genocide every chance they get.
> Sun, stand thou still upon Gibeon; and thou, Moon, in the valley of Ajalon. And the sun stood still, and the moon stayed
Who am I to doubt the Church's interpretation, but this seems like it literally reads as the scripture saying that the sun stood still, i.e. that the sun normally moves around the Earth?
> If we want to counter manipulation and polarization, we need to focus on strengthening the structural integrity and resilience of our own belief systems. This means fostering internal coherence, building bridges between different templates, and cultivating narratives that are not just factually accurate, but also emotionally compelling and structurally robust.
I fear we are already too late on much of this, because there also exist communities that maintain structural integrity precisely by resisting bridge-building between different cultural templates. That is, you must refuse bridge-building in order to maintain your own community, as with wizard, blackpill, or MGTOW communities actively discouraging bonding with women as people equal to men.
This is a good blog post. Two thoughts about it:
- Contradictory facts often shouldn't change beliefs, because it is extremely rare for a single fact in isolation to undermine a belief. If you believe in climate change and encounter a situation where a group of scientists were proven to have falsified data in a paper on climate change, that really isn't enough information to change your belief in climate change, because the evidence for climate change is much larger than any single paper. It's only after reviewing a lot of facts on both sides of an issue that you can really know enough to change your belief about something. (A toy sketch of this point follows the list.)
- The facts we're exposed to today are often extremely unrepresentative of the larger body of relevant facts. Say what you want about the previous era of corporate-controlled news media; at least the journalists of that era tried to present the relevant facts to the viewer. The facts you are exposed to today are usually selected by an algorithm optimizing for engagement, and the people creating the content ("facts") that you see are usually extremely motivated/biased participants. There is zero effort by the algorithms or the content creators to present a reasonably representative set of facts on both sides of an issue.
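As a toy illustration of the first point above (my numbers, purely made up): in log-odds terms, retracting one paper's worth of support barely moves a belief backed by many independent studies.

```python
# Toy Bayesian sketch: one retracted paper vs. many independent studies.
# All numbers are made up for illustration.

import math

def posterior_probability(prior_log_odds: float,
                          log_likelihood_ratios: list[float]) -> float:
    """Combine independent evidence in log-odds space, return a probability."""
    log_odds = prior_log_odds + sum(log_likelihood_ratios)
    return 1 / (1 + math.exp(-log_odds))

# Fifty independent studies, each mildly favoring the hypothesis (LR ~ 2)...
evidence = [math.log(2)] * 50
# ...then one fraudulent paper is retracted: subtract its contribution.
evidence.append(-math.log(2))

p = posterior_probability(0.0, evidence)  # neutral prior
print(f"belief after retraction: {p:.6f}")  # still ~1.0
```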