Why Attempts To Control Opinion Cause Opinion To Diverge

From Mars, John Carter, in a wonderful essay discussing the Regime’s attempts at manipulation, reports this:

People don’t like being gaslit. The psychological techniques used by salesmen, con men, and pickup artists are highly effective right up until the point at which the target becomes aware of the game. Awareness brings emotional blowback, making further manipulation effectively impossible because the target now regards every piece of information originating from the manipulator with hostile suspicion.

Now this is true, and most know how it applies in daily life. Interactions with salesmen, especially sellers of sketchy products, start with disbelief because of this. With politicians, too.

What’s interesting is that this phenomenon can be understood in terms of probability. So can the next related idea, which will turn out to be important in government control of “misinformation” and “disinformation”.

Take some topic that is not being touted by propagandists, the government, universities or businesses in any large way, but about which there is not total ignorance either. Say, Russia’s stance toward Ukraine before the start of its war. But pick in your mind any example you like.

On Day One of the war, and before the media wrapped its tentacles around the public mind, if we did a survey (of, for instance, people who read at least one book a year) about the proposition “Russia’s view towards Ukraine is Russian self-preservation”, we’d have a range of opinions. Some would say the proposition is highly likely to be true, and some would say highly unlikely. Most would probably hover somewhere in the middle.

Here’s where theory enters. It is thought that as information about a topic increases, and of course is understood, then wherever people’s opinions begin, as people learn more they must come to greater agreement. This is, for example, the theory behind everything from universal education to “raising awareness”, and its implementation leads to ideas like “nudging” and “disinformation” control.

But, as my minor well-poisoning hinted, that’s not always what happens. As the media got involved in the Russian war, for instance, opinions “polarized”, and we saw many Ukrainian flags in social media bios, but also new Russian ones. The great middle largely fled to the extremes. Opinions did not coalesce, but grew apart.

It turns out that this can be analyzed in a surprising way, one we all first learned about from E. T. Jaynes, in his “Queer Uses for Probability Theory”, Chapter 5 of Probability Theory: The Logic of Science.

The reason some think opinions should coalesce is theory. I need only one small equation to demonstrate it. It’s easy: stick with me.

We’ll call the proposition of interest Y, as in Y = “Russia’s view towards Ukraine is Russian self-preservation”. But it can be anything. Jaynes used the proposition that a certain lady had ESP powers, always a favorite subject. Now all probabilities must have conditions: evidence that is assumed, known, or accepted as true. Let that be called E. E can be a long string of propositions itself, and is, especially on complex subjects.

So for person number i we have:

     Pr(Y | E_i)

which reads “The probability Y is true assuming E is true, for individual number i”. Simple as promised, yes?

This need not be a number, and is one only if E_i contains assumptions that tell us how to quantify the probability. But, for a first cut, we can assume the answer is at least a rough number. It doesn’t really matter.

If we look at Pr(Y|E_i) for any number of people, i = 1, 2, 3, …, then we’ll find estimates all over the place, regardless of Y, unless the same E is accepted by all or most. Only then does Pr(Y|E_i) = Pr(Y|E_j) for any two individuals i and j.
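
To make the notation concrete, here is a minimal sketch in Python (my choice of tool, not anything in the argument itself), with every number invented for illustration: a pool of people, each with different background evidence E_i, collapsed for simplicity into a single prior probability for Y.

     # A pool of people, each with different background evidence E_i.
     # For illustration, each E_i is collapsed into one number: that
     # person's probability that Y is true. All values are invented.
     priors = {
         "E_1": 0.10,  # thinks Y highly unlikely
         "E_2": 0.45,  # hovers near the middle
         "E_3": 0.50,
         "E_4": 0.55,
         "E_5": 0.90,  # thinks Y highly likely
     }

     for evidence, p in priors.items():
         print(f"Pr(Y | {evidence}) = {p:.2f}")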

The theory is this.

New information arises that everybody in our pool of people sees. This, too, is in the form of propositions. We can call the first new piece, say, R. R is a bit of news, or an announcement, or a fact. Anything that all now see. This means everybody’s augmented probabilities now look like this:

     Pr(Y | R E_i)

We can get to this using Bayes’s formula, if it helps. But it’s not necessary. Bayes is only a useful tool, and nothing more. This is now the probability of Y given R and E_i.
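
If it helps to see the machinery, here is a one-step sketch in Python. The two likelihoods are invented numbers, assumptions for illustration only: they say how probable the report R is if Y is true versus if it is false.

     # One Bayes step: from Pr(Y | E) to Pr(Y | R E).
     def bayes_update(prior, p_r_given_y, p_r_given_not_y):
         """Pr(Y | R E) = Pr(R | Y E) Pr(Y | E) /
         [Pr(R | Y E) Pr(Y | E) + Pr(R | not-Y E) Pr(not-Y | E)]."""
         numerator = p_r_given_y * prior
         return numerator / (numerator + p_r_given_not_y * (1.0 - prior))

     # Invented numbers: R is twice as likely if Y is true as if it is false.
     print(bayes_update(prior=0.50, p_r_given_y=0.8, p_r_given_not_y=0.4))
     # -> 0.666...: the shared report nudges this person's probability up.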

Then a second piece of information arrives. To keep track, we’ll call the first piece R_1 and the second R_2. Then we have:

     Pr(Y | R_1 R_2 E_i)

which is now, as you can guess, the probability Y is true given R_1 and R_2 and E_i.

You have the idea (I hope). The more R’s we add, the more everybody’s information comes to resemble everybody else’s, so that even if we start at different places, because of those E_i, we end in more or less the same place. In notation:

     Pr(Y | R_1 R_2 … R_m E_i) ~ Pr(Y | R_1 R_2 … R_m E_j)

which reads the probability Y is true given all those Rs and E_i is about equal to the probability Y is true given all those Rs (there are m of them) and E_j, for some second person.
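
Here is a sketch of that convergence, under one loud assumption of mine: everybody reads every R the same way, that is, assigns the same likelihoods to each report. Numbers invented.

     # Convergence sketch: five different priors (five different E_i), but
     # everybody agrees on what each report R means. Assumed: each R is
     # twice as likely if Y is true as if it is false.
     def bayes_update(prior, p_r_given_y=0.8, p_r_given_not_y=0.4):
         numerator = p_r_given_y * prior
         return numerator / (numerator + p_r_given_not_y * (1.0 - prior))

     probs = [0.10, 0.45, 0.50, 0.55, 0.90]
     for m in range(1, 11):                  # reports R_1 through R_10
         probs = [bayes_update(p) for p in probs]
         print(f"after R_{m}: spread = {max(probs) - min(probs):.3f}")
     # The spread shrinks with every shared report: rough agreement.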

Jaynes works all this out using Bayes, to show the math is scrupulous, which it is. Theory thus seems to say that as new information arises everybody comes to at least rough agreement.

It’s a true theory, too. And it sometimes works. But only when Y is not “controversial” and the source of the Rs is trustworthy.

All we have to do to break the theory is add one more premise to the right-hand side. A premise many hold in certain situations. This one (or a variant): L = “The source of the R is full of it. They would lie about Y if it is to their benefit, either by omission or commission.”

Let’s use a homelier example. Let’s let Y = “Locking down will keep me from catching a communicable respiratory disease.”

Before the covid panic hit, if you asked folks their belief about Y, you’d have found a range of opinion, perhaps with many estimates giving Y low probability. After all, locking healthy people inside their homes had never been tried before, and it makes little sense given that sickness peaks every January (in the Northern Hemisphere), when everybody goes inside to spread their diseases.

Then the panic hit, and hit hard. And we all heard this: R = “Two weeks to stop the spread!”

Many believed the source, and their opinion about Y went way up. But not for everybody. A few held the additional premise, L = “This is all modeling bullshit, put out by a character who has been serially wrong, and who is obviously wrong again”. Those who held L, and there were not many, then had lower opinions about Y.

As the panic progressed, and it became clear Experts were (to quote a pop source) stuffed absolutely full of wild blueberry muffins, more people adopted L or one of its obvious variants. Then, even as Experts issued more and more Rs about how wonderful it was to have “non-essential” people off the street, those who held L had lower and lower opinions about Y.

You know where we are now. There are still many who hold a high value for Y, ever trusting in Experts as they do, but there are many more who do not. Opinions have diverged. Indeed, the more information issued by Experts, the greater the divergence.
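
Here is a sketch of that divergence, with the same machinery. The only change, and it is my assumed model, not anything demonstrated about covid: those holding L take each official R to be more probable if Y is false, since the source would issue it regardless, which flips the likelihood ratio. Numbers invented.

     # Divergence sketch: one stream of official reports R, two readings.
     # Trusters: each R is twice as likely if Y ("lockdowns work") is true.
     # L-holders ("the source is full of it"): each R is half as likely if
     # Y is true, because the source would say it anyway. Both assumed.
     def update(prior, lr):
         """One Bayes step via the ratio lr = Pr(R | Y E) / Pr(R | not-Y E)."""
         odds = lr * prior / (1.0 - prior)
         return odds / (1.0 + odds)

     truster, l_holder = 0.5, 0.5            # both start in the middle
     for m in range(1, 11):                  # reports R_1 through R_10
         truster = update(truster, lr=2.0)
         l_holder = update(l_holder, lr=0.5)
         print(f"after R_{m}: truster = {truster:.3f}, L-holder = {l_holder:.3f}")
     # The same reports drive the two apart: the more Rs, the wider the gap.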

What Jaynes showed was that the math for this phenomenon is also scrupulous, and explains what happens in these cases better than the naive theory does.

This has many important consequences, some of which you’ll be able to see if you’re familiar with what is called the “heuristics and biases” literature in economics (if you know it, think of the Linda-the-activist-bank-teller example). What some researchers hold to be errors in thinking turn out to be nothing more than people answering questions different from the ones the researchers posed.

All this will also turn out to be of great interest the more the Regime tries to push Official Truths and anathematizes Official Disinformation. Because many don’t trust the source, these efforts will backfire in a very predictable way, driving opinions further apart, not closer.

I have two small papers coming which extend these ideas in a minor way. I’ll put them up and explain them when they’re published (should be soon). But they are mathematical, and I didn’t want to dump them without first explaining the idea in (what I hope are) simpler terms.

The gist: the harder distrusted sources try to control “the narrative”, the more damage they do to the sources.

9 Comments

  1. Pk

    Take the example of two individuals, each with his own set of E and both with the same set of R. You say their Ps converge. Yet throw in an L, and they diverge. The L could already be in one of their E lists.

    I agree the controllers damage the sources, but not to the extent that it changes the majority of minds. The descent into the abyss continues.

  2. Yep – and let me add (you knew I’d mention Festinger again, right?) that the process accelerates when the people in each of the diverging groups start to select for information sources that support their views while eschewing neutral sources and viciously attacking sources of contrary information.

    And, in the context of answering the wrong question (Linda can be a bank teller and nuts, but it’s the nuttiness that dominates our view of her), note that lockdowns are the right thing to do if the disease is easily communicated, clearly recognizable, and takes effect very quickly. E.g., Ebola has a short incubation, is highly infectious, and is easily recognized. Lockdowns were wrong for covid because covid met none of those conditions.

  3. Hagfish Bagpipe

    Thanks for the link to Carter, had not read him, interesting writer. Watched the whole Mike Benz interview last week, confirms what we’ve witnessed, adds much fine detail. The Empire of Lies is built on sand. The more they try and prop it up the worse it sags.

  4. Cary D Cotterman

    Paul Murphy–it’s become harder and harder, maybe even impossible, to know what information sources are neutral, or if any neutral sources exist.

  5. Tars Tarkas

    Instead of the conjunction fallacy, Linda the Activist Teller should be called the Mr. Spock fallacy. I communicate in some circles where everyone believes people act in logical ways. Very few people act like Mr. Spock (I’ve never met one). Absolutely nobody, including statisticians, sees the problem with A and B being more likely than B, especially given the construction of the problem. Given what we were told about Linda, we chose B because that is what we see in the world. Very few people hold only one strongly progressive view. Almost nobody says, well, gee, it’s logical that A is more likely than A plus B…derp derp derp…

  6. Pk

    On your related post below today’s article, I read the 2010 Ayn Rand post. I wonder if your view on the conclusion has changed since then. I tend to trust that if I disagree with your conclusion, then I am missing something important. I cannot see how Rand was wrong. The authors of The Bell Curve used the very same argument and were pilloried for it.

  7. Normies are already back to sleep. It’s like the coof and vaxx and lockdowns were/are some kind of movie disconnected from normies’ present lives. Normies are incorrigible; they are the embodiment of the concept of the P-zombie. Beatings will continue until morale rises.

  8. Wm Arthurs

    St John Henry Newman wrote on a similar mechanism in his 1848 novel ‘Loss and Gain: The Story of a Convert’, ch. IX:

    ‘The reader has to be told that there was at that time a system of espionage prosecuted by various well-meaning men, who thought it would be doing the University a service to point out such of its junior members as were what is called “papistically inclined.” They did not perceive the danger such a course involved of disposing young men towards Catholicism, by attaching to them the bad report of it, and of forcing them farther by inflicting on them the inconsistencies of their position. Ideas which would have lain dormant or dwindled away in their minds were thus fixed, defined, located within them; and the fear of the world’s censure no longer served to deter, when it had been actually incurred.’
