Richard Feynman used the term “cargo cult science” to describe something that has the superficial trappings of science but operates closer to the level of storytelling. “How the Leopard Got its Spots” is not science, but the explanation might satisfy Deepak Chopra.

This is from Feynman’s 1974 commencement speech at Caltech:

During the Middle Ages there were all kinds of crazy ideas, such as that a piece of rhinoceros horn would increase potency. Then a method was discovered for separating the ideas—which was to try one to see if it worked, and if it didn’t work, to eliminate it. This method became organized, of course, into science. And it developed very well, so that we are now in the scientific age. It is such a scientific age, in fact, that we have difficulty in understanding how witch doctors could ever have existed, when nothing that they proposed ever really worked—or very little of it did.

But even today I meet lots of people who sooner or later get me into a conversation about UFO’s, or astrology, or some form of mysticism, expanded consciousness, new types of awareness, ESP, and so forth. And I’ve concluded that it’s not a scientific world.

I’m not so dismissive of indigenous cultural practices that I would write off the efforts of healers and shamans out of hand. I do, though, acknowledge the differences between those ancestral approaches and the process of scientific inquiry. Feynman only mentions those practices as an aside. He’s much more concerned with the bunk science and bunk scientists he saw around him. They were the leaders of the scientific cargo cults from which he hoped to save those graduating students.

Cargo cult refers to the practices of some Melanesian islanders in the South Pacific. During World War II, the United States established a number of strategic bases on small inhabited islands throughout the South Pacific. Those soldiers brought manufactured goods, foods, and practices that were entirely foreign to the indigenous people living on those islands. When the war ended, the soldiers left and abandoned the bases. In their absence, the islanders built effigies of the weapons and equipment that they’d seen the soldiers use. In some cases, they built mock jeeps and planes, and cut new runways out of the jungle.

Thought leaders and public intellectuals

There isn’t a word that I can think of that would help separate the types of intellectuals that are just looking to capitalize on their audience from those that are much more sincerely trying to solve problems. Of course, all of them exist on a spectrum, with some of them on the more honest end and most of the rest at the worst extreme.

The mechanisms and motives of thought leaders are different from those of the islanders, but the results are much the same. Thought leaders use the right words and cite studies to support their conclusions, but something is missing. We have few public intellectuals remaining. They’re still around, but mostly they’re kept permanently buried in the dark corners of Twitter. They’ve largely been displaced by thought leaders who prioritize celebrity status and growing their online following over rigorously pursuing ideas and rethinking old conclusions. The same pattern is most visible with some politicians, who seem to believe deeply in whatever their electoral base currently values. Choosing what to think about, and how to think about it, so that the product will fit an audience invariably leads to shallower work.

For a thought leader, branding doesn’t need to be intentional, only embraced. A tenet of good branding is to focus on easy-to-consume messaging. Easy-to-consume messaging often lacks meaning. To be easily consumable, it needs to flatten out details and ignore complexity. Consider the “Learn to code” initiative. It’s a great example of an exceedingly arrogant and simplistic solution to the complicated problems faced by those who experience homelessness.

Thought gurus can vary wildly in type and quality

There are a few general types of thought leaders.

  1. Sincere Promoter or Enthusiast
    This is the only genuine type of thought leader. Examples of this rare breed are Zeynep Tufekci, Jonathan Haidt, and Laura Seay. If you’ve never heard of them, it’s probably because they spend more time developing expertise than building an audience. Of course, there are many others but that’s the thing—they’re not easy to find.
  2. Clout chasers
    For this type, thought leadership status becomes an ouroboros with no higher goal than maintaining relevance. Examples are Fareed Zakaria, with his plagiarism scandal, and the bizarre transformation of Glenn Greenwald.
  3. Professional contrarian
    This thought leader imagines themselves an iconoclast, always in opposition to the prevailing views of a particular group. Examples are Eric Weinstein and his brother Bret Weinstein. Bret is one half of a contrarian power couple along with his wife Heather Heying.
  4. Corporate Shill
    The main goal is to give intellectual cover to ultra-wealthy benefactors by justifying a corporatist worldview. These people are on someone’s payroll, even if indirectly.
    Exemplified by the one and only Fred Singer.
  5. Con Artist or Grifter
    This type always has an angle and there’s always something to sell. Hold onto your credit cards. Both the evangelical preacher Jim Bakker and Alex Jones are great examples of this type.

Any of these types of thought leaders, under the right conditions, and with enough incentive, can morph into any of the others. That’s the hidden danger. You can start following what might be a very sincere thinker who’s earnestly wrestling with some problems. As time goes on though, he or she can become the next Weinstein (either of them). I have experience with many of these types. I’ve fallen into and climbed out from more information sinkholes and dodged more would-be intellectual cult leaders than I can easily remember. I have followed the work of self-proclaimed marketing gurus, tech thought leaders, plugged-in politicos, and founder-philosophers. Not only that, I’ve also been caught up in some of their spells — right up until the moment when I’m not.

  1. Attraction
  2. Realization
  3. Disinterest
  4. (Sometimes) Revulsion

There is a pattern of attraction, realization, disinterest, and sometimes even revulsion that plays out in much the same way every time. It typically begins with encountering a few tweets or hearing a chance interview on someone else’s podcast. Joe Rogan is a huge gateway to some of these intellectual posers. That version of me would think, “Now, that was interesting. I wonder what else they have to say? Oh! They have their own podcast!” Follow and subscribe. Still full of innocence, I would be on my way, sometimes without a compass, into this unknown fanboy territory. Compelled and maybe even slightly enchanted, I would search out more interviews, read more tweets, and do my part to consume their content as fast as they could create it.

That first blush of attraction is often so subtle that it passes unnoticed. Without realizing where this path could lead, I’m in. For the next days or weeks or months, I consume all of their content. So long as I don’t encounter anything completely insane such as deep state conspiracies or underground child sex slave rings, then this honeymoon might last for quite a long time.

The pattern of falling into a personality cult is very much like being groomed by a con man. Each one is trying to sell you something. Not every personality cult begins intentionally, though. A personality cult doesn’t necessarily require that its leader consciously run it as a cult. A leader can stumble into that role simply through self-promotion and prolific content creation. Next comes the audience, and after that, they think about how to monetize. Finding out what the audience wants, and more importantly, what they will pay for, can become the priority. These once-sincere personalities are the most difficult to avoid.

Edward H. Smith, a famous confidence man in the early 20th century, describes a “fine con” in terms of a stage drama. In his words, every con “has its introduction, development, climax, dénouement, and close, just like any good play.” His comparison doesn’t end there. He insists that other theatrical elements, such as background and setting, are often meticulously planned in both good theater and a good confidence game.

Stages of a Con

For comparison to the pattern of falling under the influence of an intellectual cargo cult, here are Smith’s colorfully described stages of a con:

  1. Foundation work
    Preparations are made in advance of the game, including the hiring of any assistants required and studying the background knowledge needed for the role.
  2. Approach
    The victim is approached or contacted.
  3. Build-up
    The victim is given an opportunity to profit from participating in a scheme. The victim’s greed is encouraged, such that their rational judgment of the situation might be impaired.
  4. Pay-off or convincer
    The victim receives a small payout as a demonstration of the scheme’s purported effectiveness. This may be a real amount of money or faked in some way. In a gambling con, the victim is allowed to win several small bets. In a stock market con, the victim is given fake dividends.
  5. The “hurrah”
    A sudden manufactured crisis or change of events forces the victim to act or make a decision immediately. This is the point at which the con succeeds or fails. With a financial scam, the con artist may tell the victim that the “window of opportunity” to make a large investment in the scheme is about to suddenly close forever.
  6. The in-and-in
    A conspirator (in on the con, but posing as an interested bystander) puts an amount of money into the same scheme as the victim to add an appearance of legitimacy. This can reassure the victim and give the con man greater control once the deal has been completed.

From Confessions of a Confidence Man by Edward H. Smith

Using Smith’s terms, the foundation for the personality cult leader is the image they’ve cultivated. Their image can be built from tweets, books, podcasts, or any other content they’ve created. All of these elements are more convincing if they were once the products of honest efforts.

For this type of operation to work, they can’t approach you actively. Personality cult leaders need you to come to them. They may run ads to bring in new people, but mostly, they allow you to find them on your own. After a while, the gravity of their existing following begins to almost work on autopilot by capturing unaware passers-by.

The build-up is the promise they make to deliver you from the dinginess surrounding you into a place more rarefied. This is where the con of the personality cult leader ends — with a promise. It doesn’t stop with only one promise, though. This first promise is just a taste, but it has to be a good one, and it has to mostly come true. Otherwise, the spell would fizzle before it could burrow its way in.

Why do we fall for these information hucksters? The answer is so obvious, well-known, and generally unsexy that we all miss it. Every one of them is selling the same thing. Answers.

We have questions about everything. We’re confused about everything. All of them, down to the last one, are peddling the same thing. They’re all pushing answers to every one of your nagging concerns. Not sure how to understand why people stormed the capitol? Here are a few tweets that explain everything. Wonder how you can lose the 15 pounds you put on during the pandemic? Here’s a YouTube channel that can help.

Feynman concluded his Caltech speech with a story about a psychologist he knew.

He had a long corridor with doors all along one side where the rats came in, and doors along the other side where the food was. He wanted to see if he could train the rats to go in at the third door down from wherever he started them off. No. The rats went immediately to the door where the food had been the time before.

The question was, how did the rats know, because the corridor was so beautifully built and so uniform, that this was the same door as before? Obviously there was something about the door that was different from the other doors. So he painted the doors very carefully, arranging the textures on the faces of the doors exactly the same. Still the rats could tell. Then he thought maybe the rats were smelling the food, so he used chemicals to change the smell after each run. Still the rats could tell. Then he realized the rats might be able to tell by seeing the lights and the arrangement in the laboratory like any commonsense person. So he covered the corridor, and still the rats could tell.

He finally found that they could tell by the way the floor sounded when they ran over it. And he could only fix that by putting his corridor in sand. So he covered one after another of all possible clues and finally was able to fool the rats so that they had to learn to go in the third door. If he relaxed any of his conditions, the rats could tell.

Now, from a scientific standpoint, that is an A-number-one experiment. That is the experiment that makes rat-running experiments sensible, because it uncovers the clues that the rat is really using—not what you think it’s using. And that is the experiment that tells exactly what conditions you have to use in order to be careful and control everything in an experiment with rat-running.

I looked up the subsequent history of this research. The next experiment, and the one after that, never referred to Mr. Young. They never used any of his criteria of putting the corridor on sand, or being very careful. They just went right on running the rats in the same old way, and paid no attention to the great discoveries of Mr. Young, and his papers are not referred to, because he didn’t discover anything about the rats. In fact, he discovered all the things you have to do to discover something about rats. But not paying attention to experiments like that is a characteristic example of cargo cult science.

That last part, where Feynman says, “In fact, he discovered all the things you have to do to discover something about rats,” really gets me.

I’m compelled to admit that, in a way, I’m dragging you into my own personal sinkhole now. I’m telling you something that I believe to be a truth and trying to convince you to believe it. If you’re starting to be convinced that I’m onto something, then I’ve slowly softened the ground around your feet. You might notice that you’re sliding down just a bit.

Don’t worry. There’s not an ad for Miracle Mineral Supplement at the end of this piece.
