Why you should not rely on weak signals to predict the future

Can weak signals reveal significant future events? The answer lies in the delicate balance between detection and interpretation, where our assumptions shape our understanding.

The idea behind weak signals is that an event is always announced in advance by information that allows us to anticipate it. If we can detect these signals and interpret them correctly, we can anticipate the event. For example, the police monitor fertilizer purchases because certain fertilizers contain ammonium nitrate, which can be used to make bombs. In another field, some cosmetics brands send product managers to trendy locations to spot future fashion trends.

The exercise is useful, but it has its limitations. First, weak signals can be used to deceive the enemy: Bin Laden was known to spread rumors of attacks in order to exhaust Western intelligence services. Being overly sensitive to weak signals means overreacting. Second, their usefulness quickly fades. Once terrorists know that the police are monitoring fertilizer purchases, which is bound to happen eventually, they will find other ways to carry out their attacks (with a simple Kalashnikov, for example). A game of cat and mouse ensues, in which the terrorists always have the initiative, and in which yesterday’s interesting signals become at best useless, at worst misleading (a drop in fertilizer purchases might suggest the attacks are over).

But the main limitation of weak signals lies elsewhere: in their interpretation. Two months before the attacks of September 11, 2001, an FBI agent in Arizona alerted his headquarters that several Middle Eastern men were training at a flight school and recommended a national investigation. Strangely, they showed no interest in learning how to take off or land. A classic example of a weak signal. And yet no one acted on it, and after the attack it sparked a major controversy about the alleged incompetence of the police and intelligence services. Yet the inaction can be explained. On the one hand, the owner of the flight school later explained that he had many wealthy clients who just wanted the thrill of taking the controls of their private jet for a few minutes mid-flight, without bothering to learn how to land or take off. On the other hand, an FBI agent’s job is to witness strange things all day long; because his time is limited, he has to choose which ones to pursue and which ones to let go. One strange thing follows another. The agent duly filed his report and gave it to his supervisor, who felt that there were more urgent and important things to deal with (drug trafficking, bank robberies, violence) than student pilots who did not want to learn how to take off or land.

The role of hypotheses

Of course, it is easy to criticize this decision after the fact, but beforehand, in the “fog of action,” when faced with dozens of potential cases involving murder, trafficking, violence, etc., it is understandable that the officer has to make a choice. The question is: what is the basis for those choices? American researcher Roberta Wohlstetter sheds interesting light on this question. In her groundbreaking work on the Japanese surprise attack on the American base at Pearl Harbor on December 7, 1941, she showed that the American failure was not due to a lack of attention to weak signals. In fact, the U.S. Navy had broken the Japanese Navy’s codes, so it had massive signals in the form of conversations between Japanese admirals. But it found the hypothesis of an attack on Pearl Harbor so absurd that it refused to consider it. A war-gaming exercise on the subject in the spring of 1941 was even rejected. Although this has fueled conspiracy theories ever since, a glance at a map shows just how crazy (or daring!) the Japanese plan was: Pearl Harbor lies nearly 4,000 miles (6,300 kilometers) from Japan, and launching an armada that far from its bases pushed the technology available to the Japanese at the time to the limits of feasibility. Objectively, it was madness. The U.S. Navy therefore based its refusal to consider such an attack on two strong assumptions that seemed quite reasonable at the time: first, the technical impossibility; and second, that it seemed suicidal for a small country like Japan to believe itself capable of attacking a country as powerful as the United States.

The lesson? Weak signals are sorted and interpreted on the basis of the assumptions we develop about the system under study. The Americans projected their own rationality onto the Japanese, when in fact the Japanese were completely unaware that they were awakening a giant. The surprise was not a problem of weak signals, but of assumptions. It is because their assumptions were wrong that the Americans were surprised. They had all the signals they needed.

The same goes for the 9/11 flight school. If the FBI had told its agents that it suspected an attack involving airliners, thus changing its working hypothesis, the officer who reported strange behavior at a flight school would surely have been listened to. But without a hypothesis, it is impossible to extract the relevant weak signals from the mass of information in which that officer drowns every day. And with big data spewing out ever more information, the problem only gets worse over time. We interpret the world on the basis of assumptions (but also of beliefs and values), and these guide our work.

Peter Drucker, one of the greatest writers on management, wrote: “Managers who make effective decisions know that we do not start with facts. We start with assumptions… It is impossible to start with facts. There is no fact unless we have a criterion of relevance.” The only way to distinguish useful signals from noise is through assumptions, and assumptions must be regularly clarified and re-examined in light of new facts. Without assumptions to sort with, all we have is a huge pile of data with which we do not know what to do, or worse, with which we can do whatever we want. In the end, despite the theoretical interest of the concept of the weak signal, its practical scope remains limited and its use risky, because the amount of noise preceding any event is now enormous.

🆕 📖 This article is an excerpt from my book “Welcome to Uncertainty.”

📬 If you enjoyed this article, don’t hesitate to subscribe to receive future articles via email (“I subscribe” in the upper right corner of the home page). You can also follow me on LinkedIn and Twitter.

🇫🇷 French version of this article here.
