At a Department of Defense news briefing on February 12, 2002, then-US Secretary of Defense Donald H. Rumsfeld, responding to a question about the lack of evidence linking Iraq to "weapons of mass destruction," declared:

As we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – there are things we do not know we don't know.

The quote baffled most people and was widely mocked in the media. It was even recast as a poem, line breaks and all, and anthologized in the "poetry" collection Pieces of Intelligence: The Existential Poetry of Donald Rumsfeld, compiled by Hart Seely:

The Unknown

As we know,
There are known knowns.
There are things we know we know.
We also know
There are known unknowns.
That is to say
We know there are some things
We do not know.
But there are also unknown unknowns,
The ones we don't know we don't know.

February 12, 2002, Department of Defense news briefing


Despite all the mockery, however, the "unknown unknown" is a powerful concept in risk analysis, military strategy, and the philosophy of knowledge, and Rumsfeld's seemingly nonsensical statement was in fact a fairly cogent explanation of it.

In any situation involving risk or uncertainty, "known knowns" (those things you know, and know that you know) are obviously easy to manage and account for. "Known unknowns" are slightly more dangerous but still manageable: if you know there is something important that you do not know, you can at least set about finding out what it is. By far the most dangerous are the "unknown unknowns": the things you do not know that you do not know, or have never encountered before, sometimes called "black swans" or "fat tails." The processes known collectively as "risk management" are fundamentally concerned with seeking out unknown unknowns and converting them into known unknowns, or at the very least building robust systems that can withstand the emergence of black-swan or fat-tail events.
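To make the taxonomy concrete, here is a minimal sketch in Python of how a risk register might classify items along the two axes the quote describes: whether we are aware a question exists at all, and whether we actually know the answer. The RiskItem class and its field names are hypothetical, invented purely for illustration; this is not a real risk-management tool.

```python
from dataclasses import dataclass

@dataclass
class RiskItem:
    description: str
    aware_of_it: bool      # do we know this question/risk exists at all?
    understand_it: bool    # do we actually know the answer or the impact?

def classify(item: RiskItem) -> str:
    """Place a risk item in one quadrant of the known/unknown grid."""
    if item.aware_of_it:
        return "known known" if item.understand_it else "known unknown"
    # The fourth quadrant ("unknown known": things we don't know we know)
    # is the one the Johari window adds to Rumsfeld's three categories.
    return "unknown known" if item.understand_it else "unknown unknown"

# Risk management, in this framing, is largely the act of moving items
# out of the last category simply by becoming aware that they exist.
risk = RiskItem("key supplier is insolvent", aware_of_it=False, understand_it=False)
print(classify(risk))    # -> unknown unknown
risk.aware_of_it = True
print(classify(risk))    # -> known unknown
```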

A simple example is the scholar or researcher who maintains a large library of books they have never read. Laypeople often wonder why anyone would buy so many books and never read them. But to the researcher, this is an important way of turning unknown unknowns into known unknowns: once you know a book exists and have it on your shelf, you can consult it at a moment's notice, which mitigates the risk of getting deep into a research project before realizing that an important source on the topic already exists.

Fields as diverse as military planning, space exploration, espionage, disaster response, cybersecurity, criminal investigation, economic forecasting, financial risk management, medical diagnosis, and scientific experimentation similarly rely on seeking out unknown unknowns and converting them into known unknowns.

Donald Rumsfeld himself credited NASA administrator William Graham with introducing him to the idea of the unknown unknown, but the concept had long been widespread in the US aerospace and national-security communities, and dates back at least to the "Johari window" developed by American psychologists Joseph Luft and Harrington Ingham in 1955.

However, the idea has likely been around in various forms for centuries. For example, the 14th-century Persian poet Ibn Yamin (1286-1367) wrote the following poem:


One who knows and knows that he knows...his horse of wisdom will reach the skies.
One who knows, but doesn't know that he knows...he is fast asleep, so you should wake him up!
One who doesn't know, but knows that he doesn't know...his limping mule will eventually get him home.
One who doesn't know and doesn't know that he doesn't know...he will be eternally lost in his hopeless oblivion!