Finding Angels in the Details: The Value of Information


Many of the most powerful inventions throughout human history, from language to the modern computer, were those that enabled people to better generate, capture, and consume data and information.  And at no time in our history have we captured more data than we do today.  McKinsey estimates that global data is growing at a rate of 40% per year.  The more data captured, the more opportunity for enhanced decision-making.  But in order for that to happen, data must be turned into information, and information must be turned into knowledge. 

Information is data that is aggregated to a level where it makes sense for decision support (usually in the shape of reports, tables, or lists).  Knowledge is information that has been analyzed and interpreted.  With the growth of “big data,” the race is on to convert this data to beneficial knowledge.  Almost all aspects of life are being “datafied,” or turned into data.

In our efforts to extract knowledge from data, we have to understand the value of information.  There is only so much information that can provide clear benefit.  As an extension of Zipf’s Law, each successive attempt to dig deeper into a data set will yield exponentially weaker meaning.  At some point, there is a cost/benefit relationship that needs to be considered when digging deeply into data.  For this reason, we value information – we want to know how important each piece of information will be.  And there are only three basic reasons why information ever has value to a business:

  1. Information reduces uncertainty about decisions that have economic consequences
  2. Information affects the behavior of others, which has economic consequences
  3. Information sometimes has its own market value

We will focus on the first two reasons in this post, as they directly influence decision-making inside an organization.  Given the why, we can now look at the how.  For this explanation, I am relying heavily on the (pretty much) amazing book How to Measure Anything by Douglas Hubbard.

If we’re trying to value information for the effect it has on the behavior of others, the value is exactly equal to the value of the difference in human behavior.  This is actually fairly logical.

If we’re trying to value information for its ability to reduce uncertainty in decision-making, there is a bit more to it.  We first need to understand what is called the “Expected Opportunity Loss” or “EOL” for a particular strategy.  The EOL is the chance of being wrong multiplied by the cost of being wrong (for each scenario in a given strategy).  For example, let’s say there is a project that is being reviewed for approval.  It can either be approved or rejected.  We place a cost on each possibility – approval being the actual cost of the project and rejection being the foregone gain of the project.  We then place a likelihood of each option happening.  With this information, we can calculate the EOL for each scenario in the project.
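The EOL calculation above can be sketched in a few lines.  The probabilities and dollar figures below are hypothetical, chosen so that the results match the approval/rejection example discussed next:

```python
def expected_opportunity_loss(prob_wrong, cost_if_wrong):
    """EOL = chance of being wrong x cost of being wrong."""
    return prob_wrong * cost_if_wrong

# Approving: we are wrong if the project fails; the cost is the project spend.
eol_approve = expected_opportunity_loss(prob_wrong=0.2, cost_if_wrong=10_000_000)

# Rejecting: we are wrong if the project would have succeeded;
# the cost is the foregone gain.
eol_reject = expected_opportunity_loss(prob_wrong=0.8, cost_if_wrong=30_000_000)

print(eol_approve)  # 2000000.0  ($2MM)
print(eol_reject)   # 24000000.0 ($24MM)
```

Note that the two scenarios use different costs: the cost of a bad approval is what you spend, while the cost of a bad rejection is what you give up.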

Let’s say that (in the example above) the EOL for approval is $2 million and the EOL for rejection is $24 million.  We can now calculate what’s called the “Expected Value of Perfect Information” or “EVPI.”  This is simply the EOL before any information is introduced.  In this example, we have yet to introduce information, so the EVPI for project approval is $2MM and the EVPI for project rejection is $24MM.  Another way to think about EVPI is the gain received from eliminating uncertainty.
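Since no measurement has been made yet, EVPI in this example is just the prior EOL of each option (figures are the hypothetical ones from above):

```python
# Prior EOLs from the example above (hypothetical figures).
eol_approve = 2_000_000
eol_reject = 24_000_000

# With no information yet introduced, EVPI equals the prior EOL:
# the most you could gain by eliminating the uncertainty entirely.
evpi_approve = eol_approve  # $2MM
evpi_reject = eol_reject    # $24MM
```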

If we can only reduce but not eliminate uncertainty, we still want a way to value that reduction.  The value of this reduction (the difference between the EOL before a measurement and the EOL after a measurement) is called the “Expected Value of Information” or “EVI.”  Its purpose is to value the reduction in risk, which is the value of information.
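Continuing the hypothetical rejection scenario: suppose a measurement reduces the chance that rejection is the wrong call from 0.8 to 0.3 (an assumed figure for illustration).  EVI is then the drop in EOL:

```python
FOREGONE_GAIN = 30_000_000  # hypothetical cost of wrongly rejecting

# EOL before and after a measurement that reduces uncertainty.
eol_before = 0.8 * FOREGONE_GAIN  # $24MM
eol_after = 0.3 * FOREGONE_GAIN   # $9MM

# EVI = EOL before the measurement minus EOL after it.
evi = eol_before - eol_after
print(evi)  # 15000000.0 ($15MM)
```

The measurement itself is only worth doing if its cost is less than this EVI, which is the cost/benefit relationship mentioned earlier.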

For the vast majority of variables that we can examine, the current level of uncertainty is acceptable.  In other words, the vast majority of variables have an information value of zero.  But for those that do have information value, we devote measurement attention through concepts like EOL, EVPI, and EVI.  Oftentimes, the economic value of measuring a variable is inversely proportional to how much measurement attention it gets.  There can be great value in attending to previously unattended information.

As more and more data is compiled, there will be more opportunities for better measurement of information.  We often start the measurement process by reviewing the available historical information, and this is fine.  Even though we have no logical reason for believing the future will resemble the past, using historical data to measure (or infer) is probably an improvement on unaided human judgment.  And after all, we have to start somewhere.