Entropy as risk

The variation in prices for an asset, its volatility or risk, is conventionally measured by the standard deviation of prices, or more strictly the standard deviation of returns. Recently there has been some interest in an alternative measure of riskiness: entropy [1].

The standard deviation measures the magnitude of variation, since it is calculated as the square root of the average squared difference between each value and the mean of those values. Entropy instead measures the number and distribution of values, or 'states'.
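To make the contrast concrete, here is a small Python sketch with made-up numbers (the series and function names are illustrative, not from the article's data): two price series with the same standard deviation but different entropies, using the Shannon form defined later in this article.

```python
import math
from collections import Counter

def stdev(xs):
    """Population standard deviation: the magnitude of variation."""
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

def shannon(xs):
    """Shannon entropy of the empirical distribution of values ('states')."""
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

a = [10, 10, 12, 12]                          # two states, evenly spread
b = [11 - 2 ** 0.5, 11, 11, 11 + 2 ** 0.5]   # three states, same spread

print(stdev(a), stdev(b))      # both ≈ 1.0: identical volatility
print(shannon(a), shannon(b))  # ≈ 0.6931 vs ≈ 1.0397: different entropy
```

Both series have standard deviation 1, but the second spreads its trades over more states and so carries more entropy.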

Two entropy measures are often used: Shannon entropy (H1) and the Rényi entropy of order 2 (H2), also known as collision entropy. They are similar, which can be confusing; using entropy as a risk measure is novel, so there is no consensus yet on which is most appropriate. I will use both in this article for comparison. For those familiar with kdb+/q, the measures are defined as:

h1:{neg sum x*log x}    / Shannon entropy: -sum p*ln p
h2:{neg log sum x*x}    / Rényi entropy of order 2: -ln sum p*p

where the x's are probabilities. Let's work through some examples to explain how the entropy calculation works. Imagine being given some playing cards. The table below shows some possible combinations of cards and the associated entropy figures for the 'hand'.

| Hand | Shannon entropy H1 | Rényi entropy H2 | Remarks |
| --- | --- | --- | --- |
| 1 × ♠, 1 × ♣ | 0.6931 | 0.6931 | The simplest case: only two cards in the hand, one of each suit. The probability of choosing either suit is 0.5 |
| 1 × ♠, 1 × ♣, 1 × ♥, 1 × ♦ | 1.3863 | 1.3863 | Adding one card each of the other suits increases the entropy of the hand. There are now four suits or 'states', each of probability 0.25 |
| 6 × ♠, 1 × ♣, 1 × ♥, 1 × ♦ | 1.0027 | 0.7309 | Adding more spades lowers the entropy of the hand. There are still four states, but the three minor suits each have probability 0.11 while spades has probability 0.67. The H1 and H2 measures now differ |
| 13 × ♠, 1 × ♣, 1 × ♥, 1 × ♦ | 0.6886 | 0.3977 | Adding all the remaining spades lowers the entropy of the hand further |

In short, adding more states increases entropy, while increasing the likelihood of one state over the others decreases it.
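For readers without q to hand, the table's figures can be checked with an equivalent Python sketch (the function names mirror the q definitions above; `hand` is a helper of mine that turns card counts into probabilities):

```python
import math

def h1(probs):
    """Shannon entropy: neg sum x*log x in the q definition."""
    return -sum(p * math.log(p) for p in probs)

def h2(probs):
    """Rényi entropy of order 2: neg log sum x*x in the q definition."""
    return -math.log(sum(p * p for p in probs))

def hand(counts):
    """Turn per-suit card counts into probabilities."""
    n = sum(counts)
    return [c / n for c in counts]

# The four hands from the table above
print(round(h1(hand([1, 1])), 4), round(h2(hand([1, 1])), 4))                # 0.6931 0.6931
print(round(h1(hand([1, 1, 1, 1])), 4), round(h2(hand([1, 1, 1, 1])), 4))    # 1.3863 1.3863
print(round(h1(hand([6, 1, 1, 1])), 4), round(h2(hand([6, 1, 1, 1])), 4))    # 1.0027 0.7309
print(round(h1(hand([13, 1, 1, 1])), 4), round(h2(hand([13, 1, 1, 1])), 4))  # 0.6886 0.3977
```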

Now let's swap playing cards for prices. Looking at the HKEx trade data for August 29, 2014 [2], for each listing we can identify the 'states' as the traded price levels, with the probability of each state given by the relative number of trades at that price. We can then compute the entropy and plot it against the traditional risk measure, price volatility; see chart 1 below.
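As a sketch of that per-listing computation, here is a minimal Python version using toy traded prices (the data is invented for illustration, not the HKEx feed):

```python
import math
from collections import Counter

# Toy traded prices for one listing, one price per trade (illustrative only)
prices = [10.0, 10.0, 10.2, 10.2, 10.4, 10.2, 10.0, 10.6]

# States = distinct price levels; probability = share of trades at that price
n = len(prices)
probs = [c / n for c in Counter(prices).values()]

h1 = -sum(p * math.log(p) for p in probs)    # Shannon entropy
h2 = -math.log(sum(p * p for p in probs))    # Rényi entropy of order 2

# Traditional risk measure: standard deviation of the traded prices
mean = sum(prices) / n
vol = (sum((x - mean) ** 2 for x in prices) / n) ** 0.5

print(round(h1, 4), round(h2, 4), round(vol, 4))
```

Repeating this over every listing gives one (volatility, entropy) point per listing, which is what the charts below plot.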

Chart 1: Shannon entropy versus traded price volatility for HKEx listings on August 29, 2014. Listings with more than one standard deviation above the mean number of trades per price level are highlighted.

The range in trading is significant: trades per listing range from 1 to 16,625, and traded prices per listing range from 1 to 291. Across all listings the mean number of trades per price level is 20, with a standard deviation of 43. Chart 1 highlights the listings with more than the mean plus one standard deviation of trades per price level, revealing the most heavily traded listings. For comparison, chart 2 below plots the Rényi entropy for the same listings.

Chart 2: Rényi entropy versus traded price volatility.

The charts show that entropy adds to the picture of risk. For any given trade price volatility there is a significant range of entropy values, revealing the differences in how the listings have traded.

  1. See arXiv:1501.01155 on arxiv.org.
  2. Including only main board listings and automatically matched orders, excluding auctions.