*Grails Finance 1.4*

Entropy is the central entity in thermodynamics. Its equivalent in finance would be information entropy, yet you don’t hear much about it. One could argue that entropy is a measure of risk, just like volatility. So what are the advantages of measuring entropy?

## Open source

Somebody asked for the code of Grails Finance. If you are interested, I have shared it on Google Code. This software comes with no warranty or support. I hope that we are clear on that point :).

## Entropy

I calculated the information entropy, the entropy of a normal distribution, and the natural log of the standard deviation. You can see the data and plots in the spreadsheet below.

Entropy might be a good measure of uncertainty, but it depends on how the probability distribution is determined. My method uses histograms, which are sensitive to bin size, so the entropy calculation is not very accurate. It might have been best to stick to the square-root choice; I did not deviate from that much anyway.
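To illustrate the bin-size sensitivity, here is a minimal sketch. The synthetic normal “returns” are an assumption for demonstration, not the original data; the same series gives noticeably different entropy estimates depending on the bin count, and the “square-root choice” sits in between:

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, 500)  # hypothetical return series

def hist_entropy(arr, nbins):
    # Estimate the distribution with a histogram, then take the
    # discrete Shannon entropy of the bin probabilities.
    p, edges = np.histogram(arr, bins=nbins, density=True)
    p = p * np.diff(edges)   # densities -> probabilities
    p = p[p > 0]             # empty bins contribute 0 (lim p*ln p = 0)
    return -np.sum(p * np.log(p))

sqrt_bins = int(np.sqrt(len(returns)))  # the "square-root choice": ~22 bins here
for nbins in (10, sqrt_bins, 50, 200):
    print(nbins, hist_entropy(returns, nbins))
```

The estimates drift upward with finer binning, which is exactly why comparing entropies computed with different bin counts is misleading.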

## Python excursion in intraday land

I am really impressed by NumPy and MatPlotLib. So let’s have a look at the intraday data of the IBM stock from 29 September till today. Specifically, I investigated entropy and the claim that the Fisher transform, a.k.a. arctanh, makes distributions more “normal”. I also ran into some problems with the “eggs” I downloaded. This website helped me fix the problem.
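As an aside, the Fisher transform and arctanh really are the same function, which a quick standalone check (not part of the original script) confirms:

```python
import numpy as np

# The Fisher transform, as usually written in technical-analysis texts,
# is 0.5 * ln((1 + x) / (1 - x)); for |x| < 1 this is exactly arctanh(x).
x = np.linspace(-0.99, 0.99, 199)
fisher = 0.5 * np.log((1.0 + x) / (1.0 - x))

print(np.max(np.abs(fisher - np.arctanh(x))))  # difference is numerical noise
```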

I calculated and plotted the “rolling” entropies of the Volume Price Trend and Money Flow indicators. The code below should make it clearer.

```python
#!/usr/bin/env python
from numpy import *
from pylab import *

o, h, l, c, v = loadtxt('IBM_1.csv', delimiter=',',
                        usecols=(3, 4, 5, 6, 7), unpack=True)

nbins = 30

def entropy(arr, nbins):
    """Rolling histogram-based (information) entropy of arr."""
    H = []
    for i in range(nbins, len(arr)):
        # Note: the original had histogram(c[0:i], ...) here, which ignored
        # the function argument; arr[0:i] is what was intended. density=
        # replaces the normed= keyword removed from newer NumPy versions.
        p, bins = histogram(arr[0:i], bins=nbins, density=True)
        p = clip(p, 1e-9, 1)
        ent = -1 * p * log(p)
        H.append(sum(ent))
    return H

def normalEntropy(arr, nbins):
    """Rolling differential entropy of a normal distribution with the
    sample's standard deviation: 0.5 * ln(2*pi*e*sigma^2)."""
    ne = []
    for i in range(nbins, len(arr)):
        stddev = std(arr[0:i])
        ne.append(0.5 * log(2 * pi * e * stddev * stddev))
    return ne

# Volume Price Trend
vec = v[1:] * diff(c) / c[1:]
vec = vec / max(v)
H = apply_along_axis(entropy, 0, vec, nbins)
ne = apply_along_axis(normalEntropy, 0, vec, nbins)
t = arange(nbins, len(vec), 1)
subplot(421)
title('Volume Price Trend Entropy and Normal Entropy')
ylabel('Entropy')
plot(t, H)
plot(t, -1 * ne)

# Money Flow
vec = v[1:] * diff(h + l) / (h[1:] + l[1:])
vec = vec / max(v)
H = apply_along_axis(entropy, 0, vec, nbins)
ne = apply_along_axis(normalEntropy, 0, vec, nbins)
t = arange(nbins, len(vec), 1)
subplot(422)
title('Money Flow Entropy and Normal Entropy')
ylabel('Entropy')
plot(t, H)
plot(t, -1 * ne)

subplot(425)
title('Close price')
plot(t, c[nbins + 1:])
subplot(427)
title('Volume')
plot(t, v[nbins + 1:])

# Fisher-transformed (arctanh) Volume Price Trend
vec = v[1:] * diff(c) / c[1:]
vec = arctanh(vec / max(v))
H = apply_along_axis(entropy, 0, vec, nbins)
ne = apply_along_axis(normalEntropy, 0, vec, nbins)
t = arange(nbins, len(vec), 1)
subplot(423)
title('Fisherized Volume Price Trend Entropy and Normal Entropy')
ylabel('Entropy')
plot(t, H)
plot(t, -1 * ne)

# Fisher-transformed (arctanh) Money Flow
vec = v[1:] * diff(h + l) / (h[1:] + l[1:])
vec = arctanh(vec / max(v))
H = apply_along_axis(entropy, 0, vec, nbins)
ne = apply_along_axis(normalEntropy, 0, vec, nbins)
t = arange(nbins, len(vec), 1)
subplot(424)
title('Fisherized Money Flow Entropy and Normal Entropy')
ylabel('Entropy')
plot(t, H)
plot(t, -1 * ne)

subplot(426)
title('Close price')
plot(t, c[nbins + 1:])
subplot(428)
title('Volume')
plot(t, v[nbins + 1:])

show()
```

As you can see in the plot, the Fisher transform doesn’t make any noticeable difference.

I did the same for the diffs/deltas of the entropies and close price, but not for the relative change, because that produced a division-by-zero error. So that’s left as an exercise for the reader :). Unfortunately, neither gives an indication of a trend.
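For anyone taking up the exercise, one way to guard the relative change against zero denominators could look like the sketch below. The entropy values are hypothetical placeholders, not the computed series:

```python
import numpy as np

H = np.array([3.1, 3.0, 0.0, 2.9, 3.2])  # hypothetical rolling-entropy values

dH = np.diff(H)      # plain deltas are always safe
prev = H[:-1]

# Relative change, with zero denominators mapped to NaN instead of crashing.
with np.errstate(divide='ignore', invalid='ignore'):
    rel = dH / prev
rel[~np.isfinite(rel)] = np.nan

print(rel)  # the step dividing by 0.0 comes out as NaN
```

Plotting libraries simply skip NaN points, so the guarded series can be drawn alongside the plain deltas.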

## Random links of interest

- Compressed pointers – supposedly improve performance; how about obfuscation?
- Investor calendar
- TA lib
- Qtstalker
- Active Quant

## Recommended Dutch movie

A colleague recommended the Dutch movie Win/Win (maybe with subtitles). When I read the plot summary I almost fell off my chair.

> Ivan is a true number cruncher and ‘surfs the waves of the stock market’ like a natural trader. He rakes in big profits for the bank. But all is not well. The new job gives Ivan sleepless nights. As Ivan rapidly becomes the most successful trader in town, he feels increasingly alienated from himself and the world around him. In spite of his unprecedented success Ivan has to get out. Before it’s too late…

How does the normal entropy differ from informational entropy? How is it calculated in this case?
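A short answer, going by the script above: the informational entropy is the discrete Shannon entropy of the histogram estimate of the distribution, while the “normal entropy” is the differential entropy of a Gaussian with the sample’s standard deviation, exactly as the `normalEntropy` function computes it:

```latex
H_{\text{info}} = -\sum_{i} p_i \ln p_i
\qquad
H_{\text{normal}} = \tfrac{1}{2} \ln\!\left(2 \pi e \sigma^2\right)
```

Since $\tfrac{1}{2}\ln(2\pi e \sigma^2) = \ln\sigma + \tfrac{1}{2}\ln(2\pi e)$, the normal entropy is just the log of the standard deviation shifted by a constant, which is why it was plotted next to $\ln(\text{std})$ earlier.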