Entropy

The entropy of a distribution can be computed with the function entropy, which is also available under its traditional name H:
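As a reminder of what these functions compute, here is a minimal, library-free sketch of Shannon entropy in bits, H(X) = -Σ p(x) log2 p(x). The helper entropyBits is hypothetical and not part of axle; it is only meant to make the definition concrete.

```scala
// Hypothetical helper (not part of axle): Shannon entropy in bits
// of a finite distribution given as a Map from outcome to probability.
def entropyBits[A](dist: Map[A, Double]): Double =
  dist.collect { case (_, p) if p > 0d =>
    -p * math.log(p) / math.log(2d) // -p * log2(p), skipping zero-probability outcomes
  }.sum

// A fair coin carries exactly one bit of information:
entropyBits(Map("HEAD" -> 0.5, "TAIL" -> 0.5)) // 1.0
```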

Imports and implicits

import edu.uci.ics.jung.graph.DirectedSparseGraph

import cats.implicits._

import spire.math._
import spire.algebra._
import spire.implicits.DoubleAlgebra

import axle._
import axle.stats._
import axle.quanta.Information
import axle.algebra.modules.doubleRationalModule
import axle.jung.directedGraphJung
import axle.quanta.UnitOfMeasurement
import axle.game.Dice.die

implicit val informationConverter = Information.converterGraphK2[Double, DirectedSparseGraph]

Usage

Entropy of fair 6-sided die

val d6 = die(6)
// d6: axle.stats.Distribution0[Int,spire.math.Rational] = ConditionalProbabilityTable0(Map(5 -> 1/6, 1 -> 1/6, 6 -> 1/6, 2 -> 1/6, 3 -> 1/6, 4 -> 1/6),d6)

string(H(d6))
// res4: String = 2.5849625007211565 b
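For a uniform distribution over n outcomes, the entropy reduces to log2(n), so the value above is simply log2(6). This can be checked with plain Scala (a sanity check, not axle code):

```scala
// log2(6), the entropy in bits of a fair 6-sided die
val h = math.log(6d) / math.log(2d)
// h: Double = 2.584962500721156
```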

Entropy of fair and biased coins

val fairCoin = coin()
// fairCoin: axle.stats.Distribution[Symbol,spire.math.Rational] = ConditionalProbabilityTable0(Map('HEAD -> 1/2, 'TAIL -> 1/2),coin)

string(H(fairCoin))
// res5: String = 1.0 b

val biasedCoin = coin(Rational(7, 10))
// biasedCoin: axle.stats.Distribution[Symbol,spire.math.Rational] = ConditionalProbabilityTable0(Map('HEAD -> 7/10, 'TAIL -> 3/10),coin)

string(entropy(biasedCoin))
// res6: String = 0.8812908992306927 b
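The biased coin's entropy is the binary entropy function evaluated at p = 7/10; because the outcome is more predictable than a fair coin's, it carries less than one bit. A plain-Scala sanity check (the binaryEntropy helper is illustrative, not part of axle):

```scala
// Binary entropy in bits of a coin that lands heads with probability p
def binaryEntropy(p: Double): Double =
  -p * math.log(p) / math.log(2d) - (1 - p) * math.log(1 - p) / math.log(2d)

binaryEntropy(0.7) // about 0.8813 bits, less than the fair coin's 1 bit
```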

See also the Coin Entropy example.