| Copyright | Peter Robinson 2014 |
|---|---|
| License | LGPL |
| Maintainer | Peter Robinson <peter.robinson@monoid.at> |
| Stability | experimental |
| Portability | portable |
| Safe Haskell | None |
| Language | Haskell98 |
This module implements an n-sided die and provides sampling from a given
integer range. The algorithm uses rejection sampling and attempts to keep
the total number of random bits used as close as possible to the
information-theoretic lower bound of log2(n) = ln(n) / ln(2) bits
(for a range of size n).
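To make the idea concrete, here is a minimal sketch of rejection sampling for a single value in [0,n-1]. It is purely illustrative: it draws bits via System.Random rather than from the package's entropy conduit, and the library's actual implementation is more economical with the bits it consumes.

> import Control.Monad (replicateM)
> import System.Random (randomRIO)
>
> -- Illustrative only: draw ceil(log2 n) random bits, read them as a binary
> -- number, and reject (retry) whenever the result falls outside [0,n-1].
> rejectionSample :: Int -> IO Int
> rejectionSample n = do
>   let k = ceiling (logBase 2 (fromIntegral n) :: Double) :: Int
>   bits <- replicateM k (randomRIO (0, 1 :: Int))
>   let x = foldl (\acc b -> acc * 2 + b) 0 bits
>   if x < n then return x else rejectionSample n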
The implementation exposes streams of random values as conduits; see
diceRolls and randomRs. We also provide IO wrappers around these functions;
see getDiceRolls and getRandomRs. The conduit interface allows us to use a
specific entropy source, which has type Producer IO Word8.
Usage:
If we wanted to use the system-specific entropy source (systemEntropy) to
produce 10 rolls of a 6-sided die (i.e. range [0,5]), we could write:

> systemEntropy $$ diceRolls 6 =$= CL.take 10
[5,1,3,3,0,5,3,2,2,1]
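Spelled out as a complete program, the example above might look like the following sketch. It assumes a conduit version that still exports the $$ and =$= operators, and the module name in the import is an assumption; use whatever module of this package exports systemEntropy and diceRolls.

> import Data.Conduit (($$), (=$=))
> import qualified Data.Conduit.List as CL
> import Data.Conduit.DiceEntropy (systemEntropy, diceRolls)  -- module name assumed
>
> main :: IO ()
> main = do
>   -- Ten rolls of a 6-sided die, i.e. uniform samples from [0,5].
>   rolls <- systemEntropy $$ diceRolls 6 =$= CL.take 10
>   print rolls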
The function testPerformance yields the actual number of consumed random
bits:

> testPerformance 12 10000
Generated 10000 random samples in range [0,11]
Average number of bits used: 3.5904
Entropy lower bound on the number of required bits: 3.5849625007211565
Performance ratio: 1.0015167520658164
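The reported ratio can be reproduced by hand: the lower bound is log2 n bits per sample, and the ratio is the average number of bits actually used divided by that bound. A small sketch:

> -- Entropy lower bound, in bits, for one uniform sample from a range of size n.
> entropyLowerBound :: Int -> Double
> entropyLowerBound n = logBase 2 (fromIntegral n)
>
> -- Performance ratio in the sense of testPerformance: average bits consumed
> -- per sample divided by the lower bound.
> performanceRatio :: Double -> Int -> Double
> performanceRatio avgBitsUsed n = avgBitsUsed / entropyLowerBound n
>
> -- For the run above: performanceRatio 3.5904 12 ≈ 1.0015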
Documentation
diceRolls :: Int -> Conduit Word8 IO Int

Produces a stream of random integer values in the range [0,n-1], for a given
n <= 2^55. This conduit needs to be attached to an entropy source such as
systemEntropy.
randomRs produces a stream of random integer values within a given range.
This conduit needs to be attached to an entropy source such as systemEntropy.
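For example, assuming randomRs takes the inclusive bounds of the range as a pair (this argument shape is an assumption, not confirmed above), drawing ten samples from [10,20] could be sketched as:

> import Data.Conduit (($$), (=$=))
> import qualified Data.Conduit.List as CL
> import Data.Conduit.DiceEntropy (systemEntropy, randomRs)  -- module name assumed
>
> -- Ten uniform samples from the inclusive range [10,20]; the (low, high)
> -- pair argument is an assumption.
> tenInRange :: IO [Int]
> tenInRange = systemEntropy $$ randomRs (10, 20) =$= CL.take 10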
getDiceRolls generates k rolls of an n-sided die.

getRandomRs generates a list of random integer values in the specified range.
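A sketch of using the IO wrappers directly; the argument order (die size or range first, then the number of samples) is an assumption based on the descriptions above, as is the module name:

> import Data.Conduit.DiceEntropy (getDiceRolls, getRandomRs)  -- module name assumed
>
> main :: IO ()
> main = do
>   rolls <- getDiceRolls 6 10        -- 10 rolls of a 6-sided die (order assumed)
>   print rolls
>   xs <- getRandomRs (0, 100) 10     -- 10 samples from [0,100] (order assumed)
>   print xs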
testPerformance computes the performance of the algorithm in terms of the
number of random bits consumed versus the number of random values produced.
systemEntropy :: Producer IO Word8

A source of entropy. By default, we use the getEntropy function from the
entropy package.
Warning: When combining a source of entropy with other conduits, it is
important to ensure that there is no "backflow" of leftover values being
returned to the source from the conduit. This can be done by fusing the
conduit with the identity map, e.g.:

> myEntropySrc $$ Data.Conduit.List.map id =$= myConduit
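As a concrete sketch, a custom entropy source could be guarded like this; myEntropySrc and myConduit are placeholders (the source below simply cycles through all byte values and is not random):

> import Data.Conduit (($$), (=$=), Source, Conduit)
> import qualified Data.Conduit.List as CL
> import Data.Word (Word8)
>
> -- Placeholder entropy source: deterministic, for illustration only.
> myEntropySrc :: Source IO Word8
> myEntropySrc = CL.sourceList (cycle [minBound .. maxBound])
>
> -- Run a conduit against the source, fused with the identity map so that
> -- leftovers cannot flow back into the source, and keep k output values.
> runGuarded :: Conduit Word8 IO Int -> Int -> IO [Int]
> runGuarded myConduit k =
>   myEntropySrc $$ CL.map id =$= myConduit =$= CL.take k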