
Experimental project that collects randomness from dice rolls and estimates its entropy.


APonce911/secure-rolls


SecureRolls

Description

This is an experimental project for educational purposes. It is a small script that collects randomness from dice rolls and estimates its entropy.

Entropy

Entropy is a measure of how unpredictable your random source is.
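Concretely, for a source whose outcomes occur with probabilities p_i, the Shannon entropy is -Σ p_i log2(p_i) bits per symbol. A small Ruby sketch of that estimate from observed rolls (the sampling here is simulated, not from a physical die):

```ruby
# Shannon entropy (bits per symbol) estimated from observed outcomes.
# For a fair six-sided die this approaches log2(6) ≈ 2.585 bits per roll.
def shannon_entropy(outcomes)
  counts = outcomes.tally
  total = outcomes.length.to_f
  counts.values.sum do |count|
    p = count / total
    -p * Math.log2(p)
  end
end

fair_rolls = Array.new(100_000) { rand(1..6) }
puts shannon_entropy(fair_rolls) # close to 2.585 for a fair die
```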

Why

I started this project to explore generating entropy from dice rolls at home. My goal is to produce a cryptographically secure amount of entropy — at least 128 bits — from dice rolls. In this script I assume a heavily biased die to test how much entropy can be extracted for different numbers of rolls and probability distributions.
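As a sketch of the bit accounting for a biased die (the bias probabilities below are hypothetical, chosen only for illustration):

```ruby
# Conservative bit budget for a biased die (hypothetical bias, not measured).
# Min-entropy per roll = -log2(p_max): the worst-case measure cryptography needs,
# always at or below the Shannon estimate.
probs = { 1 => 0.30, 2 => 0.20, 3 => 0.15, 4 => 0.15, 5 => 0.10, 6 => 0.10 }
min_entropy_per_roll = -Math.log2(probs.values.max)       # ≈ 1.74 bits per roll
rolls_needed = (128 / min_entropy_per_roll).ceil
puts "#{rolls_needed} rolls for 128 bits of min-entropy"  # 74 rolls for this bias
```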

Ultimately, my goal is to validate a methodology to generate the entropy needed to create Bitcoin wallets securely at home without relying solely on the system PRNG (pseudo-random number generator) or CSPRNG (cryptographically secure pseudo-random number generator).

Requirements

You will need Ruby installed. I recommend using rbenv.

Run

ruby generate_data.rb
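For illustration only — this is not the repository's actual generate_data.rb, just a minimal sketch of the same idea: turn d6 rolls into unbiased bits via rejection sampling (keep rolls 1–4, which carry two fair bits each; discard 5 and 6) and write the packed bytes to raw_entropy.bin:

```ruby
# Hypothetical stand-in for generate_data.rb (the real script may differ).
rolls = Array.new(2_000) { rand(1..6) }   # stand-in for physical dice rolls

# Rejection sampling: rolls 1..4 map to the two bits of (roll - 1).
bits = rolls.select { |r| r <= 4 }
            .flat_map { |r| [(r - 1) >> 1, (r - 1) & 1] }

# Pack complete groups of 8 bits into bytes; drop any trailing partial byte.
bytes = bits.each_slice(8).select { |s| s.size == 8 }
            .map { |s| s.inject(0) { |acc, b| (acc << 1) | b } }

File.binwrite("raw_entropy.bin", bytes.pack("C*"))
```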

Test

sudo apt-get install ent
sudo apt-get install dieharder

ent raw_entropy.bin
dieharder -a -g 201 -f raw_entropy.bin

Quick guide to reading ent results

  • Entropy (bits per byte): range 0–8. Closer to 8 is better. Multiply by file size to get total bits (bits/byte × bytes). This is a Shannon-style estimate — useful but optimistic for cryptographic use.
  • Compression estimate: shows how much the file could be reduced. Large reduction (e.g., 80%+) means a lot of predictable structure — not good for cryptographic randomness.
  • Chi-square: tests whether byte frequencies match what we'd expect from truly random data. Very large values (and tiny p-values like <0.01%) indicate clear bias.
  • Arithmetic mean: should be near 127.5 for uniformly random bytes. Large deviation means bytes lean toward certain values.
  • Monte Carlo Pi: a rough uniformity test. Values far from 3.14159 mean non-uniform distribution.
  • Serial correlation: should be close to 0. Values far from 0 mean bytes depend on previous bytes (not independent).
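Two of these metrics are easy to recompute by hand as a sanity check (a minimal sketch; ent itself remains the reference implementation):

```ruby
# Recompute two ent metrics: entropy (bits per byte) and arithmetic mean.
def ent_summary(path)
  bytes = File.binread(path).bytes
  counts = bytes.tally
  n = bytes.length.to_f
  bits_per_byte = counts.values.sum { |c| p = c / n; -p * Math.log2(p) }
  { bits_per_byte: bits_per_byte,     # 0..8, closer to 8 is better
    total_bits:    bits_per_byte * n, # bits/byte × file size in bytes
    mean:          bytes.sum / n }    # ideal: near 127.5
end
```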

How to judge validity (simple rules):

  • Small sample sizes (hundreds of bytes) are unreliable — collect much more data (thousands of bytes to megabytes) before trusting results.
  • Passing ent tests does not guarantee cryptographic strength. ent shows statistical issues but does not prove min-entropy (the conservative measure cryptography needs).
  • If you see low entropy, large compression potential, high chi-square, mean far from 127.5, big Monte Carlo error, or strong serial correlation → your source is biased or dependent and not ready for seeding wallets.
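A crude version of such a conservative estimate is the most-common-value bound, -log2(p_max) — a simplified take on the idea behind NIST SP 800-90B's MCV estimator (the real estimator also adds an upper confidence bound on p_max):

```ruby
# Most-common-value min-entropy estimate per symbol: -log2(p_max).
# Simplified sketch; lacks the confidence interval of NIST SP 800-90B's version.
def min_entropy_per_symbol(samples)
  p_max = samples.tally.values.max / samples.length.to_f
  -Math.log2(p_max)
end

rolls = Array.new(10_000) { rand(1..6) }
puts min_entropy_per_symbol(rolls)  # at most log2(6) ≈ 2.585; lower if biased
```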

Next steps

  • Run tests using large datasets (KBs or MBs of data).
  • Use a conservative min-entropy estimator and assume the worst-case when counting usable bits.
  • Test vetted extractors (HMAC/HKDF-SHA256 or similar) to distill entropy — hashing helps extract but does not increase true entropy.
  • Test combining the raw results with CSPRNG output.
  • Validate with larger test suites (NIST STS, Dieharder).
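As a sketch of the extraction step, Ruby's OpenSSL bindings expose HKDF-SHA256 directly. The salt and info labels below are made up for illustration, and the placeholder fallback is only so the sketch runs without the Run step's output:

```ruby
require "openssl"

# Condition raw dice bytes into a 256-bit seed with HKDF-SHA256.
# This distills whatever entropy the input has; it cannot create more,
# so the input must already carry well over 128 bits of min-entropy.
raw = File.exist?("raw_entropy.bin") ? File.binread("raw_entropy.bin")
                                     : Random.new.bytes(256) # sketch-only fallback
seed = OpenSSL::KDF.hkdf(raw, salt: "SecureRolls-v1", info: "wallet-seed",
                         length: 32, hash: "SHA256")
puts seed.unpack1("H*")  # 64 hex characters = 256 bits
```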
