
entropy

Notes and code related to algorithmic information theory. Under construction.

Thus far, this repository contains:

  • Notes from EE376A: Information Theory, taught by Prof. Tsachy Weissman.
  • "The principle of maximum entropy," an article I wrote as part of the Stanford math department's Directed Reading Program, advised by Yuval Wigderson.
  • "Information-theoretic trust," a research note on how information-theoretic tools can be used to quantify trust in AI systems.

In progress

  • "The MacKay compendium" - a compilation of notes and solutions from MacKay's seminal textbook.
  • entropy in Haskell: a set of types and functions for computing information-theoretic quantities from probability distributions.
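
As a rough illustration of the Haskell idea, a distribution type and an entropy function might look like the following. This is a minimal sketch under assumed names (`Dist`, `entropy`), not the repository's actual API:

```haskell
import Data.List (foldl')

-- Illustrative sketch: a discrete distribution as (outcome, probability) pairs.
-- The probabilities are assumed to be nonnegative and sum to 1.
newtype Dist a = Dist { unDist :: [(a, Double)] }

-- Shannon entropy in bits: H(X) = -sum_x p(x) log2 p(x),
-- with the convention 0 log 0 = 0 (zero-probability outcomes are skipped).
entropy :: Dist a -> Double
entropy (Dist xs) =
  negate (foldl' step 0 xs)
  where
    step acc (_, p)
      | p > 0     = acc + p * logBase 2 p
      | otherwise = acc

main :: IO ()
main = print (entropy (Dist [("heads", 0.5), ("tails", 0.5)]))  -- a fair coin carries 1 bit
```

A biased coin, e.g. `Dist [("heads", 0.9), ("tails", 0.1)]`, gives an entropy below 1 bit, matching the intuition that more predictable sources carry less information.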