Speaker: **TOMASZ MASZCZYK**

2023-02-22 17:15

Shannon entropy was introduced as a statistical measure of information loss, but it has since appeared in other areas of mathematics as well. We plan to sketch its relations with polylogarithms and motives after Cathelineau, Dupont, Bloch, Goncharov, Elbaz-Vincent, and Gangl, a cohomological interpretation by Kontsevich, and the information cohomology after Baudot and Bennequin; in the latter approach, Shannon entropy is a one-cocycle. Next, we survey Faddeev's algebraic characterization theorem and the Fundamental Equation of Information Theory after Tverberg, Kendall, and Lee. Then we will sketch Gromov's program and comment on the categorical interpretation by Baez, Fritz, and Leinster. Finally, we plan to present another cohomological derivation of Shannon entropy, based on a new kind of Hochschild cohomology we construct for abstract convexity. The latter admits a cohomological interpretation of extensions of convex bodies by vector spaces, parallel to Hochschild extensions of associative algebras by square-zero ideals. Shannon entropy then arises from a two-cocycle whose cocycle condition is the Fundamental Equation of Information Theory.
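As a minimal numerical sketch of the Fundamental Equation of Information Theory in its standard form for the binary entropy function (the function names here are ours, not from the talk): writing H(p) = -p log p - (1-p) log(1-p), the equation states H(x) + (1-x) H(y/(1-x)) = H(y) + (1-y) H(x/(1-y)) for x, y in [0, 1) with x + y ≤ 1, both sides being the entropy of the three-point distribution (x, y, 1-x-y).

```python
import math

def h(p):
    """Binary Shannon entropy H(p) = -p log p - (1-p) log(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def feit_residual(x, y):
    """Residual of the Fundamental Equation of Information Theory:
    H(x) + (1-x) H(y/(1-x)) - H(y) - (1-y) H(x/(1-y)),
    for x, y in [0, 1) with x + y <= 1."""
    lhs = h(x) + (1 - x) * h(y / (1 - x))
    rhs = h(y) + (1 - y) * h(x / (1 - y))
    return lhs - rhs

# Both sides equal the entropy of the distribution (x, y, 1 - x - y),
# so the residual vanishes identically (up to floating-point error).
for x, y in [(0.2, 0.3), (0.5, 0.25), (0.1, 0.85)]:
    assert abs(feit_residual(x, y)) < 1e-12
```

This symmetric functional equation is what reappears in the talk as a two-cocycle condition.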
