Basic Mathematics You Should Master
2017-08-17 21:22:40
1. Statistical distance
In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects. These objects can be two random variables, two probability distributions, or two samples; the distance can also be measured between an individual sample point and a population or a wider sample of points.
2. Pinsker's inequality
In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors.[1]
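Concretely, Pinsker's inequality states that the total variation distance satisfies δ(P, Q) ≤ √(D_KL(P‖Q)/2). A minimal numerical check in Python for two discrete distributions (the specific probability vectors below are arbitrary illustrative choices, not from the original text):

```python
import math

def tv_distance(p, q):
    # Total variation distance between discrete distributions:
    # half the L1 distance between the probability vectors.
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(P || Q) in nats;
    # assumes q_i > 0 wherever p_i > 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.4, 0.6]
q = [0.3, 0.7]
tv = tv_distance(p, q)
kl = kl_divergence(p, q)

# Pinsker's bound: TV(P, Q) <= sqrt(D_KL(P || Q) / 2)
print(tv, math.sqrt(kl / 2))
assert tv <= math.sqrt(kl / 2)
```

For these two Bernoulli-like vectors the bound is close to tight, which illustrates the "tight up to constant factors" remark above.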
3. Total variation distance of probability measures
4. σ-algebra
5. The definition of TV:
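The formula the colon above introduces appears to be missing; the standard definition of the total variation distance between two probability measures $P$ and $Q$ on a σ-algebra $\mathcal{F}$ is

```latex
\delta(P, Q) = \sup_{A \in \mathcal{F}} \left| P(A) - Q(A) \right|
```

For distributions on a countable set this supremum reduces to half the $L_1$ distance, $\delta(P, Q) = \tfrac{1}{2} \sum_{x} |P(x) - Q(x)|$, which is the form used in the numerical check above.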