Preface to the second edition
Preface to the first edition

Introduction
Chance and information
Mathematical models of chance phenomena
Mathematical structure and mathematical proof
Plan of this book

Combinatorics
Counting
Arrangements
Combinations
Multinomial coefficients
The gamma function
Exercises
Further reading

Sets and measures
The concept of a set
Set operations
Boolean algebras
Measures on Boolean algebras
Exercises
Further reading

Probability
The concept of probability
Probability in practice
Conditional probability
Independence
The interpretation of probability
The historical roots of probability
Exercises
Further reading

Discrete random variables
The concept of a random variable
Properties of random variables
Expectation and variance
Covariance and correlation
Independent random variables
I.I.D. random variables
Binomial and Poisson random variables
Geometric, negative binomial and hypergeometric random variables
Exercises
Further reading

Information and entropy
What is information?
Entropy
Joint and conditional entropies; mutual information
The maximum entropy principle
Entropy, physics and life
The uniqueness of entropy
Exercises
Further reading

Communication
Transmission of information
The channel capacity
Codes
Noiseless coding
Coding and transmission with noise - Shannon's theorem
Brief remarks about the history of information theory
Exercises
Further reading

Random variables with probability density functions
Random variables with continuous ranges
Probability density functions
Discretisation and integration
Laws of large numbers
Normal random variables
The central limit theorem
Entropy in the continuous case
Exercises
Further reading

Random vectors
Cartesian products
Boolean algebras and measures on products
Distributions of random vectors
Marginal distributions
Independence revisited
Conditional densities and conditional entropy
Mutual information and channel capacity
Exercises
Further reading

Markov chains and their entropy
Stochastic processes
Markov chains
The Chapman-Kolmogorov equations
Stationary processes
Invariant distributions and stationary Markov chains
Entropy rates for Markov chains
Exercises
Further reading

Exploring further

Proof by mathematical induction
Lagrange multipliers
Integration of exp(-x²/2)
Table of probabilities associated with the standard normal distribution
A rapid review of matrix algebra

Selected solutions
Index