Probability and Information: An Integrated Approach

ISBN-10: 052172788X

ISBN-13: 9780521727884

Edition: 2nd (2008)

Authors: David Applebaum

List price: $66.99

This new and updated textbook is an excellent way to introduce probability and information theory to students new to mathematics, computer science, engineering, statistics, economics, or business studies. Requiring only a knowledge of basic calculus, it begins by building a clear and systematic foundation for probability and information. Classic topics covered include discrete and continuous random variables, entropy and mutual information, maximum entropy methods, the central limit theorem, and the coding and transmission of information. New to this edition is material on Markov chains and their entropy. Examples and exercises are included to illustrate how to use the theory…

Book details

Edition: 2nd
Copyright year: 2008
Publisher: Cambridge University Press
Publication date: 8/14/2008
Binding: Paperback
Pages: 290
Size: 6.85" wide x 9.72" long x 0.55" tall
Weight: 1.298
Language: English

Preface to the second edition
Preface to the first edition
Chance and information
Mathematical models of chance phenomena
Mathematical structure and mathematical proof
Plan of this book
Multinomial coefficients
The gamma function
Further reading
Sets and measures
The concept of a set
Set operations
Boolean algebras
Measures on Boolean algebras
Further reading
The concept of probability
Probability in practice
Conditional probability
The interpretation of probability
The historical roots of probability
Further reading
Discrete random variables
The concept of a random variable
Properties of random variables
Expectation and variance
Covariance and correlation
Independent random variables
I.I.D. random variables
Binomial and Poisson random variables
Geometric, negative binomial and hypergeometric random variables
Further reading
Information and entropy
What is information?
Joint and conditional entropies; mutual information
The maximum entropy principle
Entropy, physics and life
The uniqueness of entropy
Further reading
Transmission of information
The channel capacity
Noiseless coding
Coding and transmission with noise - Shannon's theorem
Brief remarks about the history of information theory
Further reading
Random variables with probability density functions
Random variables with continuous ranges
Probability density functions
Discretisation and integration
Laws of large numbers
Normal random variables
The central limit theorem
Entropy in the continuous case
Further reading
Random vectors
Cartesian products
Boolean algebras and measures on products
Distributions of random vectors
Marginal distributions
Independence revisited
Conditional densities and conditional entropy
Mutual information and channel capacity
Further reading
Markov chains and their entropy
Stochastic processes
Markov chains
The Chapman-Kolmogorov equations
Stationary processes
Invariant distributions and stationary Markov chains
Entropy rates for Markov chains
Further reading
Exploring further
Proof by mathematical induction
Lagrange multipliers
Integration of exp(-x²/2)
Table of probabilities associated with the standard normal distribution
A rapid review of matrix algebra
Selected solutions