Elements of Information Theory

ISBN-10: 0471241954

ISBN-13: 9780471241959

Edition: 2nd, 2006 (Revised)

Authors: Thomas M. Cover, Joy A. Thomas

This item qualifies for FREE shipping.
30-day, 100% satisfaction guarantee!

Rental notice: supplementary materials (access codes, CDs, etc.) are not guaranteed with rental orders.

Description:

Elements of Information Theory, Second Edition, covers the standard topics of information theory, such as entropy, data compression, channel capacity, rate distortion, multi-user theory, and hypothesis testing. It presents applications to communications, statistics, complexity theory, and investment. Chapters 1-9 cover the asymptotic equipartition property, data compression, and channel capacity, culminating in the capacity of the Gaussian channel. Chapters 10-17 include rate distortion, the method of types, Kolmogorov complexity, network information theory, universal source coding, and portfolio theory. The first edition of this book is the most successful book on Information Theory on the…

Book details

List price: $132.95
Edition: 2nd
Copyright year: 2006
Publisher: John Wiley & Sons, Incorporated
Publication date: 7/18/2006
Binding: Hardcover
Pages: 784
Size: 6.40" wide x 9.50" long x 1.62" tall
Weight: 2.948 lbs
Language: English

THOMAS M. COVER, PhD, is Professor in the departments of electrical engineering and statistics at Stanford University. A recipient of the 1991 IEEE Claude E. Shannon Award, Dr. Cover is a past president of the IEEE Information Theory Society, a Fellow of the IEEE and the Institute of Mathematical Statistics, and a member of the National Academy of Engineering and the American Academy of Arts and Sciences. He has authored more than 100 technical papers and is coeditor of Open Problems in Communication and Computation.

JOY A. THOMAS, PhD, is the Chief Scientist at Stratify, Inc., a Silicon Valley start-up specializing in organizing unstructured information. After receiving his PhD at Stanford,…

Preface to the Second Edition
Preface to the First Edition
Acknowledgments for the Second Edition
Acknowledgments for the First Edition
Introduction and Preview
Preview of the Book
Entropy, Relative Entropy, and Mutual Information
Entropy
Joint Entropy and Conditional Entropy
Relative Entropy and Mutual Information
Relationship Between Entropy and Mutual Information
Chain Rules for Entropy, Relative Entropy, and Mutual Information
Jensen's Inequality and Its Consequences
Log Sum Inequality and Its Applications
Data-Processing Inequality
Sufficient Statistics
Fano's Inequality
Summary
Problems
Historical Notes
Asymptotic Equipartition Property
Asymptotic Equipartition Property Theorem
Consequences of the AEP: Data Compression
High-Probability Sets and the Typical Set
Summary
Problems
Historical Notes
Entropy Rates of a Stochastic Process
Markov Chains
Entropy Rate
Example: Entropy Rate of a Random Walk on a Weighted Graph
Second Law of Thermodynamics
Functions of Markov Chains
Summary
Problems
Historical Notes
Data Compression
Examples of Codes
Kraft Inequality
Optimal Codes
Bounds on the Optimal Code Length
Kraft Inequality for Uniquely Decodable Codes
Huffman Codes
Some Comments on Huffman Codes
Optimality of Huffman Codes
Shannon-Fano-Elias Coding
Competitive Optimality of the Shannon Code
Generation of Discrete Distributions from Fair Coins
Summary
Problems
Historical Notes
Gambling and Data Compression
The Horse Race
Gambling and Side Information
Dependent Horse Races and Entropy Rate
The Entropy of English
Data Compression and Gambling
Gambling Estimate of the Entropy of English
Summary
Problems
Historical Notes
Channel Capacity
Examples of Channel Capacity
Noiseless Binary Channel
Noisy Channel with Nonoverlapping Outputs
Noisy Typewriter
Binary Symmetric Channel
Binary Erasure Channel
Symmetric Channels
Properties of Channel Capacity
Preview of the Channel Coding Theorem
Definitions
Jointly Typical Sequences
Channel Coding Theorem
Zero-Error Codes
Fano's Inequality and the Converse to the Coding Theorem
Equality in the Converse to the Channel Coding Theorem
Hamming Codes
Feedback Capacity
Source-Channel Separation Theorem
Summary
Problems
Historical Notes
Differential Entropy
Definitions
AEP for Continuous Random Variables
Relation of Differential Entropy to Discrete Entropy
Joint and Conditional Differential Entropy
Relative Entropy and Mutual Information
Properties of Differential Entropy, Relative Entropy, and Mutual Information
Summary
Problems
Historical Notes
Gaussian Channel
Gaussian Channel: Definitions
Converse to the Coding Theorem for Gaussian Channels
Bandlimited Channels
Parallel Gaussian Channels
Channels with Colored Gaussian Noise
Gaussian Channels with Feedback
Summary
Problems
Historical Notes
Rate Distortion Theory
Quantization
Definitions
Calculation of the Rate Distortion Function
Binary Source
Gaussian Source
Simultaneous Description of Independent Gaussian Random Variables
Converse to the Rate Distortion Theorem
Achievability of the Rate Distortion Function
Strongly Typical Sequences and Rate Distortion
Characterization of the Rate Distortion Function
Computation of Channel Capacity and the Rate Distortion Function
Summary
Problems
Historical Notes
Information Theory and Statistics
Method of Types
Law of Large Numbers
Universal Source Coding
Large Deviation Theory
Examples of Sanov's Theorem
Conditional Limit Theorem
Hypothesis Testing
Chernoff-Stein Lemma
Chernoff Information
Fisher Information and the Cramér-Rao Inequality
Summary
Problems
Historical Notes
Maximum Entropy
Maximum Entropy Distributions
Examples
Anomalous Maximum Entropy Problem
Spectrum Estimation
Entropy Rates of a Gaussian Process
Burg's Maximum Entropy Theorem
Summary
Problems
Historical Notes
Universal Source Coding
Universal Codes and Channel Capacity
Universal Coding for Binary Sequences
Arithmetic Coding
Lempel-Ziv Coding
Sliding Window Lempel-Ziv Algorithm
Tree-Structured Lempel-Ziv Algorithms
Optimality of Lempel-Ziv Algorithms
Sliding Window Lempel-Ziv Algorithms
Optimality of Tree-Structured Lempel-Ziv Compression
Summary
Problems
Historical Notes
Kolmogorov Complexity
Models of Computation
Kolmogorov Complexity: Definitions and Examples
Kolmogorov Complexity and Entropy
Kolmogorov Complexity of Integers
Algorithmically Random and Incompressible Sequences
Universal Probability
The Halting Problem and the Noncomputability of Kolmogorov Complexity
Ω
Universal Gambling
Occam's Razor
Kolmogorov Complexity and Universal Probability
Kolmogorov Sufficient Statistic
Minimum Description Length Principle
Summary
Problems
Historical Notes
Network Information Theory
Gaussian Multiple-User Channels
Single-User Gaussian Channel
Gaussian Multiple-Access Channel with m Users
Gaussian Broadcast Channel
Gaussian Relay Channel
Gaussian Interference Channel
Gaussian Two-Way Channel
Jointly Typical Sequences
Multiple-Access Channel
Achievability of the Capacity Region for the Multiple-Access Channel
Comments on the Capacity Region for the Multiple-Access Channel
Convexity of the Capacity Region of the Multiple-Access Channel
Converse for the Multiple-Access Channel
m-User Multiple-Access Channels
Gaussian Multiple-Access Channels
Encoding of Correlated Sources
Achievability of the Slepian-Wolf Theorem
Converse for the Slepian-Wolf Theorem
Slepian-Wolf Theorem for Many Sources
Interpretation of Slepian-Wolf Coding
Duality Between Slepian-Wolf Encoding and Multiple-Access Channels
Broadcast Channel
Definitions for a Broadcast Channel
Degraded Broadcast Channels
Capacity Region for the Degraded Broadcast Channel
Relay Channel
Source Coding with Side Information
Rate Distortion with Side Information
General Multiterminal Networks
Summary
Problems
Historical Notes
Information Theory and Portfolio Theory
The Stock Market: Some Definitions
Kuhn-Tucker Characterization of the Log-Optimal Portfolio
Asymptotic Optimality of the Log-Optimal Portfolio
Side Information and the Growth Rate
Investment in Stationary Markets
Competitive Optimality of the Log-Optimal Portfolio
Universal Portfolios
Finite-Horizon Universal Portfolios
Horizon-Free Universal Portfolios
Shannon-McMillan-Breiman Theorem (General AEP)
Summary
Problems
Historical Notes
Inequalities in Information Theory
Basic Inequalities of Information Theory
Differential Entropy
Bounds on Entropy and Relative Entropy
Inequalities for Types
Combinatorial Bounds on Entropy
Entropy Rates of Subsets
Entropy and Fisher Information
Entropy Power Inequality and Brunn-Minkowski Inequality
Inequalities for Determinants
Inequalities for Ratios of Determinants
Summary
Problems
Historical Notes
Bibliography
List of Symbols
Index