Elements of Information Theory

ISBN-10: 0471062596

ISBN-13: 9780471062592

Edition: 99th 1991 (Revised)

List price: $110.50


Following a brief introduction and overview, early chapters cover the basic algebraic relationships of entropy, relative entropy, and mutual information; the AEP; entropy rates of stochastic processes; data compression; and the duality between data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information theory and statistics, rate distortion theory, and network information theory. The final two chapters examine the stock market and inequalities in information theory. In many cases the authors describe the properties of the solutions before presenting the problems.
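The core quantities the early chapters develop, entropy and mutual information, can be illustrated with a short, self-contained sketch. The joint distribution below is a made-up toy example (not taken from the book), used only to check the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical joint distribution of a binary pair (X, Y).
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

# Marginal distributions of X and Y.
px = [sum(p for (x, _), p in joint.items() if x == xv) for xv in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == yv) for yv in (0, 1)]

h_x = entropy(px)                    # H(X)  ≈ 0.8113 bits
h_y = entropy(py)                    # H(Y)  ≈ 0.9544 bits
h_xy = entropy(list(joint.values())) # H(X,Y) = 1.75 bits

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = h_x + h_y - h_xy                # ≈ 0.0157 bits
print(round(h_x, 4), round(h_y, 4), round(mi, 4))
```

Mutual information is always nonnegative and never exceeds either marginal entropy, which the toy numbers above bear out.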

Book details

Edition: 99th
Copyright year: 1991
Publisher: John Wiley & Sons, Incorporated
Publication date: 8/26/1991
Binding: Hardcover
Pages: 576
Size: 6.25" wide x 9.50" long x 1.25" tall
Weight: 2.090 lbs
Language: English

THOMAS M. COVER, PHD, is Professor in the departments of electrical engineering and statistics at Stanford University. A recipient of the 1991 IEEE Claude E. Shannon Award, Dr. Cover is a past president of the IEEE Information Theory Society, a Fellow of the IEEE and the Institute of Mathematical Statistics, and a member of the National Academy of Engineering and the American Academy of Arts and Sciences. He has authored more than 100 technical papers and is coeditor of Open Problems in Communication and Computation.

JOY A. THOMAS, PHD, is the Chief Scientist at Stratify, Inc., a Silicon Valley start-up specializing in organizing unstructured information. After receiving his PhD at Stanford, Dr. Thomas spent more than nine years at the IBM T. J. Watson Research Center in Yorktown Heights, New York. Dr. Thomas is a recipient of the IEEE Charles LeGeyt Fortescue Fellowship.

Preface to the Second Edition
Preface to the First Edition
Acknowledgments for the Second Edition
Acknowledgments for the First Edition
Introduction and Preview
Preview of the Book
Entropy, Relative Entropy, and Mutual Information
Joint Entropy and Conditional Entropy
Relative Entropy and Mutual Information
Relationship Between Entropy and Mutual Information
Chain Rules for Entropy, Relative Entropy, and Mutual Information
Jensen's Inequality and Its Consequences
Log Sum Inequality and Its Applications
Data-Processing Inequality
Sufficient Statistics
Fano's Inequality
Historical Notes
Asymptotic Equipartition Property
Asymptotic Equipartition Property Theorem
Consequences of the AEP: Data Compression
High-Probability Sets and the Typical Set
Historical Notes
Entropy Rates of a Stochastic Process
Markov Chains
Entropy Rate
Example: Entropy Rate of a Random Walk on a Weighted Graph
Second Law of Thermodynamics
Functions of Markov Chains
Historical Notes
Data Compression
Examples of Codes
Kraft Inequality
Optimal Codes
Bounds on the Optimal Code Length
Kraft Inequality for Uniquely Decodable Codes
Huffman Codes
Some Comments on Huffman Codes
Optimality of Huffman Codes
Shannon-Fano-Elias Coding
Competitive Optimality of the Shannon Code
Generation of Discrete Distributions from Fair Coins
Historical Notes
Gambling and Data Compression
The Horse Race
Gambling and Side Information
Dependent Horse Races and Entropy Rate
The Entropy of English
Data Compression and Gambling
Gambling Estimate of the Entropy of English
Historical Notes
Channel Capacity
Examples of Channel Capacity
Symmetric Channels
Properties of Channel Capacity
Preview of the Channel Coding Theorem
Jointly Typical Sequences
Channel Coding Theorem
Zero-Error Codes
Fano's Inequality and the Converse to the Coding Theorem
Equality in the Converse to the Channel Coding Theorem
Hamming Codes
Feedback Capacity
Source-Channel Separation Theorem
Historical Notes
Differential Entropy
AEP for Continuous Random Variables
Relation of Differential Entropy to Discrete Entropy
Joint and Conditional Differential Entropy
Relative Entropy and Mutual Information
Properties of Differential Entropy, Relative Entropy, and Mutual Information
Historical Notes
Gaussian Channel
Gaussian Channel: Definitions
Converse to the Coding Theorem for Gaussian Channels
Bandlimited Channels
Parallel Gaussian Channels
Channels with Colored Gaussian Noise
Gaussian Channels with Feedback
Historical Notes
Rate Distortion Theory
Calculation of the Rate Distortion Function
Converse to the Rate Distortion Theorem
Achievability of the Rate Distortion Function
Strongly Typical Sequences and Rate Distortion
Characterization of the Rate Distortion Function
Computation of Channel Capacity and the Rate Distortion Function
Historical Notes
Information Theory and Statistics
Method of Types
Law of Large Numbers
Universal Source Coding
Large Deviation Theory
Examples of Sanov's Theorem
Conditional Limit Theorem
Hypothesis Testing
Chernoff-Stein Lemma
Chernoff Information
Fisher Information and the Cramér-Rao Inequality
Historical Notes
Maximum Entropy
Maximum Entropy Distributions
Anomalous Maximum Entropy Problem
Spectrum Estimation
Entropy Rates of a Gaussian Process
Burg's Maximum Entropy Theorem
Historical Notes
Universal Source Coding
Universal Codes and Channel Capacity
Universal Coding for Binary Sequences
Arithmetic Coding
Lempel-Ziv Coding
Optimality of Lempel-Ziv Algorithms
Historical Notes
Kolmogorov Complexity
Models of Computation
Kolmogorov Complexity: Definitions and Examples
Kolmogorov Complexity and Entropy
Kolmogorov Complexity of Integers
Algorithmically Random and Incompressible Sequences
Universal Probability
The Halting Problem and the Noncomputability of Kolmogorov Complexity
Universal Gambling
Occam's Razor
Kolmogorov Complexity and Universal Probability
Kolmogorov Sufficient Statistic
Minimum Description Length Principle
Historical Notes
Network Information Theory
Gaussian Multiple-User Channels
Jointly Typical Sequences
Multiple-Access Channel
Encoding of Correlated Sources
Duality Between Slepian-Wolf Encoding and Multiple-Access Channels
Broadcast Channel
Relay Channel
Source Coding with Side Information
Rate Distortion with Side Information
General Multiterminal Networks
Historical Notes
Information Theory and Portfolio Theory
The Stock Market: Some Definitions
Kuhn-Tucker Characterization of the Log-Optimal Portfolio
Asymptotic Optimality of the Log-Optimal Portfolio
Side Information and the Growth Rate
Investment in Stationary Markets
Competitive Optimality of the Log-Optimal Portfolio
Universal Portfolios
Shannon-McMillan-Breiman Theorem
Historical Notes
Inequalities in Information Theory
Basic Inequalities of Information Theory
Differential Entropy
Bounds on Entropy and Relative Entropy
Inequalities for Types
Combinatorial Bounds on Entropy
Entropy Rates of Subsets
Entropy and Fisher Information
Entropy Power Inequality and Brunn-Minkowski Inequality
Inequalities for Determinants
Inequalities for Ratios of Determinants
Historical Notes
List of Symbols