Elements of Information Theory

ISBN-10: 0471062596
ISBN-13: 9780471062592
Edition: 1st (1991)
List price: $110.50

Book details

List price: $110.50
Edition: 1st
Copyright year: 1991
Publisher: John Wiley & Sons, Incorporated
Publication date: 8/26/1991
Binding: Hardcover
Pages: 576
Size: 6.25" wide x 9.50" long x 1.25" tall
Weight: 2.09 lbs
Language: English

Following a brief introduction and overview, early chapters cover the basic algebraic relationships of entropy, relative entropy, and mutual information; the AEP; entropy rates of stochastic processes; data compression; and the duality of data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information theory and statistics, rate distortion theory, and network information theory. The final two chapters examine the stock market and inequalities in information theory. In many cases the authors describe the properties of the solutions before presenting the problems.
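For readers who want a feel for the quantities named above, here is a minimal Python sketch (an illustration for this listing, not code from the book): it computes entropy, relative entropy, and mutual information for a small made-up binary joint distribution, and checks the identity I(X;Y) = H(X) + H(Y) - H(X,Y). The distribution and helper names are assumptions chosen purely for the example.

    import math

    def entropy(p):
        # Shannon entropy H(p) = -sum_x p(x) log2 p(x), in bits.
        return -sum(px * math.log2(px) for px in p if px > 0)

    def relative_entropy(p, q):
        # Relative entropy (KL divergence) D(p || q) = sum_x p(x) log2(p(x)/q(x)).
        return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

    # A small joint distribution p(x, y) on two binary variables (made up for illustration).
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

    # Marginals p(x) and p(y).
    p_x = [sum(v for (x, _), v in joint.items() if x == i) for i in (0, 1)]
    p_y = [sum(v for (_, y), v in joint.items() if y == j) for j in (0, 1)]

    # Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
    mi = entropy(p_x) + entropy(p_y) - entropy(list(joint.values()))

    # Equivalently, I(X;Y) = D(p(x,y) || p(x)p(y)); the two agree.
    mi_kl = relative_entropy(list(joint.values()),
                             [p_x[x] * p_y[y] for (x, y) in joint])

    print(f"I(X;Y) = {mi:.4f} bits  (KL form: {mi_kl:.4f} bits)")

Running the sketch prints I(X;Y) = 0.2781 bits by both routes, which is exactly the kind of algebraic relationship the early chapters develop.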

THOMAS M. COVER, PHD, is Professor in the Departments of Electrical Engineering and Statistics at Stanford University. A recipient of the 1991 IEEE Claude E. Shannon Award, Dr. Cover is a past president of the IEEE Information Theory Society, a Fellow of the IEEE and the Institute of Mathematical Statistics, and a member of the National Academy of Engineering and the American Academy of Arts and Sciences. He has authored more than 100 technical papers and is coeditor of Open Problems in Communication and Computation.

JOY A. THOMAS, PHD, is the Chief Scientist at Stratify, Inc., a Silicon Valley start-up specializing in organizing unstructured information. After receiving his PhD at Stanford, Dr. Thomas spent more than nine years at the IBM T. J. Watson Research Center in Yorktown Heights, New York. Dr. Thomas is a recipient of the IEEE Charles LeGeyt Fortescue Fellowship.

Preface to the Second Edition
Preface to the First Edition
Acknowledgments for the Second Edition
Acknowledgments for the First Edition
Introduction and Preview
Preview of the Book
Entropy, Relative Entropy, and Mutual Information
Entropy
Joint Entropy and Conditional Entropy
Relative Entropy and Mutual Information
Relationship Between Entropy and Mutual Information
Chain Rules for Entropy, Relative Entropy, and Mutual Information
Jensen's Inequality and Its Consequences
Log Sum Inequality and Its Applications
Data-Processing Inequality
Sufficient Statistics
Fano's Inequality
Summary
Problems
Historical Notes
Asymptotic Equipartition Property
Asymptotic Equipartition Property Theorem
Consequences of the AEP: Data Compression
High-Probability Sets and the Typical Set
Summary
Problems
Historical Notes
Entropy Rates of a Stochastic Process
Markov Chains
Entropy Rate
Example: Entropy Rate of a Random Walk on a Weighted Graph
Second Law of Thermodynamics
Functions of Markov Chains
Summary
Problems
Historical Notes
Data Compression
Examples of Codes
Kraft Inequality
Optimal Codes
Bounds on the Optimal Code Length
Kraft Inequality for Uniquely Decodable Codes
Huffman Codes
Some Comments on Huffman Codes
Optimality of Huffman Codes
Shannon-Fano-Elias Coding
Competitive Optimality of the Shannon Code
Generation of Discrete Distributions from Fair Coins
Summary
Problems
Historical Notes
Gambling and Data Compression
The Horse Race
Gambling and Side Information
Dependent Horse Races and Entropy Rate
The Entropy of English
Data Compression and Gambling
Gambling Estimate of the Entropy of English
Summary
Problems
Historical Notes
Channel Capacity
Examples of Channel Capacity
Symmetric Channels
Properties of Channel Capacity
Preview of the Channel Coding Theorem
Definitions
Jointly Typical Sequences
Channel Coding Theorem
Zero-Error Codes
Fano's Inequality and the Converse to the Coding Theorem
Equality in the Converse to the Channel Coding Theorem
Hamming Codes
Feedback Capacity
Source-Channel Separation Theorem
Summary
Problems
Historical Notes
Differential Entropy
Definitions
AEP for Continuous Random Variables
Relation of Differential Entropy to Discrete Entropy
Joint and Conditional Differential Entropy
Relative Entropy and Mutual Information
Properties of Differential Entropy, Relative Entropy, and Mutual Information
Summary
Problems
Historical Notes
Gaussian Channel
Gaussian Channel: Definitions
Converse to the Coding Theorem for Gaussian Channels
Bandlimited Channels
Parallel Gaussian Channels
Channels with Colored Gaussian Noise
Gaussian Channels with Feedback
Summary
Problems
Historical Notes
Rate Distortion Theory
Quantization
Definitions
Calculation of the Rate Distortion Function
Converse to the Rate Distortion Theorem
Achievability of the Rate Distortion Function
Strongly Typical Sequences and Rate Distortion
Characterization of the Rate Distortion Function
Computation of Channel Capacity and the Rate Distortion Function
Summary
Problems
Historical Notes
Information Theory and Statistics
Method of Types
Law of Large Numbers
Universal Source Coding
Large Deviation Theory
Examples of Sanov's Theorem
Conditional Limit Theorem
Hypothesis Testing
Chernoff-Stein Lemma
Chernoff Information
Fisher Information and the Cramér-Rao Inequality
Summary
Problems
Historical Notes
Maximum Entropy
Maximum Entropy Distributions
Examples
Anomalous Maximum Entropy Problem
Spectrum Estimation
Entropy Rates of a Gaussian Process
Burg's Maximum Entropy Theorem
Summary
Problems
Historical Notes
Universal Source Coding
Universal Codes and Channel Capacity
Universal Coding for Binary Sequences
Arithmetic Coding
Lempel-Ziv Coding
Optimality of Lempel-Ziv Algorithms
Compression
Summary
Problems
Historical Notes
Kolmogorov Complexity
Models of Computation
Kolmogorov Complexity: Definitions and Examples
Kolmogorov Complexity and Entropy
Kolmogorov Complexity of Integers
Algorithmically Random and Incompressible Sequences
Universal Probability
The Halting Problem and the Noncomputability of Kolmogorov Complexity
Universal Gambling
Occam's Razor
Kolmogorov Complexity and Universal Probability
Kolmogorov Sufficient Statistic
Minimum Description Length Principle
Summary
Problems
Historical Notes
Network Information Theory
Gaussian Multiple-User Channels
Jointly Typical Sequences
Multiple-Access Channel
Encoding of Correlated Sources
Duality Between Slepian-Wolf Encoding and Multiple-Access Channels
Broadcast Channel
Relay Channel
Source Coding with Side Information
Rate Distortion with Side Information
General Multiterminal Networks
Summary
Problems
Historical Notes
Information Theory and Portfolio Theory
The Stock Market: Some Definitions
Kuhn-Tucker Characterization of the Log-Optimal Portfolio
Asymptotic Optimality of the Log-Optimal Portfolio
Side Information and the Growth Rate
Investment in Stationary Markets
Competitive Optimality of the Log-Optimal Portfolio
Universal Portfolios
Shannon-McMillan-Breiman Theorem
Summary
Problems
Historical Notes
Inequalities in Information Theory
Basic Inequalities of Information Theory
Differential Entropy
Bounds on Entropy and Relative Entropy
Inequalities for Types
Combinatorial Bounds on Entropy
Entropy Rates of Subsets
Entropy and Fisher Information
Entropy Power Inequality and Brunn-Minkowski Inequality
Inequalities for Determinants
Inequalities for Ratios of Determinants
Summary
Problems
Historical Notes
Bibliography
List of Symbols
Index
