Elements of Information Theory

ISBN-10: 0471241954
ISBN-13: 9780471241959
Edition: 2nd 2006 (Revised)
List price: $124.00 Buy it from $17.52
This item qualifies for FREE shipping

*A minimum purchase of $35 is required. Shipping is provided via FedEx SmartPost® and FedEx Express Saver®. Average delivery time is 1–5 business days but is not guaranteed in that timeframe. Also allow 1–2 days for processing. Free shipping is eligible only in the continental United States and excludes Hawaii, Alaska, and Puerto Rico. FedEx service marks used by permission. "Marketplace" orders are not eligible for free or discounted shipping.

30 day, 100% satisfaction guarantee

If an item you ordered from TextbookRush does not meet your expectations due to an error on our part, simply fill out a return request and return it by mail within 30 days of ordering for a full refund of the item cost.

Learn more about our returns policy


Used Starting from $75.30
New Starting from $118.08

Study Briefs

Limited time offer: Get the first one free!

All the information you need in one place! Each Study Brief is a summary of one specific subject: facts, figures, and explanations to help you learn faster.

SQL (online content): $4.95 $1.99
MS Excel® 2010 (online content): $4.95 $1.99
MS Word® 2010 (online content): $4.95 $1.99
MS PowerPoint® 2010 (online content): $4.95 $1.99


Book details

List price: $124.00
Edition: 2nd
Copyright year: 2006
Publisher: John Wiley & Sons, Incorporated
Publication date: 7/18/2006
Binding: Hardcover
Pages: 792
Size: 6.25" wide x 9.25" long x 1.25" tall
Weight: 2.398 lbs
Language: English

Elements of Information Theory, Second Edition, covers the standard topics of information theory, such as entropy, data compression, channel capacity, rate distortion, multi-user theory, and hypothesis testing. It presents applications to communications, statistics, complexity theory, and investment. Chapters 1-9 cover the asymptotic equipartition property, data compression, and channel capacity, culminating in the capacity of the Gaussian channel. Chapters 10-17 include rate distortion, the method of types, Kolmogorov complexity, network information theory, universal source coding, and portfolio theory. The first edition is the most successful book on information theory on the market today, and adoptions have remained strong since its publication in 1991.
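To illustrate the book's central quantity, Shannon entropy of a discrete distribution can be computed directly from its definition, H(X) = -Σ p(x) log₂ p(x). This is a minimal sketch for illustration, not code from the book:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)),
    skipping zero-probability outcomes (0 * log 0 is taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less, which is why its outcomes compress well.
print(entropy([0.9, 0.1]))   # ~0.469
```

The fair coin's 1 bit is the maximum for two outcomes; any bias lowers the entropy, and that gap is exactly what data-compression schemes in the book exploit.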

THOMAS M. COVER, PhD, is Professor in the Departments of Electrical Engineering and Statistics at Stanford University. A recipient of the 1991 IEEE Claude E. Shannon Award, Dr. Cover is a past president of the IEEE Information Theory Society, a Fellow of the IEEE and the Institute of Mathematical Statistics, and a member of the National Academy of Engineering and the American Academy of Arts and Sciences. He has authored more than 100 technical papers and is coeditor of Open Problems in Communication and Computation.

JOY A. THOMAS, PhD, is the Chief Scientist at Stratify, Inc., a Silicon Valley start-up specializing in organizing unstructured information. After receiving his PhD at Stanford, Dr. Thomas spent more than nine years at the IBM T. J. Watson Research Center in Yorktown Heights, New York. Dr. Thomas is a recipient of the IEEE Charles LeGeyt Fortescue Fellowship.

Preface to the Second Edition
Preface to the First Edition
Acknowledgments for the Second Edition
Acknowledgments for the First Edition
Introduction and Preview
Preview of the Book
Entropy, Relative Entropy, and Mutual Information
Entropy
Joint Entropy and Conditional Entropy
Relative Entropy and Mutual Information
Relationship Between Entropy and Mutual Information
Chain Rules for Entropy, Relative Entropy, and Mutual Information
Jensen's Inequality and Its Consequences
Log Sum Inequality and Its Applications
Data-Processing Inequality
Sufficient Statistics
Fano's Inequality
Summary
Problems
Historical Notes
Asymptotic Equipartition Property
Asymptotic Equipartition Property Theorem
Consequences of the AEP: Data Compression
High-Probability Sets and the Typical Set
Summary
Problems
Historical Notes
Entropy Rates of a Stochastic Process
Markov Chains
Entropy Rate
Example: Entropy Rate of a Random Walk on a Weighted Graph
Second Law of Thermodynamics
Functions of Markov Chains
Summary
Problems
Historical Notes
Data Compression
Examples of Codes
Kraft Inequality
Optimal Codes
Bounds on the Optimal Code Length
Kraft Inequality for Uniquely Decodable Codes
Huffman Codes
Some Comments on Huffman Codes
Optimality of Huffman Codes
Shannon-Fano-Elias Coding
Competitive Optimality of the Shannon Code
Generation of Discrete Distributions from Fair Coins
Summary
Problems
Historical Notes
Gambling and Data Compression
The Horse Race
Gambling and Side Information
Dependent Horse Races and Entropy Rate
The Entropy of English
Data Compression and Gambling
Gambling Estimate of the Entropy of English
Summary
Problems
Historical Notes
Channel Capacity
Examples of Channel Capacity
Noiseless Binary Channel
Noisy Channel with Nonoverlapping Outputs
Noisy Typewriter
Binary Symmetric Channel
Binary Erasure Channel
Symmetric Channels
Properties of Channel Capacity
Preview of the Channel Coding Theorem
Definitions
Jointly Typical Sequences
Channel Coding Theorem
Zero-Error Codes
Fano's Inequality and the Converse to the Coding Theorem
Equality in the Converse to the Channel Coding Theorem
Hamming Codes
Feedback Capacity
Source-Channel Separation Theorem
Summary
Problems
Historical Notes
Differential Entropy
Definitions
AEP for Continuous Random Variables
Relation of Differential Entropy to Discrete Entropy
Joint and Conditional Differential Entropy
Relative Entropy and Mutual Information
Properties of Differential Entropy, Relative Entropy, and Mutual Information
Summary
Problems
Historical Notes
Gaussian Channel
Gaussian Channel: Definitions
Converse to the Coding Theorem for Gaussian Channels
Bandlimited Channels
Parallel Gaussian Channels
Channels with Colored Gaussian Noise
Gaussian Channels with Feedback
Summary
Problems
Historical Notes
Rate Distortion Theory
Quantization
Definitions
Calculation of the Rate Distortion Function
Binary Source
Gaussian Source
Simultaneous Description of Independent Gaussian Random Variables
Converse to the Rate Distortion Theorem
Achievability of the Rate Distortion Function
Strongly Typical Sequences and Rate Distortion
Characterization of the Rate Distortion Function
Computation of Channel Capacity and the Rate Distortion Function
Summary
Problems
Historical Notes
Information Theory and Statistics
Method of Types
Law of Large Numbers
Universal Source Coding
Large Deviation Theory
Examples of Sanov's Theorem
Conditional Limit Theorem
Hypothesis Testing
Chernoff-Stein Lemma
Chernoff Information
Fisher Information and the Cramér-Rao Inequality
Summary
Problems
Historical Notes
Maximum Entropy
Maximum Entropy Distributions
Examples
Anomalous Maximum Entropy Problem
Spectrum Estimation
Entropy Rates of a Gaussian Process
Burg's Maximum Entropy Theorem
Summary
Problems
Historical Notes
Universal Source Coding
Universal Codes and Channel Capacity
Universal Coding for Binary Sequences
Arithmetic Coding
Lempel-Ziv Coding
Sliding Window Lempel-Ziv Algorithm
Tree-Structured Lempel-Ziv Algorithms
Optimality of Lempel-Ziv Algorithms
Sliding Window Lempel-Ziv Algorithms
Optimality of Tree-Structured Lempel-Ziv Compression
Summary
Problems
Historical Notes
Kolmogorov Complexity
Models of Computation
Kolmogorov Complexity: Definitions and Examples
Kolmogorov Complexity and Entropy
Kolmogorov Complexity of Integers
Algorithmically Random and Incompressible Sequences
Universal Probability
Noncomputability of Kolmogorov Complexity
Ω
Universal Gambling
Occam's Razor
Kolmogorov Complexity and Universal Probability
Kolmogorov Sufficient Statistic
Minimum Description Length Principle
Summary
Problems
Historical Notes
Network Information Theory
Gaussian Multiple-User Channels
Single-User Gaussian Channel
Gaussian Multiple-Access Channel with m Users
Gaussian Broadcast Channel
Gaussian Relay Channel
Gaussian Interference Channel
Gaussian Two-Way Channel
Jointly Typical Sequences
Multiple-Access Channel
Achievability of the Capacity Region for the Multiple-Access Channel
Comments on the Capacity Region for the Multiple-Access Channel
Convexity of the Capacity Region of the Multiple-Access Channel
Converse for the Multiple-Access Channel
m-User Multiple-Access Channels
Gaussian Multiple-Access Channels
Encoding of Correlated Sources
Achievability of the Slepian-Wolf Theorem
Converse for the Slepian-Wolf Theorem
Slepian-Wolf Theorem for Many Sources
Interpretation of Slepian-Wolf Coding
Duality Between Slepian-Wolf Encoding and Multiple-Access Channels
Broadcast Channel
Definitions for a Broadcast Channel
Degraded Broadcast Channels
Capacity Region for the Degraded Broadcast Channel
Relay Channel
Source Coding with Side Information
Rate Distortion with Side Information
General Multiterminal Networks
Summary
Problems
Historical Notes
Information Theory and Portfolio Theory
The Stock Market: Some Definitions
Kuhn-Tucker Characterization of the Log-Optimal Portfolio
Asymptotic Optimality of the Log-Optimal Portfolio
Side Information and the Growth Rate
Investment in Stationary Markets
Competitive Optimality of the Log-Optimal Portfolio
Universal Portfolios
Finite-Horizon Universal Portfolios
Horizon-Free Universal Portfolios
Shannon-McMillan-Breiman Theorem (General AEP)
Summary
Problems
Historical Notes
Inequalities in Information Theory
Basic Inequalities of Information Theory
Differential Entropy
Bounds on Entropy and Relative Entropy
Inequalities for Types
Combinatorial Bounds on Entropy
Entropy Rates of Subsets
Entropy and Fisher Information
Entropy Power Inequality and Brunn-Minkowski Inequality
Inequalities for Determinants
Inequalities for Ratios of Determinants
Summary
Problems
Historical Notes
Bibliography
List of Symbols
Index
