
Elements of Information Theory (Wiley Series in ...


Transcription of Elements of Information Theory (Wiley Series in ...

ELEMENTS OF INFORMATION THEORY, Second Edition
Thomas M. Cover, Joy A. Thomas
A John Wiley & Sons, Inc., publication

Copyright 2006 by John Wiley & Sons, Inc. All rights reserved. Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc.

222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate.

Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages. For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317). Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site.

Library of Congress Cataloging-in-Publication Data:

Cover, T. M., 1938-
Elements of information theory / by Thomas M. Cover, Joy A. Thomas. 2nd ed.
p. cm.

A Wiley-Interscience publication. Includes bibliographical references and index.
ISBN-13 978-0-471-24195-9, ISBN-10 0-471-24195-4
1. Information theory. I. Thomas, Joy A. II. Title.
003.54 dc22    2005047799
Printed in the United States of America.

CONTENTS

Preface to the Second Edition xv
Preface to the First Edition xvii
Acknowledgments for the Second Edition xxi
Acknowledgments for the First Edition xxiii

1  Introduction and Preview
     Preview of the Book 5

2  Entropy, Relative Entropy, and Mutual Information
     Entropy
     Joint Entropy and Conditional Entropy
     Relative Entropy and Mutual Information
     Relationship Between Entropy and Mutual Information
     Chain Rules for Entropy, Relative Entropy, and Mutual Information
     Jensen's Inequality and Its Consequences
     Log Sum Inequality and Its Applications
     Data-Processing Inequality
     Sufficient Statistics
     Fano's Inequality 37
     Summary 41
     Problems 43
     Historical Notes 54

3  Asymptotic Equipartition Property
     Asymptotic Equipartition Property Theorem
     Consequences of the AEP: Data Compression
     High-Probability Sets and the Typical Set 62
     Summary 64
     Problems 64
     Historical Notes 69

4  Entropy Rates of a Stochastic Process
     Markov Chains
     Entropy Rate
     Example: Entropy Rate of a Random Walk on a Weighted Graph
     Second Law of Thermodynamics
     Functions of Markov Chains 84
     Summary 87
     Problems 88
     Historical Notes 100

5  Data Compression 103
     Examples of Codes 103
     Kraft Inequality 107
     Optimal Codes 110
     Bounds on the Optimal Code Length 112
     Kraft Inequality for Uniquely Decodable Codes 115
     Huffman Codes 118
     Some Comments on Huffman Codes 120
     Optimality of Huffman Codes 123
     Shannon-Fano-Elias Coding 127
     Competitive Optimality of the Shannon Code
     Generation of Discrete Distributions from Fair Coins 134
     Summary 141
     Problems 142
     Historical Notes 157

6  Gambling and Data Compression
     The Horse Race
     Gambling and Side Information
     Dependent Horse Races and Entropy Rate
     The Entropy of English
     Data Compression and Gambling
     Gambling Estimate of the Entropy of English 173
     Summary 175
     Problems 176
     Historical Notes 182

7  Channel Capacity
     Examples of Channel Capacity
       Noiseless Binary Channel
       Noisy Channel with Nonoverlapping Outputs
       Noisy Typewriter
       Binary Symmetric Channel
       Binary Erasure Channel
     Symmetric Channels
     Properties of Channel Capacity
     Preview of the Channel Coding Theorem
     Definitions
     Jointly Typical Sequences
     Channel Coding Theorem
     Zero-Error Codes
     Fano's Inequality and the Converse to the Coding Theorem
     Equality in the Converse to the Channel Coding Theorem
     Hamming Codes
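The central quantity of chapter 2 in the contents above is the Shannon entropy, H(X) = -sum_x p(x) log2 p(x). As a quick illustration of the definition (the function below is my own sketch, not code from the book):

```python
import math

def entropy(dist):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.

    `dist` is a list of probabilities summing to 1; zero-probability
    outcomes contribute nothing (0 log 0 is taken as 0).
    """
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A fair coin carries exactly one bit of uncertainty;
# a heavily biased coin carries much less.
print(entropy([0.5, 0.5]))    # 1.0
print(entropy([0.99, 0.01]))  # ~0.081
```

Entropy is maximized by the uniform distribution and is zero for a deterministic outcome, which matches the intuition of entropy as average uncertainty.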

     Feedback Capacity
     Source-Channel Separation Theorem 218
     Summary 222
     Problems 223
     Historical Notes 240

8  Differential Entropy
     Definitions
     AEP for Continuous Random Variables
     Relation of Differential Entropy to Discrete Entropy
     Joint and Conditional Differential Entropy
     Relative Entropy and Mutual Information
     Properties of Differential Entropy, Relative Entropy, and Mutual Information 252
     Summary 256
     Problems 256
     Historical Notes 259

9  Gaussian Channel
     Gaussian Channel: Definitions
     Converse to the Coding Theorem for Gaussian Channels
     Bandlimited Channels
     Parallel Gaussian Channels
     Channels with Colored Gaussian Noise
     Gaussian Channels with Feedback 280
     Summary 289
     Problems 290
     Historical Notes 299

10  Rate Distortion Theory
     Quantization
     Definitions
     Calculation of the Rate Distortion Function
       Binary Source
       Gaussian Source
       Simultaneous Description of Independent Gaussian Random Variables
     Converse to the Rate Distortion Theorem
     Achievability of the Rate Distortion Function
     Strongly Typical Sequences and Rate Distortion
     Characterization of the Rate Distortion Function
     Computation of Channel Capacity and the Rate Distortion Function 332
     Summary 335
     Problems 336
     Historical Notes 345

11  Information Theory and Statistics
     Method of Types
     Law of Large Numbers
     Universal Source Coding
     Large Deviation Theory
     Examples of Sanov's Theorem
     Conditional Limit Theorem
     Hypothesis Testing
     Chernoff-Stein Lemma
     Chernoff Information
     Fisher Information and the Cramer-Rao Inequality 392
     Summary 397
     Problems 399
     Historical Notes 408

12  Maximum Entropy
     Maximum Entropy Distributions
     Examples
     Anomalous Maximum Entropy Problem
     Spectrum Estimation
     Entropy Rates of a Gaussian Process
     Burg's Maximum Entropy Theorem 417
     Summary 420
     Problems 421
     Historical Notes 425

13  Universal Source Coding
     Universal Codes and Channel Capacity
     Universal Coding for Binary Sequences
     Arithmetic Coding
     Lempel-Ziv Coding
       Sliding Window Lempel-Ziv Algorithm
       Tree-Structured Lempel-Ziv Algorithms
     Optimality of Lempel-Ziv Algorithms
       Sliding Window Lempel-Ziv Algorithms
       Optimality of Tree-Structured Lempel-Ziv Compression 448
     Summary 456
     Problems 457
     Historical Notes 461

14  Kolmogorov Complexity
     Models of Computation
     Kolmogorov Complexity: Definitions and Examples
     Kolmogorov Complexity and Entropy
     Kolmogorov Complexity of Integers
     Algorithmically Random and Incompressible Sequences
     Universal Probability
     Kolmogorov Complexity
     Universal Gambling
     Occam's Razor
     Kolmogorov Complexity and Universal Probability
     Kolmogorov Sufficient Statistic
     Minimum Description Length Principle 500
     Summary 501
     Problems 503
     Historical Notes 507

15  Network Information Theory
     Gaussian Multiple-User Channels
       Single-User Gaussian Channel
       Gaussian Multiple-Access Channel with m Users
       Gaussian Broadcast Channel
       Gaussian Relay Channel
       Gaussian Interference Channel
       Gaussian Two-Way Channel
     Jointly Typical Sequences
     Multiple-Access Channel
       Achievability of the Capacity Region for the Multiple-Access Channel
       Comments on the Capacity Region for the Multiple-Access Channel
       Convexity of the Capacity Region of the Multiple-Access Channel
       Converse for the Multiple-Access Channel
       Multiple-Access Channels
       Gaussian Multiple-Access Channels
     Encoding of Correlated Sources
       Achievability of the Slepian-Wolf Theorem
       Converse for the Slepian-Wolf Theorem
       Slepian-Wolf Theorem for Many Sources
       Interpretation of Slepian-Wolf Coding
     Duality Between Slepian-Wolf Encoding and Multiple-Access Channels
     Broadcast Channel
       Definitions for a Broadcast Channel
       Degraded Broadcast Channels
       Capacity Region for the Degraded Broadcast Channel
     Relay Channel
     Source Coding with Side Information
     Rate Distortion with Side Information
     General Multiterminal Networks 587
     Summary 594
     Problems 596
     Historical Notes 609

16  Information Theory and Portfolio Theory
     The Stock Market: Some Definitions
     Kuhn-Tucker Characterization of the Log-Optimal Portfolio
     Asymptotic Optimality of the Log-Optimal Portfolio
     Side Information and the Growth Rate
     Investment in Stationary Markets
     Competitive Optimality of the Log-Optimal Portfolio
     Universal Portfolios
       Finite-Horizon Universal Portfolios
       Horizon-Free Universal Portfolios
     Shannon-McMillan-Breiman Theorem (General AEP) 644
     Summary 650
     Problems 652
     Historical Notes 655

17  Inequalities in Information Theory
     Basic Inequalities of Information Theory
     Differential Entropy
     Bounds on Entropy and Relative Entropy
     Inequalities for Types
     Combinatorial Bounds on Entropy
     Entropy Rates of Subsets
     Entropy and Fisher Information
     Entropy Power Inequality and Brunn-Minkowski Inequality
     Inequalities for Determinants
     Inequalities for Ratios of Determinants 683
     Summary 686
     Problems 686
     Historical Notes 687

Bibliography 689
List of Symbols 723
Index 727

PREFACE TO THE SECOND EDITION

In the years since the publication of the first edition, there were many aspects of the book that we wished to improve, to rearrange, or to expand, but the constraints of reprinting would not allow us to make those changes between printings. In the new edition, we now get a chance to make some of these changes, to add problems, and to discuss some topics that we had omitted from the first edition. The key changes include a reorganization of the chapters to make the book easier to teach, and the addition of more than two hundred new problems. We have added material on universal portfolios, universal source coding, Gaussian feedback capacity, network information theory, and developed the duality of data compression and channel capacity. A new chapter has been added and many proofs ha
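Among the compression topics listed in the contents, the Kraft inequality (chapter 5) states that a binary prefix code with codeword lengths l_1, ..., l_m exists if and only if sum_i 2^(-l_i) <= 1. A small illustrative check (my own sketch, not code from the book):

```python
def kraft_sum(lengths, alphabet_size=2):
    """Sum of alphabet_size ** (-l) over the given codeword lengths.

    By the Kraft inequality, a prefix code over a D-ary alphabet with
    these lengths exists if and only if this sum is at most 1.
    """
    return sum(alphabet_size ** -l for l in lengths)

# Lengths (1, 2, 3, 3) meet the bound with equality, realized for
# example by the binary prefix code 0, 10, 110, 111.
print(kraft_sum([1, 2, 3, 3]))  # 1.0
# Lengths (1, 1, 2) exceed the bound: no such prefix code exists.
print(kraft_sum([1, 1, 2]))     # 1.25
```

Huffman coding, also listed in chapter 5, always produces lengths that satisfy this bound, which is why its codes are instantaneously decodable.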

