
Information Theory and Coding - University of Cambridge


Information Theory and Coding
Computer Science Tripos Part II, Michaelmas Term
11 Lectures by J G Daugman

1. Foundations: Probability, Uncertainty, and Information
2. Entropies Defined, and Why They Are Measures of Information
3. Source Coding Theorem; Prefix, Variable-, & Fixed-Length Codes
4. Channel Types, Properties, Noise, and Channel Capacity
5. Continuous Information; Density; Noisy Channel Coding Theorem
6. Fourier Series, Convergence, Orthogonal Representation
7. Useful Fourier Theorems; Transform Pairs; Sampling; Aliasing
8. Discrete Fourier Transform. Fast Fourier Transform Algorithms
9. The Quantized Degrees-of-Freedom in a Continuous Signal
10. Gabor-Heisenberg-Weyl Uncertainty Relation. Optimal "Logons"
11. Kolmogorov Complexity and Minimal Description Length

Information Theory and Coding
J G Daugman

Prerequisite courses: Probability; Mathematical Methods for CS; Discrete Mathematics

Aims
The aims of this course are to introduce the principles and applications of information theory: how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error correcting codes; how discrete channels and measures of information generalise to their continuous forms; the Fourier perspective; and extensions to wavelets, complexity, compression, and efficient coding of audio-visual information.

Lectures

Foundations: probability, uncertainty, information. How concepts of randomness, redundancy, compressibility, noise, bandwidth, and uncertainty are related to information. Ensembles, random variables, marginal and conditional probabilities. How the metrics of information are grounded in the rules of probability.

Entropies defined, and why they are measures of information. Marginal entropy, joint entropy, conditional entropy, and the Chain Rule for entropy. Mutual information between ensembles of random variables. Why entropy is the fundamental measure of information content.

Source coding theorem; prefix, variable-, and fixed-length codes. Symbol codes. The binary symmetric channel. Capacity of a noiseless discrete channel. Error correcting codes.

Channel types, properties, noise, and channel capacity. Perfect communication through a noisy channel. Capacity of a discrete channel as the maximum of its mutual information over all possible input distributions.
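As a concrete companion to the entropy and mutual-information quantities above, here is a minimal sketch (an illustration added here, not an example from the notes) that computes the marginal, joint, and conditional entropies and the mutual information of a small invented joint distribution, using the Chain Rule exactly as stated:

```python
import numpy as np

# A small invented joint distribution p(x, y) over two binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)           # marginal p(x)
p_y = p_xy.sum(axis=0)           # marginal p(y)

H_X  = entropy(p_x)              # marginal entropy H(X)
H_Y  = entropy(p_y)              # marginal entropy H(Y)
H_XY = entropy(p_xy.flatten())   # joint entropy H(X,Y)

# Chain Rule: H(X,Y) = H(X) + H(Y|X), hence H(Y|X) = H(X,Y) - H(X).
H_Y_given_X = H_XY - H_X

# Mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y).
I_XY = H_X + H_Y - H_XY

print(f"H(X)={H_X:.3f}  H(Y|X)={H_Y_given_X:.3f}  I(X;Y)={I_XY:.3f}")
```

For a discrete channel with this joint behaviour, the capacity would be the maximum of I(X;Y) over all input distributions p(x), exactly as the synopsis states.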

Continuous information; density; noisy channel coding theorem. Extensions of the discrete entropies and measures to the continuous case. Signal-to-noise ratio; power spectral density. Gaussian channels. Relative significance of bandwidth and noise limitations. The Shannon rate limit and efficiency for noisy continuous channels.

Fourier series, convergence, orthogonal representation. Generalised signal expansions in vector spaces. Independence. Representation of continuous or discrete data by complex exponentials. The Fourier basis. Fourier series for periodic functions.

Useful Fourier theorems; transform pairs. Sampling; aliasing. The Fourier transform for non-periodic functions. Properties of the transform, and examples. Nyquist's Sampling Theorem derived, and the cause (and removal) of aliasing.
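To make Nyquist's theorem and aliasing tangible, the following small sketch (an added illustration, with sampling rate and frequencies chosen arbitrarily) samples a 9 Hz cosine at only 10 samples/second, below its Nyquist rate of 18 Hz, and confirms that its samples are indistinguishable from those of a 1 Hz alias:

```python
import numpy as np

fs = 10.0                      # sampling rate in Hz (illustrative choice)
t  = np.arange(0, 1, 1 / fs)   # one second of sampling instants

f_true  = 9.0                  # tone above the folding frequency fs/2 = 5 Hz
f_alias = fs - f_true          # predicted alias at 1 Hz

x_true  = np.cos(2 * np.pi * f_true  * t)
x_alias = np.cos(2 * np.pi * f_alias * t)

# Undersampling makes the two different tones produce identical samples.
print(np.allclose(x_true, x_alias))   # True
```

Sampling above the Nyquist rate (here, more than 18 samples/second) makes the two tones distinguishable again.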

Discrete Fourier transform. Fast Fourier Transform algorithms. Efficient algorithms for computing Fourier transforms of discrete data. Computational complexity. Filters, correlation, modulation, demodulation, coherence.

The quantised degrees-of-freedom in a continuous signal. Why a continuous signal of finite bandwidth and duration has a fixed number of degrees-of-freedom. Diverse illustrations of the principle that information, even in such a signal, comes in quantised, countable, packets.

Gabor-Heisenberg-Weyl uncertainty relation. Optimal "Logons". Unification of the time-domain and the frequency-domain as endpoints of a continuous deformation. The Uncertainty Principle and its optimal solution by Gabor's expansion basis of "logons". Multi-resolution wavelet codes. Extension to images, for analysis and compression.

Kolmogorov complexity. Definition of the algorithmic complexity of a data sequence, and its relation to the entropy of the distribution from which the data was drawn. Minimal description length, and why this measure of complexity is not computable. (A small numerical illustration follows the reading list below.)

Objectives
At the end of the course students should be able to

calculate the information content of a random variable from its probability distribution
relate the joint, conditional, and marginal entropies of variables in terms of their coupled probabilities
define channel capacities and properties using Shannon's Theorems
construct efficient codes for data on imperfect communication channels
generalise the discrete concepts to continuous signals on continuous channels
understand Fourier Transforms and the main ideas of efficient algorithms for them
describe the information resolution and compression properties of wavelets

Recommended book
* Cover, T.M. & Thomas, J.A. (1991). Elements of Information Theory. New York: Wiley.
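As the illustration promised in the Kolmogorov complexity entry above: algorithmic complexity itself is not computable, but the length of a compressed encoding is a computable upper-bound proxy for it. This sketch (an added example using Python's standard zlib module; nothing here comes from the notes) shows a highly regular string compressing far below an incompressible random string of the same length:

```python
import os
import zlib

n = 10_000
repetitive = b"ab" * (n // 2)   # generated by a very short program, so low complexity
random_ish = os.urandom(n)      # random bytes are incompressible with high probability

# Compressed length upper-bounds (and here stands in for) algorithmic complexity.
print(len(zlib.compress(repetitive, 9)))   # a few dozen bytes
print(len(zlib.compress(random_ish, 9)))   # close to n bytes
```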

Information Theory and Coding
Computer Science Tripos Part II, Michaelmas Term
11 lectures by J G Daugman

1. Overview: What is Information Theory?

Key idea: The movements and transformations of information, just like those of a fluid, are constrained by deep connections with:

probability theory, statistics, and combinatorics
thermodynamics (statistical physics)
spectral analysis, Fourier (and other) transforms
sampling theory, prediction, estimation theory
electrical engineering (bandwidth; signal-to-noise ratio)
complexity theory (minimal description length)
signal processing, representation, compressibility

As such, information theory addresses and answers the two fundamental questions of communication theory:

1. What is the ultimate data compression? (answer: the entropy of the data, H, is its compression limit.)

2. What is the ultimate transmission rate of communication? (answer: the channel capacity, C, is its rate limit.)

All communication schemes lie in between these two limits on the compressibility of data and the capacity of a channel. Information theory can suggest means to achieve these theoretical limits.
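Both limits can be computed in closed form for the simplest binary cases. A minimal sketch with illustrative numbers of my own choosing: the binary entropy H(p) of a biased binary source is its compression limit, and C = 1 - H(p), Shannon's capacity of a binary symmetric channel with crossover probability p, is its rate limit:

```python
import math

def H2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Ultimate compression: a source emitting 1 with probability 0.1 cannot be
# compressed below H bits/symbol on average (source coding theorem).
print(f"H = {H2(0.1):.3f} bits/symbol")    # H = 0.469

# Ultimate rate: a binary symmetric channel flipping each bit with
# probability 0.1 has capacity C = 1 - H(0.1) bits per channel use.
print(f"C = {1 - H2(0.1):.3f} bits/use")   # C = 0.531
```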

Important questions to which Information Theory offers answers:

How should information be measured?
How much additional information is gained by some reduction in uncertainty?
How do the a priori probabilities of possible messages determine the informativeness of receiving them?
What is the information content of a random variable?
How does the noise level in a communication channel limit its capacity to transmit information?
How does the bandwidth (in cycles/second) of a communication channel limit its capacity to transmit information?
By what formalism should prior knowledge be combined with incoming data to draw formally justifiable inferences from both?
How much information is contained in a strand of DNA?
How much information is there in the firing pattern of a neurone?

Historical origins and important contributions:

Ludwig BOLTZMANN (1844-1906), physicist, showed in 1877 that thermodynamic entropy (defined as the energy of a statistical ensemble [such as a gas] divided by its temperature: ergs/degree) is related to the statistical distribution of molecular configurations, with increasing entropy corresponding to increasing randomness. He made this relationship precise with his famous formula S = k log W, where S defines entropy, W is the total number of possible molecular configurations, and k is the constant which bears Boltzmann's name: k = 1.38 x 10^-16 ergs per degree centigrade.

(The above formula appears as an epitaph on Boltzmann's tombstone.) This is equivalent to the definition of the information ("negentropy") in an ensemble, all of whose possible states are equiprobable, but with a minus sign in front (and when the logarithm is base 2, k = 1). The deep connections between Information Theory and that branch of physics concerned with thermodynamics and statistical mechanics hinge upon Boltzmann's work.
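Written out explicitly for an ensemble of W equiprobable states, each of probability 1/W, the correspondence just described is (a standard identity, spelled out here for clarity):

```latex
\[
S = k \log W
\qquad\longleftrightarrow\qquad
H = \log_2 W = -\sum_{i=1}^{W} \frac{1}{W}\,\log_2\frac{1}{W}
\]
```

With k = 1 and base-2 logarithms, Boltzmann's S coincides with Shannon's entropy H of the equiprobable ensemble; the "negentropy" of the text is this quantity with the opposite sign.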

Leo SZILARD (1898-1964) in 1929 identified entropy with information. He formulated key information-theoretic concepts to solve the thermodynamic paradox known as "Maxwell's demon" (a thought-experiment about gas molecules in a partitioned box) by showing that the amount of information required by the demon about the positions and velocities of the molecules was equal (negatively) to the demon's entropy increment.

James Clerk MAXWELL (1831-1879) originated the paradox called "Maxwell's Demon" which greatly influenced Boltzmann and which led to the watershed insight for information theory contributed by Szilard. At Cambridge, Maxwell founded the Cavendish Laboratory which became the original Department of Physics.

R V HARTLEY in 1928 founded communication theory with his paper Transmission of Information. He proposed that a signal (or a communication channel) having bandwidth Ω over a duration T has a limited number of degrees-of-freedom, namely 2ΩT, and therefore it can communicate at most this quantity of information. He also defined the information content of an equiprobable ensemble of N possible states as equal to log2 N.
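A short worked example of Hartley's two measures, with illustrative numbers of my own choosing:

```python
import math

# Hartley (1928): bandwidth (Hz) times duration (s), doubled, gives the
# number of degrees-of-freedom a signal or channel can carry.
bandwidth_hz, duration_s = 3000.0, 1.0
print(f"degrees-of-freedom: {2 * bandwidth_hz * duration_s:.0f}")   # 6000

# Hartley's information content of an equiprobable ensemble of N states.
N = 64
print(f"log2({N}) = {math.log2(N):.0f} bits")                       # 6 bits
```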

Norbert WIENER (1894-1964) unified information theory and Fourier analysis by deriving a series of relationships between the two. He invented "white noise analysis" of non-linear systems, and made the definitive contribution to modeling and describing the information content of stochastic processes known as Time Series.

Dennis GABOR (1900-1979) crystallized Hartley's insight by formulating a general Uncertainty Principle for information, expressing the trade-off for resolution between bandwidth and time. (Signals that are well specified in frequency content must be poorly localized in time, and those that are well localized in time must be poorly specified in frequency content.) He formalized the "Information Diagram" to describe this fundamental trade-off, and derived the continuous family of functions which optimize (minimize) the conjoint uncertainty relation. In 1974 Gabor won the Nobel Prize in Physics for his work in Fourier optics, including the invention of holography.

Claude SHANNON (together with Warren WEAVER) in 1949 wrote the definitive, classic work in information theory: Mathematical Theory of Communication. Divided into separate treatments for continuous-time and discrete-time signals, systems, and channels, this book laid out all of the key concepts and relationships that define the field today.

