
Information Theory and Coding - University of Cambridge


Information Theory and Coding
Computer Science Tripos Part II, Michaelmas Term
11 Lectures by J G Daugman

1. Foundations: Probability, Uncertainty, and Information
2. Entropies Defined, and Why They Are Measures of Information
3. Source Coding Theorem; Prefix, Variable-, & Fixed-Length Codes
4. Channel Types, Properties, Noise, and Channel Capacity
5. Continuous Information; Density; Noisy Channel Coding Theorem
6. Fourier Series, Convergence, Orthogonal Representation
7. Useful Fourier Theorems; Transform Pairs; Sampling; Aliasing
8. Discrete Fourier Transform; Fast Fourier Transform Algorithms
9. The Quantized Degrees-of-Freedom in a Continuous Signal
10. Gabor-Heisenberg-Weyl Uncertainty Relation; Optimal "Logons"
11. Kolmogorov Complexity and Minimal Description Length

Prerequisite courses: Probability; Mathematical Methods for CS; Discrete Mathematics

Aims

The aims of this course are to introduce the principles and applications of information theory: how information is measured in terms of probability and entropy; the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error correcting codes; how discrete channels and measures of information generalise to their continuous forms; the Fourier perspective; and extensions to wavelets, complexity, compression, and efficient coding of audio-visual data.

Lectures

Foundations: probability, uncertainty, information. How concepts of randomness, redundancy, compressibility, noise, bandwidth, and uncertainty are related to information. Ensembles, random variables, marginal and conditional probabilities. How the metrics of information are grounded in the rules of probability.

Entropies defined, and why they are measures of information. Marginal entropy, joint entropy, conditional entropy, and the Chain Rule for entropy. Mutual information between ensembles of random variables. Why entropy is the fundamental measure of information content.

Source coding theorem; prefix, variable-, and fixed-length codes. Capacity of a noiseless discrete channel.

Channel types, properties, noise, and channel capacity. Capacity of a discrete channel as the maximum of its mutual information over all possible input distributions.

Continuous information; density; noisy channel coding theorem. Extensions of the discrete entropies and measures to the continuous case. Signal-to-noise ratio; power spectral density. Relative significance of bandwidth and noise; efficiency for noisy continuous channels.

Fourier series, convergence, orthogonal representation. Expansion of continuous or discrete data in orthogonal bases.

Useful Fourier theorems; transform pairs; sampling; aliasing. Properties of the transform, and examples. Nyquist's Sampling Theorem derived, and the cause (and removal) of aliasing.

Discrete Fourier transform; fast Fourier transform algorithms. Efficient algorithms for computing Fourier transforms of discrete data. Filters, correlation, modulation, demodulation, coherence.

The quantised degrees-of-freedom in a continuous signal. Why a continuous signal of finite bandwidth and duration has a fixed number of degrees-of-freedom. The principle that information, even in such a signal, comes in quantised, countable, packets.

Gabor-Heisenberg-Weyl uncertainty relation. Optimal "Logons". Unification of the time-domain and the frequency-domain as endpoints of a continuous deformation. The Uncertainty Principle and its optimal solution by Gabor's expansion basis of "logons". Multi-resolution wavelet codes. Extension to images, for analysis and compression.

Kolmogorov complexity. Definition of the algorithmic complexity of a data sequence, and its relation to the entropy of the distribution from which the data was drawn. Minimal description length, and why this measure of complexity is not computable.

Objectives

At the end of the course students should be able to:

- calculate the information content of a random variable from its probability distribution
- relate the joint, conditional, and marginal entropies of variables in terms of their coupled probabilities
- define channel capacities and properties using Shannon's Theorems
- construct efficient codes for data on imperfect communication channels
- generalise the discrete concepts to continuous signals on continuous channels
- understand Fourier Transforms and the main ideas of efficient algorithms for them
- describe the information resolution and compression properties of wavelets

Recommended book

* Cover, T.M. & Thomas, J.A. (1991). Elements of Information Theory. New York: Wiley.

Information Theory and Coding
Computer Science Tripos Part II, Michaelmas Term
11 lectures by J G Daugman

1. Overview: What is Information Theory?

Key idea: The movements and transformations of information, just like those of a fluid, are constrained by deep connections with:

- probability theory, statistics, and combinatorics
- thermodynamics (statistical physics)
- spectral analysis, Fourier (and other) transforms
- sampling theory, prediction, estimation theory
- electrical engineering (bandwidth; signal-to-noise ratio)
- complexity theory (minimal description length)
- signal processing, representation, compressibility

As such, information theory addresses and answers the two fundamental questions of communication theory:

1. What is the ultimate data compression? (answer: the entropy of the data, H, is its compression limit.)
2. What is the ultimate transmission rate of communication? (answer: the channel capacity, C, is its rate limit.)
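To make the first of these limits concrete, here is a minimal sketch (not from the original notes; the four-symbol distribution is an invented example) of computing the entropy H of a discrete source directly from its probability distribution:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.
    Zero-probability outcomes contribute nothing to the sum."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A biased four-symbol source (hypothetical distribution):
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol

# An equiprobable N-state ensemble attains the maximum, log2(N):
print(entropy([0.25, 0.25, 0.25, 0.25]))    # 2.0 bits/symbol
```

No lossless code can average fewer than 1.75 bits per symbol for the first source; that is the sense in which H is a compression limit.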

All communication schemes lie in between these two limits on the compressibility of data and the capacity of a channel. Information theory can suggest means to achieve these theoretical limits. Among the fundamental questions to which Information Theory offers answers:

- How should information be measured?
- How much additional information is gained by some reduction in uncertainty?
- How do the a priori probabilities of possible messages determine the informativeness of receiving them?
- What is the information content of a random variable?
- How does the noise level in a communication channel limit its capacity to transmit information?
- How does the bandwidth (in cycles/second) of a communication channel limit its capacity to transmit information?
- By what formalism should prior knowledge be combined with incoming data to draw formally justifiable inferences from both?
- How much information is contained in a strand of DNA?
- How much information is there in the firing pattern of a neurone?

Historical origins and important contributions:

Ludwig BOLTZMANN (1844-1906), physicist, showed in 1877 that thermodynamic entropy (defined as the energy of a statistical ensemble [such as a gas] divided by its temperature: ergs/degree) is related to the statistical distribution of molecular configurations, with increasing entropy corresponding to increasing randomness. He made this relationship precise with his famous formula S = k log W, where S defines entropy, W is the total number of possible molecular configurations, and k is the constant which bears Boltzmann's name: k = 1.38 x 10^-16 ergs per degree centigrade.

(The above formula appears as an epitaph on Boltzmann's tombstone.) This is equivalent to the definition of the information ("negentropy") in an ensemble, all of whose possible states are equiprobable, but with a minus sign in front (and when the logarithm is base 2, k = 1). The deep connections between Information Theory and that branch of physics concerned with thermodynamics and statistical mechanics hinge upon Boltzmann's work.

Leo SZILARD (1898-1964) in 1929 identified entropy with information. He formulated key information-theoretic concepts to solve the thermodynamic paradox known as "Maxwell's demon" (a thought-experiment about gas molecules in a partitioned box) by showing that the amount of information required by the demon about the positions and velocities of the molecules was equal (negatively) to the demon's entropy increment.

James Clerk MAXWELL (1831-1879) originated the paradox called "Maxwell's Demon" which greatly influenced Boltzmann and which led to the watershed insight for information theory contributed by Szilard. At Cambridge, Maxwell founded the Cavendish Laboratory, which became the original Department of Physics.
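As an aside added here for clarity (this step is not in the original notes): with k = 1 and base-2 logarithms, Boltzmann's formula is exactly the entropy of an ensemble of W equiprobable states, since each state then has probability 1/W:

$$ H \;=\; -\sum_{i=1}^{W} \frac{1}{W} \log_2 \frac{1}{W} \;=\; \log_2 W $$

so S = k log W and the information measure agree, up to the sign convention noted above.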

R V HARTLEY in 1928 founded communication theory with his paper Transmission of Information. He proposed that a signal (or a communication channel) having bandwidth Ω over a duration T has a limited number of degrees-of-freedom, namely 2ΩT, and therefore it can communicate at most this quantity of information. He also defined the information content of an equiprobable ensemble of N possible states as equal to log2 N.

Norbert WIENER (1894-1964) unified information theory and Fourier analysis by deriving a series of relationships between the two. He invented "white noise analysis" of non-linear systems, and made the definitive contribution to modeling and describing the information content of signals.

Dennis GABOR (1900-1979) crystallized Hartley's insight by formulating a general Uncertainty Principle for information, expressing the trade-off for resolution between bandwidth and time. (Signals that are well specified in frequency content must be poorly localized in time, and those that are well localized in time must be poorly specified in frequency content.) He formalized the "Information Diagram" to describe this fundamental trade-off, and derived the continuous family of functions which optimize (minimize) the conjoint uncertainty relation. In 1974 Gabor won the Nobel Prize in Physics for his work in Fourier optics, including the invention of holography.

Claude SHANNON (together with Warren WEAVER) in 1949 wrote the definitive, classic work in information theory: Mathematical Theory of Communication. Divided into separate treatments for continuous-time and discrete-time signals, systems, and channels, this book laid out all of the key concepts and relationships that define the field today. In particular, he proved the famous Source Coding Theorem and the Noisy Channel Coding Theorem, plus many other related results about channel capacity.

S KULLBACK and R A LEIBLER (1951) defined relative entropy (also called information for discrimination, or K-L Distance).

E T JAYNES (since 1957) developed maximum entropy methods for inference, hypothesis-testing, and decision-making, based on the physics of statistical mechanics. Others have inquired whether these principles impose fundamental physical limits to computation itself.
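A minimal sketch (added; not part of the original notes) of the Kullback-Leibler relative entropy mentioned above, assuming two discrete distributions over the same alphabet; the example distributions are invented:

```python
from math import log2

def kl_distance(p, q):
    """Relative entropy D(p || q) = sum(p * log2(p/q)), in bits.
    Asymmetric: it measures the inefficiency of assuming distribution
    q when the true distribution is p. Requires q > 0 wherever p > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]    # true distribution (hypothetical)
q = [1/3, 1/3, 1/3]      # assumed distribution
print(kl_distance(p, p))  # 0.0 -- zero exactly when the two agree
print(kl_distance(p, q))  # ~0.085 bits of discrimination information
```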

A N KOLMOGOROV in 1965 proposed that the complexity of a string of data can be defined by the length of the shortest binary program for computing the string. Thus the complexity of data is its minimal description length, and this specifies the ultimate compressibility of data. The "Kolmogorov complexity" K of a string is approximately equal to its Shannon entropy H, thereby unifying the theory of descriptive complexity with information theory.

2. Mathematical Foundations; Probability Rules; Bayes' Theorem

What are random variables? What is probability?

Random variables are variables that take on values determined by probability distributions. They may be discrete or continuous, in either their domain or their range. For example, a stream of ASCII encoded text characters in a transmitted message is a discrete random variable, with a known probability distribution for any given natural language. An analog signal represented by a voltage or sound pressure wave-form as a function of time (perhaps with added noise) is a continuous random variable having a continuous probability density function.

Most of Information Theory involves probability distributions of random variables, and conjoint or conditional probabilities defined over ensembles of random variables. Indeed, the information content of a symbol or event is defined by its (im)probability.
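To illustrate that last point, a minimal sketch (added here; not in the original notes), assuming the standard convention, developed formally later in the course, that observing an event of probability p conveys -log2(p) bits:

```python
from math import log2

def surprisal(p):
    """Information gained by observing an event of probability p."""
    return -log2(p)

# Rarer events carry more information (probabilities are illustrative):
print(surprisal(0.5))    # 1.0 bit   -- outcome of a fair coin toss
print(surprisal(1/26))   # ~4.7 bits -- one of 26 equiprobable letters
print(surprisal(0.001))  # ~10 bits  -- a rare event
```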

Classically, there are two different points of view about what probability actually means:

- relative frequency: sample the random variable a great many times and tally up the fraction of times that each of its different possible values occurs, to arrive at the probability of each.

- degree-of-belief: probability is the plausibility of a proposition or the likelihood that a particular state (or value of a random variable) might occur, even if its outcome can only be decided once (e.g. the outcome of a particular horse-race).

The first view, the "frequentist" or operationalist view, is the one that predominates in statistics and in information theory. However, by no means does it capture the full meaning of probability. For example, the proposition that "The moon is made of green cheese" is one which surely has a probability that we should be able to attach to it. We could assess its probability by degree-of-belief calculations which combine our prior knowledge about physics, geology, and dairy products. Yet the "frequentist" definition of probability could only assign a probability to this proposition by performing (say) a large number of repeated trips to the moon, and tallying up the fraction of trips on which the moon turned out to be a dairy product. In either case, it seems sensible that the less probable an event is, the more information is gained by noting its occurrence.
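A minimal sketch of the frequentist procedure described above, assuming a sequence of observed outcomes of a discrete random variable; the sample data is invented for illustration:

```python
from collections import Counter

def relative_frequencies(samples):
    """Frequentist estimate: tally how often each value occurs
    and divide by the total number of observations."""
    counts = Counter(samples)
    n = len(samples)
    return {value: count / n for value, count in counts.items()}

# 20 draws from some discrete random variable (hypothetical data):
draws = list("AABABCAABBAACABAACBA")
for value, p in sorted(relative_frequencies(draws).items()):
    print(value, p)   # A: 0.55, B: 0.3, C: 0.15
```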

