Transcription of Information Theory and Coding - University of Cambridge
Information Theory and Coding
Computer Science Tripos Part II, Michaelmas Term
11 Lectures by J G Daugman

1. Foundations: Probability, Uncertainty, and Information
2. Entropies Defined, and Why they are Measures of Information
3. Source Coding Theorem; Prefix, Variable-, & Fixed-Length Codes
4. Channel Types, Properties, Noise, and Channel Capacity
5. Continuous Information; Density; Noisy Channel Coding Theorem
6. Fourier Series, Convergence, Orthogonal Representation
7. Useful Fourier Theorems; Transform Pairs; Sampling; Aliasing
8. The Quantized Degrees-of-Freedom in a Continuous Signal
...
10. Gabor-Heisenberg-Weyl Uncertainty Relation. Optimal "Logons"
... Complexity and Minimal Description Length

Prerequisite courses: Probability; Mathematical Methods for CS; Discrete Mathematics

Aims
The aims of this course are to introduce the principles and applications of information theory: how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error correcting codes; how dis...
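The aims above mention measuring information via entropy and computing channel capacity. As a minimal illustration (not taken from the course notes), the sketch below computes the Shannon entropy of a discrete distribution and the capacity of a binary symmetric channel, C = 1 - H(p), where H is the binary entropy function; the function names are ours, chosen for clarity.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) over outcomes, in bits.
    Zero-probability outcomes contribute nothing and are skipped."""
    return -sum(p * log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric channel
    with crossover probability p: C = 1 - H(p)."""
    return 1.0 - entropy([p, 1.0 - p])

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(bsc_capacity(0.0))     # noiseless channel: 1.0 bit per use
print(bsc_capacity(0.5))     # useless channel: 0.0 bits per use
```

A fair coin carries one bit per toss; a binary symmetric channel that flips every bit with probability 0.5 conveys nothing, which is why its capacity is zero.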
The Quantized Degrees-of-Freedom in a Continuous Signal. 10. Gabor-Heisenberg-Weyl Uncertainty Relation. Optimal "Logons" ... 1. Overview: What is Information Theory? ... invented "white noise analysis" of non-linear systems, and made the ...