Information Theory
From Coding to Learning
Book by Yury Polyanskiy (et al.)
Language: English

€93.50*

incl. VAT

Free shipping via Post / DHL

Delivery time 2-3 working days from the publication date. This product will be published on 20.02.2025.

Description
"This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning. Includes over 210 student exercises, emphasising practical applications in statistics, machine learning and modern communication theory. Accompanied by online instructor solutions"--
"This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning. Includes over 210 student exercises, emphasising practical applications in statistics, machine learning and modern communication theory. Accompanied by online instructor solutions"--
About the Author
Yury Polyanskiy is a Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, with a focus on information theory, statistical machine learning, error-correcting codes, wireless communication, and fault tolerance. He is the recipient of the 2020 IEEE Information Theory Society James Massey Award for outstanding achievement in research and teaching in Information Theory.
Table of Contents
Part I. Information Measures: 1. Entropy; 2. Divergence; 3. Mutual information; 4. Variational characterizations and continuity of information measures; 5. Extremization of mutual information: capacity saddle point; 6. Tensorization and information rates; 7. f-divergences; 8. Entropy method in combinatorics and geometry; 9. Random number generators.
Part II. Lossless Data Compression: 10. Variable-length compression; 11. Fixed-length compression and Slepian-Wolf theorem; 12. Entropy of ergodic processes; 13. Universal compression.
Part III. Hypothesis Testing and Large Deviations: 14. Neyman-Pearson lemma; 15. Information projection and large deviations; 16. Hypothesis testing: error exponents.
Part IV. Channel Coding: 17. Error correcting codes; 18. Random and maximal coding; 19. Channel capacity; 20. Channels with input constraints. Gaussian channels; 21. Capacity per unit cost; 22. Strong converse. Channel dispersion. Error exponents. Finite blocklength; 23. Channel coding with feedback.
Part V. Rate-Distortion Theory and Metric Entropy: 24. Rate-distortion theory; 25. Rate distortion: achievability bounds; 26. Evaluating rate-distortion function. Lossy source-channel separation; 27. Metric entropy.
Part VI. Statistical Applications: 28. Basics of statistical decision theory; 29. Classical large-sample asymptotics; 30. Mutual information method; 31. Lower bounds via reduction to hypothesis testing; 32. Entropic bounds for statistical estimation; 33. Strong data processing inequality.
Details
Year of publication: 2025
Subject area: Communication studies
Genre: Imports, Media studies
Category: Sciences
Medium: Book
ISBN-13: 9781108832908
ISBN-10: 1108832903
Language: English
Binding: Hardcover
Authors: Polyanskiy, Yury; Wu, Yihong
Publisher: Cambridge University Press
Responsible person for the EU: Libri GmbH, Europaallee 1, D-36244 Bad Hersfeld, gpsr@libri.de
Dimensions: 249 x 175 x 41 mm
By: Yury Polyanskiy (et al.)
Publication date: 20.02.2025
Weight: 1.656 kg
Item ID: 129274232