Yury Polyanskiy (Massachusetts Institute of Technology), Yihong Wu (Yale University, Connecticut)
Information Theory
From Coding to Learning
- Hardcover
This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning. Includes over 210 student exercises, emphasising practical applications in statistics, machine learning and modern communication theory. Accompanied by online instructor solutions.
Other customers were also interested in
- Andreas Lindholm, Machine Learning, 50,99 €
- Osvaldo Simeone (King's College London), Machine Learning for Engineers, 68,99 €
- P. P. Vaidyanathan (California Institute of Technology), Signals, Systems, and Signal Processing, 59,99 €
- Iddo Drori, The Science of Deep Learning, 51,39 €
- GNSS Software Receivers, 89,99 €
- Richard Poisel, Electronic Warfare Receivers and Receiving Systems, 177,99 €
- Nicholas O'Donoughue, Practical Geolocation for Electronic Warfare Using MATLAB, 142,99 €
Product details
- Publisher: Cambridge University Press
- Number of pages: 748
- Publication date: 2 January 2025
- Language: English
- Dimensions: 260 mm x 183 mm x 44 mm
- Weight: 1680 g
- ISBN-13: 9781108832908
- ISBN-10: 1108832903
- Item number: 70725904
- Manufacturer information: Libri GmbH, Europaallee 1, 36244 Bad Hersfeld, gpsr@libri.de
Yury Polyanskiy is a Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, with a focus on information theory, statistical machine learning, error-correcting codes, wireless communication, and fault tolerance. He is the recipient of the 2020 IEEE Information Theory Society James Massey Award for outstanding achievement in research and teaching in Information Theory.
Part I. Information measures: 1. Entropy
2. Divergence
3. Mutual information
4. Variational characterizations and continuity of information measures
5. Extremization of mutual information: capacity saddle point
6. Tensorization and information rates
7. f-divergences
8. Entropy method in combinatorics and geometry
9. Random number generators
Part II. Lossless Data Compression: 10. Variable-length compression
11. Fixed-length compression and Slepian-Wolf theorem
12. Entropy of ergodic processes
13. Universal compression
Part III. Hypothesis Testing and Large Deviations: 14. Neyman-Pearson lemma
15. Information projection and large deviations
16. Hypothesis testing: error exponents
Part IV. Channel Coding: 17. Error correcting codes
18. Random and maximal coding
19. Channel capacity
20. Channels with input constraints. Gaussian channels
21. Capacity per unit cost
22. Strong converse. Channel dispersion. Error exponents. Finite blocklength
23. Channel coding with feedback
Part V. Rate-distortion Theory and Metric Entropy: 24. Rate-distortion theory
25. Rate distortion: achievability bounds
26. Evaluating rate-distortion function. Lossy Source-Channel separation
27. Metric entropy
Part VI. Statistical Applications: 28. Basics of statistical decision theory
29. Classical large-sample asymptotics
30. Mutual information method
31. Lower bounds via reduction to hypothesis testing
32. Entropic bounds for statistical estimation
33. Strong data processing inequality.







