Information Theory: From Coding to Learning by Yury Polyanskiy, Hardcover Book
US $113.38
Approx. CHF 90.90
Condition:
New
A new, unread, unused book in perfect condition with no missing or damaged pages. See the seller's listing for full details.
3 available
Shipping:
Free Economy Shipping.
Located in: Fairfield, Ohio, USA
Delivery:
Estimated between Thu, Jul 24 and Wed, Jul 30 to 94104 if payment is received today
Returns:
30-day returns. Buyer pays for return shipping. If you use an eBay shipping label, its cost will be deducted from your refund.
Payments:
Shop with confidence
The seller assumes all responsibility for this listing.
eBay item number: 388445774903
Item specifics
- Condition: New
- ISBN-13: 9781108832908
- Type: NA
- Publication Name: NA
- ISBN: 9781108832908
About this product
Product Identifiers
- Publisher: Cambridge University Press
- ISBN-10: 1108832903
- ISBN-13: 9781108832908
- eBay Product ID (ePID): 23067392090
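The listing prints both an ISBN-10 and an ISBN-13 for the same book. As a quick sanity check, here is a short sketch (Python, not part of the listing) that verifies both check digits using the standard ISBN rules and confirms that the two identifiers agree on their shared nine digits:

```python
# Sanity-check sketch for the two identifiers printed in this listing,
# using the standard ISBN-10 (mod 11) and ISBN-13 (mod 10) check-digit rules.

def isbn10_check_digit(first9: str) -> str:
    # ISBN-10: weight the first 9 digits by 10..2, sum mod 11.
    total = sum((10 - i) * int(d) for i, d in enumerate(first9))
    r = (11 - total % 11) % 11
    return "X" if r == 10 else str(r)

def isbn13_check_digit(first12: str) -> str:
    # ISBN-13: alternate weights 1 and 3 over the first 12 digits, sum mod 10.
    total = sum((1 if i % 2 == 0 else 3) * int(d) for i, d in enumerate(first12))
    return str((10 - total % 10) % 10)

isbn10 = "1108832903"
isbn13 = "9781108832908"

assert isbn10_check_digit(isbn10[:9]) == isbn10[-1]
assert isbn13_check_digit(isbn13[:12]) == isbn13[-1]
# A 978-prefixed ISBN-13 embeds the first nine ISBN-10 digits unchanged.
assert isbn13[3:12] == isbn10[:9]
print("Both identifiers are internally consistent and refer to the same title.")
```

Running it confirms the two numbers are mutually consistent: the 978 prefix plus the first nine ISBN-10 digits, with the check digit recomputed, yields exactly the ISBN-13 shown above.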
Product Key Features
- Book Title: Information Theory: From Coding to Learning
- Number of Pages: 550 Pages
- Language: English
- Topic: Signals & Signal Processing, General
- Publication Year: 2025
- Illustrator: Yes
- Genre: Technology & Engineering, Science
- Format: Hardcover
Dimensions
- Item Height: 1.7 in
- Item Length: 10.3 in
- Item Width: 7.2 in
Additional Product Features
- LCCN: 2024-004713
Reviews
- 'Polyanskiy and Wu's book treats information theory and various subjects of statistics in a unique ensemble, a striking novelty in the literature. It develops in depth the connections between the two fields, which helps to present the theory in a more complete, elegant and transparent way. An exciting and inspiring read for graduate students and researchers.' (Alexandre Tsybakov, CREST-ENSAE, Paris)
- 'The central role of information theory in data science and machine learning is highlighted in this book, and will be of interest to all researchers in these areas. The authors are two of the leading young information theorists currently active. Their deep understanding of the area is evident in the technical depth of the treatment, which also covers many communication theory-oriented aspects of information theory.' (Venkat Anantharam, University of California, Berkeley)
- 'Since the publication of Claude E. Shannon's A Mathematical Theory of Communication in 1948, information theory has expanded beyond its original focus on reliable transmission and storage of information to applications in statistics, machine learning, computer science, and beyond. This textbook, written by two leading researchers at the intersection of these fields, offers a modern synthesis of both the classical subject matter and these recent developments. It is bound to become a classic reference.' (Maxim Raginsky, University of Illinois, Urbana-Champaign)
- 'Written in a mathematically rigorous yet accessible style, this book offers information-theoretic tools that are indispensable for high-dimensional statistics. It also presents the classic topic of coding theorems in the modern one-shot (finite block-length) approach. To put it briefly, this is the information theory textbook of the new era.' (Shun Watanabe, Tokyo University of Agriculture and Technology)
Dewey Edition: 23
Dewey Decimal: 003.54
Table of Contents
Part I. Information Measures: 1. Entropy; 2. Divergence; 3. Mutual information; 4. Variational characterizations and continuity of information measures; 5. Extremization of mutual information: capacity saddle point; 6. Tensorization and information rates; 7. f-divergences; 8. Entropy method in combinatorics and geometry; 9. Random number generators.
Part II. Lossless Data Compression: 10. Variable-length compression; 11. Fixed-length compression and Slepian-Wolf theorem; 12. Entropy of ergodic processes; 13. Universal compression.
Part III. Hypothesis Testing and Large Deviations: 14. Neyman-Pearson lemma; 15. Information projection and large deviations; 16. Hypothesis testing: error exponents.
Part IV. Channel Coding: 17. Error correcting codes; 18. Random and maximal coding; 19. Channel capacity; 20. Channels with input constraints. Gaussian channels; 21. Capacity per unit cost; 22. Strong converse. Channel dispersion. Error exponents. Finite blocklength; 23. Channel coding with feedback.
Part V. Rate-Distortion Theory and Metric Entropy: 24. Rate-distortion theory; 25. Rate distortion: achievability bounds; 26. Evaluating rate-distortion function. Lossy source-channel separation; 27. Metric entropy.
Part VI: 28. Basics of statistical decision theory; 29. Classical large-sample asymptotics; 30. Mutual information method; 31. Lower bounds via reduction to hypothesis testing; 32. Entropic bounds for statistical estimation; 33. Strong data processing inequality.
Synopsis
This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning, equipping students with a uniquely well-rounded and rigorous foundation for further study. It introduces core topics such as data compression, channel coding, and rate-distortion theory using a unique finite block-length approach. With over 210 end-of-part exercises and numerous examples, students are introduced to contemporary applications in statistics, machine learning and modern communication theory.
The textbook presents information-theoretic methods with applications in statistical learning and computer science, such as f-divergences, PAC-Bayes and the variational principle, Kolmogorov's metric entropy, strong data processing inequalities, and entropic upper bounds for statistical estimation. Accompanied by a solutions manual for instructors, and additional standalone chapters on more specialized topics in information theory, this is the ideal introductory textbook for senior undergraduate and graduate students in electrical engineering, statistics, and computer science.
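For orientation, the information measures the synopsis name-drops have compact standard definitions; the following is a reference sketch in textbook-standard notation (not quoted from this listing):

```latex
% Standard definitions underlying the topics named in the synopsis
% (textbook-standard notation; assumed, not quoted from this listing).
\[
  H(X) = -\sum_{x} P_X(x) \log P_X(x), \qquad
  D(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)},
\]
\[
  D_f(P \,\|\, Q) = \mathbb{E}_{Q}\!\left[ f\!\left( \tfrac{dP}{dQ} \right) \right],
  \qquad f \text{ convex},\; f(1) = 0.
\]
```

Choosing $f(t) = t \log t$ recovers the KL divergence $D(P\,\|\,Q)$, which is the sense in which the f-divergences of Part I generalize the classical measures.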
LC Classification Number: Q360.P65 2025
Seller's item description
About this seller
grandeagleretail
98.3% positive feedback • 2.8M items sold
Registered as a business seller
Seller feedback (1,053,852)
- e***e (610), feedback left by buyer in the past month, verified purchase: Awesome seller.... highly recommend!!!!
- r***s (963), feedback left by buyer in the past month, verified purchase: FAST SHIPPING!!! GREAT EBAY SELLER!!! 5 STARS !!!!!
- i***o (866), feedback left by buyer in the past month, verified purchase: Just as described, good communication. Thank you
Explore more:
- Penguin Books language courses and teaching materials,
- Penguin Books higher education and adult learning,
- Penguin Books textbooks, education and reference,
- English higher education and adult learning Penguin Books,
- Penguin Books higher education and adult learning from 2010,
- Hardcover Penguin Books higher education and adult learning,
- Paperback Penguin Books higher education and adult learning,
- Textbooks, education and reference Penguin Books in English,
- Paperback Penguin Books textbooks, education and reference,
- Penguin Books from 2010 textbooks, education and reference