Machine learning allows computers to learn and discern patterns without being explicitly programmed. Combined with statistical techniques, machine learning becomes a powerful tool for analysing many kinds of data across computer science and engineering areas, including image processing, speech processing, natural language processing, and robot control, as well as in fundamental sciences such as biology, medicine, astronomy, physics, and materials science.
Introduction to Statistical Machine Learning provides a general introduction to machine learning that covers a wide range of topics concisely and helps you bridge the gap between theory and practice. Part I discusses the fundamental concepts of statistics and probability used to describe machine learning algorithms. Parts II and III explain the two major approaches of machine learning techniques: generative methods and discriminative methods. Part IV provides an in-depth look at advanced topics that play an essential role in making machine learning algorithms more useful in practice. The accompanying MATLAB/Octave programs provide the practical skills needed to accomplish a wide range of data analysis tasks.
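To make the generative/discriminative distinction concrete, here is a minimal sketch of the generative route: fit one Gaussian per class by maximum likelihood and classify a point by whichever fitted density explains it best (in the spirit of the book's chapters on maximum likelihood estimation and discriminant analysis). The book's accompanying programs are MATLAB/Octave; this illustration uses Python/NumPy instead, and the data and function names are made up for the example.

```python
import numpy as np

def fit_gaussian(X):
    """Maximum likelihood estimates of a Gaussian's mean and covariance."""
    mu = X.mean(axis=0)
    Xc = X - mu
    Sigma = Xc.T @ Xc / len(X)  # MLE divides by n, not n - 1
    return mu, Sigma

def log_likelihood(x, mu, Sigma):
    """Log density of x under N(mu, Sigma), up to a shared constant."""
    d = x - mu
    _, logdet = np.linalg.slogdet(Sigma)
    return -0.5 * (logdet + d @ np.linalg.solve(Sigma, d))

rng = np.random.default_rng(0)
X0 = rng.normal(loc=[-2.0, 0.0], scale=1.0, size=(200, 2))  # class 0 samples
X1 = rng.normal(loc=[+2.0, 0.0], scale=1.0, size=(200, 2))  # class 1 samples

params = [fit_gaussian(X0), fit_gaussian(X1)]

def classify(x):
    # With equal class priors, pick the class whose fitted Gaussian
    # assigns the point the highest likelihood.
    scores = [log_likelihood(x, mu, S) for mu, S in params]
    return int(np.argmax(scores))
```

A discriminative method would instead model the class boundary directly, without estimating the class-conditional densities at all.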
About the Author
Masashi Sugiyama received the degrees of Bachelor of Engineering, Master of Engineering, and Doctor of Engineering in Computer Science from the Tokyo Institute of Technology, Japan, in 1997, 1999, and 2001, respectively. In 2001, he was appointed Assistant Professor at the same institute, and he was promoted to Associate Professor in 2003. He moved to the University of Tokyo as Professor in 2014. He received an Alexander von Humboldt Foundation Research Fellowship and conducted research at the Fraunhofer Institute in Berlin, Germany, from 2003 to 2004. In 2006, he received a European Commission Erasmus Mundus Scholarship and conducted research at the University of Edinburgh, UK. He received the Faculty Award from IBM in 2007 for his contribution to machine learning under non-stationarity, the Nagao Special Researcher Award from the Information Processing Society of Japan in 2011, and the Young Scientists' Prize from the Commendation for Science and Technology by the Minister of Education, Culture, Sports, Science and Technology of Japan for his contribution to the density-ratio paradigm of machine learning. His research interests include theories and algorithms of machine learning and data mining, and a wide range of applications such as signal processing, image processing, and robot control.
Table of Contents
Part I: Introduction to Statistics and Probability
1. Random variables and probability distributions
2. Examples of discrete probability distributions
3. Examples of continuous probability distributions
4. Multi-dimensional probability distributions
5. Examples of multi-dimensional probability distributions
6. Random sample generation from arbitrary probability distributions
7. Probability distributions of the sum of independent random variables
8. Probability inequalities
9. Statistical inference
10. Hypothesis testing

Part II: Generative Approach to Statistical Pattern Recognition
11. Fundamentals of statistical pattern recognition
12. Criteria for developing classifiers
13. Maximum likelihood estimation
14. Theoretical properties of maximum likelihood estimation
15. Linear discriminant analysis
16. Model selection for maximum likelihood estimation
17. Maximum likelihood estimation for Gaussian mixture model
18. Bayesian inference
19. Numerical computation in Bayesian inference
20. Model selection in Bayesian inference
21. Kernel density estimation
22. Nearest neighbor density estimation

Part III: Discriminative Approach to Statistical Machine Learning
23. Fundamentals of statistical machine learning
24. Learning models
25. Least-squares regression
26. Constrained least-squares regression
27. Sparse regression
28. Robust regression
29. Least-squares classification
30. Support vector classification
31. Ensemble classification
32. Probabilistic classification
33. Structured classification

Part IV: Further Topics
34. Outlier detection
35. Unsupervised dimensionality reduction
36. Clustering
37. Online learning
38. Semi-supervised learning
39. Supervised dimensionality reduction
40. Transfer learning
41. Multi-task learning
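As a preview of the regression chapters in Part III (ordinary and constrained least squares, chapters 25 and 26), here is a minimal sketch of both in closed form — again in Python/NumPy rather than the book's MATLAB/Octave, with illustrative data and an arbitrary regularization strength:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.uniform(-3, 3, size=n)
y = np.sin(x) + 0.1 * rng.normal(size=n)  # noisy training targets

# Design matrix of polynomial basis functions 1, x, x^2, ..., x^7.
Phi = np.vander(x, N=8, increasing=True)

# Ordinary least squares: minimize ||Phi w - y||^2.
w_ls, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Ridge (l2-constrained) least squares:
# minimize ||Phi w - y||^2 + lam * ||w||^2, solved in closed form.
lam = 1.0  # arbitrary regularization strength for illustration
w_ridge = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
```

The ridge solution solves (ΦᵀΦ + λI)w = Φᵀy; larger λ shrinks the weight vector toward zero, trading some training fit for stability.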
Reviews
"The probabilistic and statistical background is well presented, providing the reader with a complete coverage of the generative approach to statistical pattern recognition and the discriminative approach to statistical machine learning." --Zentralblatt MATH