Introduction to Statistical Machine Learning

3.5 (4 ratings by Goodreads)
By (author) Masashi Sugiyama

Free delivery worldwide

Available. Dispatched from the UK in 2 business days

Description

Machine learning allows computers to learn and discern patterns without actually being programmed. When statistical techniques and machine learning are combined, they form a powerful tool for analysing various kinds of data in many computer science and engineering areas, including image processing, speech processing, natural language processing, and robot control, as well as in fundamental sciences such as biology, medicine, astronomy, physics, and materials science.

Introduction to Statistical Machine Learning provides a general introduction to machine learning that covers a wide range of topics concisely and will help you bridge the gap between theory and practice. Part I discusses the fundamental concepts of statistics and probability that are used in describing machine learning algorithms. Part II and Part III explain the two major approaches of machine learning: generative methods and discriminative methods. Part IV provides an in-depth look at advanced topics that play essential roles in making machine learning algorithms more useful in practice. The accompanying MATLAB/Octave programs provide you with the necessary practical skills needed to accomplish a wide range of data analysis tasks.
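For a flavour of the kind of task such MATLAB/Octave programs address, here is a minimal, purely illustrative Octave/MATLAB sketch of least-squares regression, one of the discriminative methods covered in Part III (chapter 25). It is a hypothetical example written for this page, not code taken from the book.

% Illustrative sketch: fit a cubic polynomial to noisy data by least squares.
n = 50;                              % number of training samples
x = linspace(-3, 3, n)';             % input points
y = sin(x) + 0.1 * randn(n, 1);      % noisy target values

Phi = [ones(n, 1), x, x.^2, x.^3];   % polynomial design matrix
theta = Phi \ y;                     % least-squares solution for the weights

yhat = Phi * theta;                  % fitted values on the training inputs
fprintf('training RMSE: %.3f\n', sqrt(mean((y - yhat).^2)));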

Product details

  • Paperback | 534 pages
  • 191 x 235 x 25.4mm | 1,110g
  • Morgan Kaufmann Publishers Inc.
  • San Francisco, United States
  • English
  • black & white illustrations
  • ISBN10: 0128021217
  • ISBN13: 9780128021217
  • Bestsellers rank: 1,396,325

Table of contents

Part I: Introduction to Statistics and Probability

1. Random variables and probability distributions

2. Examples of discrete probability distributions

3. Examples of continuous probability distributions

4. Multi-dimensional probability distributions

5. Examples of multi-dimensional probability distributions

6. Random sample generation from arbitrary probability distributions

7. Probability distributions of the sum of independent random variables

8. Probability inequalities

9. Statistical inference

10. Hypothesis testing

Part II: Generative Approach to Statistical Pattern Recognition

11. Fundamentals of statistical pattern recognition

12. Criteria for developing classifiers

13. Maximum likelihood estimation

14. Theoretical properties of maximum likelihood estimation

15. Linear discriminant analysis

16. Model selection for maximum likelihood estimation

17. Maximum likelihood estimation for Gaussian mixture model

18. Bayesian inference

19. Numerical computation in Bayesian inference

20. Model selection in Bayesian inference

21. Kernel density estimation

22. Nearest neighbor density estimation

Part III: Discriminative Approach to Statistical Machine Learning

23. Fundamentals of statistical machine learning

24. Learning models

25. Least-squares regression

26. Constrained least-squares regression

27. Sparse regression

28. Robust regression

29. Least-squares classification

30. Support vector classification

31. Ensemble classification

32. Probabilistic classification

33. Structured classification

Part IV: Further Topics

34. Outlier detection

35. Unsupervised dimensionality reduction

36. Clustering

37. Online learning

38. Semi-supervised learning

39. Supervised dimensionality reduction

40. Transfer learning

41. Multi-task learning

Review quote

"The probabilistic and statistical background is well presented, providing the reader with a complete coverage of the generative approach to statistical pattern recognition and the discriminative approach to statistical machine learning." --Zentralblatt MATH, Introduction to Statistical Machine Learning

About Masashi Sugiyama

Masashi Sugiyama received the degrees of Bachelor of Engineering, Master of Engineering, and Doctor of Engineering in Computer Science from Tokyo Institute of Technology, Japan, in 1997, 1999, and 2001, respectively. In 2001 he was appointed Assistant Professor at the same institute, and he was promoted to Associate Professor in 2003. He moved to the University of Tokyo as Professor in 2014. He received an Alexander von Humboldt Foundation Research Fellowship and conducted research at the Fraunhofer Institute, Berlin, Germany, from 2003 to 2004. In 2006 he received a European Commission Program Erasmus Mundus Scholarship and conducted research at the University of Edinburgh, Edinburgh, UK. He received the Faculty Award from IBM in 2007 for his contribution to machine learning under non-stationarity, the Nagao Special Researcher Award from the Information Processing Society of Japan in 2011, and the Young Scientists' Prize from the Commendation for Science and Technology by the Minister of Education, Culture, Sports, Science and Technology of Japan for his contribution to the density-ratio paradigm of machine learning. His research interests include theories and algorithms of machine learning and data mining, and a wide range of applications such as signal processing, image processing, and robot control.

Rating details

3.5 out of 5 stars (4 ratings)
  • 5 stars: 25% (1)
  • 4 stars: 0% (0)
  • 3 stars: 75% (3)
  • 2 stars: 0% (0)
  • 1 star: 0% (0)
Book ratings by Goodreads