| Field | Value |
| --- | --- |
| Title | Gaussian Processes for Machine Learning |
| Document type | printed text |
| Authors | Carl Edward Rasmussen (1969-....), author; Christopher K. I. Williams, author |
| Publisher | Cambridge, Mass.: MIT Press |
| Publication year | 2006, © 2006 |
| Series | Adaptive computation and machine learning |
| Extent | 1 vol. (XVIII-248 p.) |
| Physical details | graphs, figures, illustrations, illustrated dust jacket |
| Format | 26 cm |
| ISBN/ISSN/EAN | 978-0-262-18253-9 |
| General note | PPN 097588938. ISBN 0-262-18253-X (hardback). Available online via MIT Press Direct (https://direct.mit.edu/books/oa-monograph/2320/Gaussian-Processes-for-Machine-Learning; https://gaussianprocess.org/gpml/chapters/RW.pdf) |
| Languages | English (eng) |
| Tags | Gaussian processes -- Data processing; Machine learning -- Mathematical models; Markov processes |
| Dewey class number | 519.23 Probabilistic processes - Stochastic processes - Gaussian processes |
| Abstract | Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed both from a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes. (A brief illustrative GP-regression sketch follows this record.) |
| Contents note | Bibliography p. [223]-238. Author index p. [239]-243. Subject index p. [244]-248 |
| Online | https://direct.mit.edu/books/oa-monograph/2320/Gaussian-Processes-for-Machine-Learning |
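To make the abstract's description concrete, here is a minimal sketch of Gaussian-process regression with a squared-exponential covariance function and a Cholesky-based solve, in the spirit of the regression algorithms the book covers. It is not the book's companion code: the function names (`rbf_kernel`, `gp_regression`), the hyperparameter values, and the toy sine-wave data are assumptions chosen purely for illustration.

```python
# Illustrative sketch only: GP regression with an RBF kernel on toy data.
# Hyperparameters and the synthetic dataset are arbitrary assumptions.
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance: k(x, x') = s^2 exp(-|x - x'|^2 / (2 l^2))."""
    sqdist = (np.sum(X1**2, axis=1)[:, None]
              + np.sum(X2**2, axis=1)[None, :]
              - 2.0 * X1 @ X2.T)
    return signal_var * np.exp(-0.5 * sqdist / length_scale**2)

def gp_regression(X_train, y_train, X_test, noise_var=0.1, **kernel_args):
    """Return the GP posterior mean and variance at X_test via a Cholesky solve."""
    K = rbf_kernel(X_train, X_train, **kernel_args) + noise_var * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test, **kernel_args)   # cross-covariances
    K_ss = rbf_kernel(X_test, X_test, **kernel_args)   # test covariances

    L = np.linalg.cholesky(K)                                   # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))   # K^{-1} y
    mean = K_s.T @ alpha                                        # posterior mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)                  # posterior variance
    return mean, var

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_train = np.linspace(-4, 4, 20).reshape(-1, 1)
    y_train = np.sin(X_train).ravel() + 0.1 * rng.standard_normal(20)
    X_test = np.linspace(-5, 5, 100).reshape(-1, 1)
    mean, var = gp_regression(X_train, y_train, X_test, noise_var=0.01)
    print(mean[:5], var[:5])
```

Solving through the Cholesky factor rather than inverting the covariance matrix directly is the numerically stable route; the O(n^3) cost of this factorization for n training points is what motivates the approximation methods for large datasets mentioned in the abstract.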