Gaussian processes for machine learning
Saved in:
Main Author: | Rasmussen, Carl Edward |
---|---|
Format: | Book |
Language: | English |
Published: | Cambridge, Mass. [u.a.] : MIT Press, 2006 |
Series: | Adaptive computation and machine learning |
Subjects: | Gauß-Prozess ; Maschinelles Lernen |
Online Access: | Table of Contents |
Note: | This record also covers later, unchanged reprints |
Physical Description: | XVIII, 248 S. : graph. Darst. |
ISBN: | 9780262182539 ; 026218253X |
Internal format
MARC
LEADER | 00000nam a2200000zc 4500 | ||
---|---|---|---|
001 | BV021552118 | ||
003 | DE-604 | ||
005 | 20221202 | ||
007 | t | ||
008 | 060418s2006 xxud||| |||| 00||| eng d | ||
010 | |a 2005053433 | ||
020 | |a 9780262182539 |9 978-0-262-18253-9 | ||
020 | |a 026218253X |9 0-262-18253-X | ||
035 | |a (OCoLC)61285753 | ||
035 | |a (DE-599)BVBBV021552118 | ||
040 | |a DE-604 |b ger |e aacr | ||
041 | 0 | |a eng | |
044 | |a xxu |c US | ||
049 | |a DE-91G |a DE-706 |a DE-355 |a DE-29T |a DE-83 |a DE-11 |a DE-91 |a DE-188 |a DE-384 |a DE-703 |a DE-739 | ||
050 | 0 | |a QA274.4 | |
082 | 0 | |a 519.2/3 |2 22 | |
084 | |a ST 300 |0 (DE-625)143650: |2 rvk | ||
084 | |a ST 304 |0 (DE-625)143653: |2 rvk | ||
084 | |a SK 820 |0 (DE-625)143258: |2 rvk | ||
084 | |a DAT 708f |2 stub | ||
084 | |a MAT 605f |2 stub | ||
100 | 1 | |a Rasmussen, Carl Edward |e Verfasser |0 (DE-588)1137889306 |4 aut | |
245 | 1 | 0 | |a Gaussian processes for machine learning |c Carl Edward Rasmussen ; Christopher K.I. Williams |
264 | 1 | |a Cambridge, Mass. [u.a.] |b MIT Press |c 2006 | |
300 | |a XVIII, 248 S. |b graph. Darst. | ||
336 | |b txt |2 rdacontent | ||
337 | |b n |2 rdamedia | ||
338 | |b nc |2 rdacarrier | ||
490 | 0 | |a Adaptive computation and machine learning | |
500 | |a Hier auch später erschienene, unveränderte Nachdrucke | ||
650 | 4 | |a Apprentissage automatique - Modèles mathématiques | |
650 | 4 | |a Processus gaussiens - Informatique | |
650 | 4 | |a Datenverarbeitung | |
650 | 4 | |a Mathematisches Modell | |
650 | 4 | |a Gaussian processes |x Data processing | |
650 | 4 | |a Machine learning |x Mathematical models | |
650 | 0 | 7 | |a Gauß-Prozess |0 (DE-588)4156111-9 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |2 gnd |9 rswk-swf |
689 | 0 | 0 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |D s |
689 | 0 | 1 | |a Gauß-Prozess |0 (DE-588)4156111-9 |D s |
689 | 0 | |5 DE-604 | |
700 | 1 | |a Williams, Christopher K. I. |d ca. 20./21. Jh. |e Sonstige |0 (DE-588)1274480272 |4 oth | |
856 | 4 | 2 | |m HBZ Datenaustausch |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014768173&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis |
999 | |a oai:aleph.bib-bvb.de:BVB01-014768173 |
Record in the search index
_version_ | 1804135308796624896 |
---|---|
adam_text | Contents
Series Foreword xi
Preface xiii
Symbols and Notation xvii
1 Introduction 1
1.1 A Pictorial Introduction to Bayesian Modelling 3
1.2 Roadmap 5
2 Regression 7
2.1 Weight-space View 7
2.1.1 The Standard Linear Model 8
2.1.2 Projections of Inputs into Feature Space 11
2.2 Function-space View 13
2.3 Varying the Hyperparameters 19
2.4 Decision Theory for Regression 21
2.5 An Example Application 22
2.6 Smoothing, Weight Functions and Equivalent Kernels 24
* 2.7 Incorporating Explicit Basis Functions 27
2.7.1 Marginal Likelihood 29
2.8 History and Related Work 29
2.9 Exercises 30
3 Classification 33
3.1 Classification Problems 34
3.1.1 Decision Theory for Classification 35
3.2 Linear Models for Classification 37
3.3 Gaussian Process Classification 39
3.4 The Laplace Approximation for the Binary GP Classifier 41
3.4.1 Posterior 42
3.4.2 Predictions 44
3.4.3 Implementation 45
3.4.4 Marginal Likelihood 47
* 3.5 Multi-class Laplace Approximation 48
3.5.1 Implementation 51
3.6 Expectation Propagation 52
3.6.1 Predictions 56
3.6.2 Marginal Likelihood 57
3.6.3 Implementation 57
3.7 Experiments 60
3.7.1 A Toy Problem 60
3.7.2 One-dimensional Example 62
3.7.3 Binary Handwritten Digit Classification Example 63
3.7.4 10-class Handwritten Digit Classification Example 70
3.8 Discussion 72
Sections marked by an asterisk contain advanced material that may be omitted on a first reading.
* 3.9 Appendix: Moment Derivations 74
3.10 Exercises 75
4 Covariance Functions 79
4.1 Preliminaries 79
* 4.1.1 Mean Square Continuity and Differentiability 81
4.2 Examples of Covariance Functions 81
4.2.1 Stationary Covariance Functions 82
4.2.2 Dot Product Covariance Functions 89
4.2.3 Other Non-stationary Covariance Functions 90
4.2.4 Making New Kernels from Old 94
4.3 Eigenfunction Analysis of Kernels 96
* 4.3.1 An Analytic Example 97
4.3.2 Numerical Approximation of Eigenfunctions 98
4.4 Kernels for Non-vectorial Inputs 99
4.4.1 String Kernels 100
4.4.2 Fisher Kernels 101
4.5 Exercises 102
5 Model Selection and Adaptation of Hyperparameters 105
5.1 The Model Selection Problem 106
5.2 Bayesian Model Selection 108
5.3 Cross-validation 111
5.4 Model Selection for GP Regression 112
5.4.1 Marginal Likelihood 112
5.4.2 Cross-validation 116
5.4.3 Examples and Discussion 118
5.5 Model Selection for GP Classification 124
* 5.5.1 Derivatives of the Marginal Likelihood for Laplace's Approximation 125
* 5.5.2 Derivatives of the Marginal Likelihood for EP 127
5.5.3 Cross-validation 127
5.5.4 Example 128
5.6 Exercises 128
6 Relationships between GPs and Other Models 129
6.1 Reproducing Kernel Hilbert Spaces 129
6.2 Regularization 132
* 6.2.1 Regularization Defined by Differential Operators 133
6.2.2 Obtaining the Regularized Solution 135
6.2.3 The Relationship of the Regularization View to Gaussian Process
Prediction 135
6.3 Spline Models 136
* 6.3.1 A 1-d Gaussian Process Spline Construction 138
* 6.4 Support Vector Machines 141
6.4.1 Support Vector Classification 141
6.4.2 Support Vector Regression 145
* 6.5 Least-squares Classification 146
6.5.1 Probabilistic Least-squares Classification 147
* 6.6 Relevance Vector Machines 149
6.7 Exercises 150
7 Theoretical Perspectives 151
7.1 The Equivalent Kernel 151
7.1.1 Some Specific Examples of Equivalent Kernels 153
* 7.2 Asymptotic Analysis 155
7.2.1 Consistency 155
7.2.2 Equivalence and Orthogonality 157
* 7.3 Average-case Learning Curves 159
* 7.4 PAC-Bayesian Analysis 161
7.4.1 The PAC Framework 162
7.4.2 PAC-Bayesian Analysis 163
7.4.3 PAC-Bayesian Analysis of GP Classification 164
7.5 Comparison with Other Supervised Learning Methods 165
* 7.6 Appendix: Learning Curve for the Ornstein-Uhlenbeck Process 168
7.7 Exercises 169
8 Approximation Methods for Large Datasets 171
8.1 Reduced-rank Approximations of the Gram Matrix 171
8.2 Greedy Approximation 174
8.3 Approximations for GPR with Fixed Hyperparameters 175
8.3.1 Subset of Regressors 175
8.3.2 The Nyström Method 177
8.3.3 Subset of Datapoints 177
8.3.4 Projected Process Approximation 178
8.3.5 Bayesian Committee Machine 180
8.3.6 Iterative Solution of Linear Systems 181
8.3.7 Comparison of Approximate GPR Methods 182
8.4 Approximations for GPC with Fixed Hyperparameters 185
* 8.5 Approximating the Marginal Likelihood and its Derivatives 185
* 8.6 Appendix: Equivalence of SR and GPR Using the Nyström Approximate
Kernel 187
8.7 Exercises 187
9 Further Issues and Conclusions 189
9.1 Multiple Outputs 190
9.2 Noise Models with Dependencies 190
9.3 Non-Gaussian Likelihoods 191
9.4 Derivative Observations 191
9.5 Prediction with Uncertain Inputs 192
9.6 Mixtures of Gaussian Processes 192
9.7 Global Optimization 193
9.8 Evaluation of Integrals 193
9.9 Student's t Process 194
9.10 Invariances 194
9.11 Latent Variable Models 196
9.12 Conclusions and Future Directions 196
Appendix A Mathematical Background 199
A.1 Joint, Marginal and Conditional Probability 199
A.2 Gaussian Identities 200
A.3 Matrix Identities 201
A.3.1 Matrix Derivatives 202
A.3.2 Matrix Norms 202
A.4 Cholesky Decomposition 202
A.5 Entropy and Kullback-Leibler Divergence 203
A.6 Limits 204
A.7 Measure and Integration 204
A.7.1 Lp Spaces 205
A.8 Fourier Transforms 205
A.9 Convexity 206
Appendix B Gaussian Markov Processes 207
B.1 Fourier Analysis 208
B.1.1 Sampling and Periodization 209
B.2 Continuous-time Gaussian Markov Processes 211
B.2.1 Continuous-time GMPs on ℝ 211
B.2.2 The Solution of the Corresponding SDE on the Circle 213
B.3 Discrete-time Gaussian Markov Processes 214
B.3.1 Discrete-time GMPs on ℤ 214
B.3.2 The Solution of the Corresponding Difference Equation on P_N 215
B.4 The Relationship Between Discrete-time and Sampled Continuous-time
GMPs 217
B.5 Markov Processes in Higher Dimensions 218
Appendix C Datasets and Code 221
Bibliography 223
Author Index 239
Subject Index 245
|
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | Rasmussen, Carl Edward |
author_GND | (DE-588)1137889306 (DE-588)1274480272 |
author_facet | Rasmussen, Carl Edward |
author_role | aut |
author_sort | Rasmussen, Carl Edward |
author_variant | c e r ce cer |
building | Verbundindex |
bvnumber | BV021552118 |
callnumber-first | Q - Science |
callnumber-label | QA274 |
callnumber-raw | QA274.4 |
callnumber-search | QA274.4 |
callnumber-sort | QA 3274.4 |
callnumber-subject | QA - Mathematics |
classification_rvk | ST 300 ST 304 SK 820 |
classification_tum | DAT 708f MAT 605f |
ctrlnum | (OCoLC)61285753 (DE-599)BVBBV021552118 |
dewey-full | 519.2/3 |
dewey-hundreds | 500 - Natural sciences and mathematics |
dewey-ones | 519 - Probabilities and applied mathematics |
dewey-raw | 519.2/3 |
dewey-search | 519.2/3 |
dewey-sort | 3519.2 13 |
dewey-tens | 510 - Mathematics |
discipline | Informatik Mathematik |
discipline_str_mv | Informatik Mathematik |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>02244nam a2200553zc 4500</leader><controlfield tag="001">BV021552118</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20221202 </controlfield><controlfield tag="007">t</controlfield><controlfield tag="008">060418s2006 xxud||| |||| 00||| eng d</controlfield><datafield tag="010" ind1=" " ind2=" "><subfield code="a">2005053433</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9780262182539</subfield><subfield code="9">978-0-262-18253-9</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">026218253X</subfield><subfield code="9">0-262-18253-X</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)61285753</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)BVBBV021552118</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">aacr</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="044" ind1=" " ind2=" "><subfield code="a">xxu</subfield><subfield code="c">US</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-91G</subfield><subfield code="a">DE-706</subfield><subfield code="a">DE-355</subfield><subfield code="a">DE-29T</subfield><subfield code="a">DE-83</subfield><subfield code="a">DE-11</subfield><subfield code="a">DE-91</subfield><subfield code="a">DE-188</subfield><subfield code="a">DE-384</subfield><subfield code="a">DE-703</subfield><subfield code="a">DE-739</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">QA274.4</subfield></datafield><datafield tag="082" ind1="0" ind2=" "><subfield code="a">519.2/3</subfield><subfield code="2">22</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 300</subfield><subfield code="0">(DE-625)143650:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 304</subfield><subfield code="0">(DE-625)143653:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">SK 820</subfield><subfield code="0">(DE-625)143258:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">DAT 708f</subfield><subfield code="2">stub</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">MAT 605f</subfield><subfield code="2">stub</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Rasmussen, Carl Edward</subfield><subfield code="e">Verfasser</subfield><subfield code="0">(DE-588)1137889306</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Gaussian processes for machine learning</subfield><subfield code="c">Carl Edward Rasmussen ; Christopher K.I. Williams</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Cambridge, Mass. [u.a.]</subfield><subfield code="b">MIT Press</subfield><subfield code="c">2006</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">XVIII, 248 S.</subfield><subfield code="b">graph. 
Darst.</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="490" ind1="0" ind2=" "><subfield code="a">Adaptive computation and machine learning</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Hier auch später erschienene, unveränderte Nachdrucke</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Apprentissage automatique - Modèles mathématiques</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Processus gaussiens - Informatique</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Datenverarbeitung</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Mathematisches Modell</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Gaussian processes</subfield><subfield code="x">Data processing</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Machine learning</subfield><subfield code="x">Mathematical models</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Gauß-Prozess</subfield><subfield code="0">(DE-588)4156111-9</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Maschinelles Lernen</subfield><subfield code="0">(DE-588)4193754-5</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Maschinelles Lernen</subfield><subfield code="0">(DE-588)4193754-5</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="1"><subfield code="a">Gauß-Prozess</subfield><subfield code="0">(DE-588)4156111-9</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Williams, Christopher K. I.</subfield><subfield code="d">ca. 20./21. Jh.</subfield><subfield code="e">Sonstige</subfield><subfield code="0">(DE-588)1274480272</subfield><subfield code="4">oth</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">HBZ Datenaustausch</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014768173&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="999" ind1=" " ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-014768173</subfield></datafield></record></collection> |
id | DE-604.BV021552118 |
illustrated | Illustrated |
index_date | 2024-07-02T14:32:04Z |
indexdate | 2024-07-09T20:38:27Z |
institution | BVB |
isbn | 9780262182539 026218253X |
language | English |
lccn | 2005053433 |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-014768173 |
oclc_num | 61285753 |
open_access_boolean | |
owner | DE-91G DE-BY-TUM DE-706 DE-355 DE-BY-UBR DE-29T DE-83 DE-11 DE-91 DE-BY-TUM DE-188 DE-384 DE-703 DE-739 |
owner_facet | DE-91G DE-BY-TUM DE-706 DE-355 DE-BY-UBR DE-29T DE-83 DE-11 DE-91 DE-BY-TUM DE-188 DE-384 DE-703 DE-739 |
physical | XVIII, 248 S. graph. Darst. |
publishDate | 2006 |
publishDateSearch | 2006 |
publishDateSort | 2006 |
publisher | MIT Press |
record_format | marc |
series2 | Adaptive computation and machine learning |
spelling | Rasmussen, Carl Edward Verfasser (DE-588)1137889306 aut Gaussian processes for machine learning Carl Edward Rasmussen ; Christopher K.I. Williams Cambridge, Mass. [u.a.] MIT Press 2006 XVIII, 248 S. graph. Darst. txt rdacontent n rdamedia nc rdacarrier Adaptive computation and machine learning Hier auch später erschienene, unveränderte Nachdrucke Apprentissage automatique - Modèles mathématiques Processus gaussiens - Informatique Datenverarbeitung Mathematisches Modell Gaussian processes Data processing Machine learning Mathematical models Gauß-Prozess (DE-588)4156111-9 gnd rswk-swf Maschinelles Lernen (DE-588)4193754-5 gnd rswk-swf Maschinelles Lernen (DE-588)4193754-5 s Gauß-Prozess (DE-588)4156111-9 s DE-604 Williams, Christopher K. I. ca. 20./21. Jh. Sonstige (DE-588)1274480272 oth HBZ Datenaustausch application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014768173&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Rasmussen, Carl Edward Gaussian processes for machine learning Apprentissage automatique - Modèles mathématiques Processus gaussiens - Informatique Datenverarbeitung Mathematisches Modell Gaussian processes Data processing Machine learning Mathematical models Gauß-Prozess (DE-588)4156111-9 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
subject_GND | (DE-588)4156111-9 (DE-588)4193754-5 |
title | Gaussian processes for machine learning |
title_auth | Gaussian processes for machine learning |
title_exact_search | Gaussian processes for machine learning |
title_exact_search_txtP | Gaussian processes for machine learning |
title_full | Gaussian processes for machine learning Carl Edward Rasmussen ; Christopher K.I. Williams |
title_fullStr | Gaussian processes for machine learning Carl Edward Rasmussen ; Christopher K.I. Williams |
title_full_unstemmed | Gaussian processes for machine learning Carl Edward Rasmussen ; Christopher K.I. Williams |
title_short | Gaussian processes for machine learning |
title_sort | gaussian processes for machine learning |
topic | Apprentissage automatique - Modèles mathématiques Processus gaussiens - Informatique Datenverarbeitung Mathematisches Modell Gaussian processes Data processing Machine learning Mathematical models Gauß-Prozess (DE-588)4156111-9 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
topic_facet | Apprentissage automatique - Modèles mathématiques Processus gaussiens - Informatique Datenverarbeitung Mathematisches Modell Gaussian processes Data processing Machine learning Mathematical models Gauß-Prozess Maschinelles Lernen |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014768173&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT rasmussencarledward gaussianprocessesformachinelearning AT williamschristopherki gaussianprocessesformachinelearning |
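The fullrecord field above carries the same bibliographic data as the MARC view, serialized as MARCXML. As a minimal sketch (not part of the catalogue record), the snippet below shows how the title (245 $a), author fields (100/700 $a) and ISBNs (020 $a) could be pulled out of such a MARCXML string using only Python's standard library; the variable name `marcxml`, the helper `subfields`, and the abbreviated sample string are illustrative placeholders standing in for the full XML stored in fullrecord.

```python
import xml.etree.ElementTree as ET

# Placeholder: in practice, paste the complete MARCXML from the "fullrecord" field here.
marcxml = """<collection xmlns="http://www.loc.gov/MARC21/slim">
  <record>
    <datafield tag="100" ind1="1" ind2=" ">
      <subfield code="a">Rasmussen, Carl Edward</subfield>
    </datafield>
    <datafield tag="245" ind1="1" ind2="0">
      <subfield code="a">Gaussian processes for machine learning</subfield>
    </datafield>
    <datafield tag="020" ind1=" " ind2=" ">
      <subfield code="a">9780262182539</subfield>
    </datafield>
  </record>
</collection>"""

NS = {"marc": "http://www.loc.gov/MARC21/slim"}

def subfields(record, tag, code):
    """Collect all values of subfield `code` from datafields with the given `tag`."""
    return [
        sf.text
        for df in record.findall(f"marc:datafield[@tag='{tag}']", NS)
        for sf in df.findall(f"marc:subfield[@code='{code}']", NS)
        if sf.text
    ]

root = ET.fromstring(marcxml)
for record in root.findall("marc:record", NS):
    print("Title:  ", subfields(record, "245", "a"))
    print("Authors:", subfields(record, "100", "a") + subfields(record, "700", "a"))
    print("ISBNs:  ", subfields(record, "020", "a"))
```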