Pattern recognition and machine learning:
Saved in:
Main author: | Bishop, Christopher M. 1959- |
---|---|
Format: | Book |
Language: | English |
Published: | New York, NY : Springer, 2006 |
Series: | Information science and statistics |
Subjects: | Machine learning; Pattern perception |
Online access: | Description; Publisher description; Table of contents only; Table of contents; Table of contents |
Notes: | Bibliography pages 711-728; this record also covers later unchanged reprints |
Physical description: | XX, 738 pages : illustrations, diagrams |
ISBN: | 0387310738; 9780387310732 |
Internal format
MARC
LEADER | 00000nam a2200000 c 4500 | ||
---|---|---|---|
001 | BV021648269 | ||
003 | DE-604 | ||
005 | 20230620 | ||
007 | t | ||
008 | 060707s2006 xxuad|| |||| 00||| eng d | ||
016 | 7 | |a 977347575 |2 DE-101 | |
020 | |a 0387310738 |c Gb. (Pr. in Vorb.) |9 0-387-31073-8 | ||
020 | |a 9780387310732 |9 978-0-387-31073-2 | ||
035 | |a (OCoLC)254896146 | ||
035 | |a (DE-599)BVBBV021648269 | ||
040 | |a DE-604 |b ger |e rakddb | ||
041 | 0 | |a eng | |
044 | |a xxu |c XD-US | ||
049 | |a DE-91G |a DE-703 |a DE-20 |a DE-473 |a DE-355 |a DE-739 |a DE-860 |a DE-384 |a DE-706 |a DE-573 |a DE-91 |a DE-19 |a DE-945 |a DE-898 |a DE-861 |a DE-29T |a DE-521 |a DE-83 |a DE-634 |a DE-11 |a DE-525 |a DE-188 |a DE-M382 | ||
050 | 0 | |a Q327 | |
082 | 0 | |a 006.4 | |
084 | |a QH 234 |0 (DE-625)141549: |2 rvk | ||
084 | |a ST 330 |0 (DE-625)143663: |2 rvk | ||
084 | |a 004 |2 sdnb | ||
084 | |a DAT 709f |2 stub | ||
084 | |a DAT 770f |2 stub | ||
100 | 1 | |a Bishop, Christopher M. |d 1959- |e Verfasser |0 (DE-588)120454165 |4 aut | |
245 | 1 | 0 | |a Pattern recognition and machine learning |c Christopher M. Bishop |
264 | 1 | |a New York, NY |b Springer |c 2006 | |
300 | |a XX, 738 S. |b Ill., graph. Darst. | ||
336 | |b txt |2 rdacontent | ||
337 | |b n |2 rdamedia | ||
338 | |b nc |2 rdacarrier | ||
490 | 0 | |a Information science and statistics | |
500 | |a Literaturverzeichnis Seite 711 - 728 | ||
500 | |a Hier auch später erschienene, unveränderte Nachdrucke | ||
650 | 4 | |a Maschinelles Lernen | |
650 | 4 | |a Mustererkennung | |
650 | 4 | |a Machine learning | |
650 | 4 | |a Pattern perception | |
650 | 0 | 7 | |a Mustererkennung |0 (DE-588)4040936-3 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |2 gnd |9 rswk-swf |
689 | 0 | 0 | |a Mustererkennung |0 (DE-588)4040936-3 |D s |
689 | 0 | 1 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |D s |
689 | 0 | |5 DE-604 | |
775 | 0 | 8 | |i Äquivalent |n Druck-Ausgabe, Paperback |d 2016 |z 978-1-4939-3843-8 |w (DE-604)BV044802275 |
856 | 4 | 2 | |q text/html |u http://deposit.dnb.de/cgi-bin/dokserv?id=2718189&prov=M&dok_var=1&dok_ext=htm |3 Beschreibung |
856 | 4 | 2 | |q text/html |u http://www.loc.gov/catdir/enhancements/fy0818/2006922522-d.html |3 Publisher description |
856 | 4 | 2 | |q text/html |u http://www.loc.gov/catdir/enhancements/fy0818/2006922522-t.html |3 Table of contents only |
856 | 4 | |u http://www3.ub.tu-berlin.de/ihv/001716289.pdf |3 Inhaltsverzeichnis | |
856 | 4 | 2 | |m SWBplus Fremddatenuebernahme |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014862953&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis |
943 | 1 | |a oai:aleph.bib-bvb.de:BVB01-014862953 |
Record in the search index
_version_ | 1805088371689652224 |
---|---|
adam_text |
Contents

Preface vii
Mathematical Notation xi
Contents xiii

1 Introduction 1
  1.1 Example: Polynomial Curve Fitting 4
  1.2 Probability Theory 12
    1.2.1 Probability densities 17
    1.2.2 Expectations and covariances 19
    1.2.3 Bayesian probabilities 21
    1.2.4 The Gaussian distribution 24
    1.2.5 Curve fitting re-visited 28
    1.2.6 Bayesian curve fitting 30
  1.3 Model Selection 32
  1.4 The Curse of Dimensionality 33
  1.5 Decision Theory 38
    1.5.1 Minimizing the misclassification rate 39
    1.5.2 Minimizing the expected loss 41
    1.5.3 The reject option 42
    1.5.4 Inference and decision 42
    1.5.5 Loss functions for regression 46
  1.6 Information Theory 48
    1.6.1 Relative entropy and mutual information 55
  Exercises 58

2 Probability Distributions 67
  2.1 Binary Variables 68
    2.1.1 The beta distribution 71
  2.2 Multinomial Variables 74
    2.2.1 The Dirichlet distribution 76
  2.3 The Gaussian Distribution 78
    2.3.1 Conditional Gaussian distributions 85
    2.3.2 Marginal Gaussian distributions 88
    2.3.3 Bayes' theorem for Gaussian variables 90
    2.3.4 Maximum likelihood for the Gaussian 93
    2.3.5 Sequential estimation 94
    2.3.6 Bayesian inference for the Gaussian 97
    2.3.7 Student's t-distribution 102
    2.3.8 Periodic variables 105
    2.3.9 Mixtures of Gaussians 110
  2.4 The Exponential Family 113
    2.4.1 Maximum likelihood and sufficient statistics 116
    2.4.2 Conjugate priors 117
    2.4.3 Noninformative priors 117
  2.5 Nonparametric Methods 120
    2.5.1 Kernel density estimators 122
    2.5.2 Nearest-neighbour methods 124
  Exercises 127

3 Linear Models for Regression 137
  3.1 Linear Basis Function Models 138
    3.1.1 Maximum likelihood and least squares 140
    3.1.2 Geometry of least squares 143
    3.1.3 Sequential learning 143
    3.1.4 Regularized least squares 144
    3.1.5 Multiple outputs 146
  3.2 The Bias-Variance Decomposition 147
  3.3 Bayesian Linear Regression 152
    3.3.1 Parameter distribution 152
    3.3.2 Predictive distribution 156
    3.3.3 Equivalent kernel 159
  3.4 Bayesian Model Comparison 161
  3.5 The Evidence Approximation 165
    3.5.1 Evaluation of the evidence function 166
    3.5.2 Maximizing the evidence function 168
    3.5.3 Effective number of parameters 170
  3.6 Limitations of Fixed Basis Functions 172
  Exercises 173

4 Linear Models for Classification 179
  4.1 Discriminant Functions 181
    4.1.1 Two classes 181
    4.1.2 Multiple classes 182
    4.1.3 Least squares for classification 184
    4.1.4 Fisher's linear discriminant 186
    4.1.5 Relation to least squares 189
    4.1.6 Fisher's discriminant for multiple classes 191
    4.1.7 The perceptron algorithm 192
  4.2 Probabilistic Generative Models 196
    4.2.1 Continuous inputs 198
    4.2.2 Maximum likelihood solution 200
    4.2.3 Discrete features 202
    4.2.4 Exponential family 202
  4.3 Probabilistic Discriminative Models 203
    4.3.1 Fixed basis functions 204
    4.3.2 Logistic regression 205
    4.3.3 Iterative reweighted least squares 207
    4.3.4 Multiclass logistic regression 209
    4.3.5 Probit regression 210
    4.3.6 Canonical link functions 212
  4.4 The Laplace Approximation 213
    4.4.1 Model comparison and BIC 216
  4.5 Bayesian Logistic Regression 217
    4.5.1 Laplace approximation 217
    4.5.2 Predictive distribution 218
  Exercises 220

5 Neural Networks 225
  5.1 Feed-forward Network Functions 227
    5.1.1 Weight-space symmetries 231
  5.2 Network Training 232
    5.2.1 Parameter optimization 236
    5.2.2 Local quadratic approximation 237
    5.2.3 Use of gradient information 239
    5.2.4 Gradient descent optimization 240
  5.3 Error Backpropagation 241
    5.3.1 Evaluation of error-function derivatives 242
    5.3.2 A simple example 245
    5.3.3 Efficiency of backpropagation 246
    5.3.4 The Jacobian matrix 247
  5.4 The Hessian Matrix 249
    5.4.1 Diagonal approximation 250
    5.4.2 Outer product approximation 251
    5.4.3 Inverse Hessian 252
    5.4.4 Finite differences 252
    5.4.5 Exact evaluation of the Hessian 253
    5.4.6 Fast multiplication by the Hessian 254
  5.5 Regularization in Neural Networks 256
    5.5.1 Consistent Gaussian priors 257
    5.5.2 Early stopping 259
    5.5.3 Invariances 261
    5.5.4 Tangent propagation 263
    5.5.5 Training with transformed data 265
    5.5.6 Convolutional networks 267
    5.5.7 Soft weight sharing 269
  5.6 Mixture Density Networks 272
  5.7 Bayesian Neural Networks 277
    5.7.1 Posterior parameter distribution 278
    5.7.2 Hyperparameter optimization 280
    5.7.3 Bayesian neural networks for classification 281
  Exercises 284

6 Kernel Methods 291
  6.1 Dual Representations 293
  6.2 Constructing Kernels 294
  6.3 Radial Basis Function Networks 299
    6.3.1 Nadaraya-Watson model 301
  6.4 Gaussian Processes 303
    6.4.1 Linear regression revisited 304
    6.4.2 Gaussian processes for regression 306
    6.4.3 Learning the hyperparameters 311
    6.4.4 Automatic relevance determination 312
    6.4.5 Gaussian processes for classification 313
    6.4.6 Laplace approximation 315
    6.4.7 Connection to neural networks 319
  Exercises 320

7 Sparse Kernel Machines 325
  7.1 Maximum Margin Classifiers 326
    7.1.1 Overlapping class distributions 331
    7.1.2 Relation to logistic regression 336
    7.1.3 Multiclass SVMs 338
    7.1.4 SVMs for regression 339
    7.1.5 Computational learning theory 344
  7.2 Relevance Vector Machines 345
    7.2.1 RVM for regression 345
    7.2.2 Analysis of sparsity 349
    7.2.3 RVM for classification 353
  Exercises 357

8 Graphical Models 359
  8.1 Bayesian Networks 360
    8.1.1 Example: Polynomial regression 362
    8.1.2 Generative models 365
    8.1.3 Discrete variables 366
    8.1.4 Linear-Gaussian models 370
  8.2 Conditional Independence 372
    8.2.1 Three example graphs 373
    8.2.2 D-separation 378
  8.3 Markov Random Fields 383
    8.3.1 Conditional independence properties 383
    8.3.2 Factorization properties 384
    8.3.3 Illustration: Image de-noising 387
    8.3.4 Relation to directed graphs 390
  8.4 Inference in Graphical Models 393
    8.4.1 Inference on a chain 394
    8.4.2 Trees 398
    8.4.3 Factor graphs 399
    8.4.4 The sum-product algorithm 402
    8.4.5 The max-sum algorithm 411
    8.4.6 Exact inference in general graphs 416
    8.4.7 Loopy belief propagation 417
    8.4.8 Learning the graph structure 418
  Exercises 418

9 Mixture Models and EM 423
  9.1 K-means Clustering 424
    9.1.1 Image segmentation and compression 428
  9.2 Mixtures of Gaussians 430
    9.2.1 Maximum likelihood 432
    9.2.2 EM for Gaussian mixtures 435
  9.3 An Alternative View of EM 439
    9.3.1 Gaussian mixtures revisited 441
    9.3.2 Relation to K-means 443
    9.3.3 Mixtures of Bernoulli distributions 444
    9.3.4 EM for Bayesian linear regression 448
  9.4 The EM Algorithm in General 450
  Exercises 455

10 Approximate Inference 461
  10.1 Variational Inference 462
    10.1.1 Factorized distributions 464
    10.1.2 Properties of factorized approximations 466
    10.1.3 Example: The univariate Gaussian 470
    10.1.4 Model comparison 473
  10.2 Illustration: Variational Mixture of Gaussians 474
    10.2.1 Variational distribution 475
    10.2.2 Variational lower bound 481
    10.2.3 Predictive density 482
    10.2.4 Determining the number of components 483
    10.2.5 Induced factorizations 485
  10.3 Variational Linear Regression 486
    10.3.1 Variational distribution 486
    10.3.2 Predictive distribution 488
    10.3.3 Lower bound 489
  10.4 Exponential Family Distributions 490
    10.4.1 Variational message passing 491
  10.5 Local Variational Methods 493
  10.6 Variational Logistic Regression 498
    10.6.1 Variational posterior distribution 498
    10.6.2 Optimizing the variational parameters 500
    10.6.3 Inference of hyperparameters 502
  10.7 Expectation Propagation 505
    10.7.1 Example: The clutter problem 511
    10.7.2 Expectation propagation on graphs 513
  Exercises 517

11 Sampling Methods 523
  11.1 Basic Sampling Algorithms 526
    11.1.1 Standard distributions 526
    11.1.2 Rejection sampling 528
    11.1.3 Adaptive rejection sampling 530
    11.1.4 Importance sampling 532
    11.1.5 Sampling-importance-resampling 534
    11.1.6 Sampling and the EM algorithm 536
  11.2 Markov Chain Monte Carlo 537
    11.2.1 Markov chains 539
    11.2.2 The Metropolis-Hastings algorithm 541
  11.3 Gibbs Sampling 542
  11.4 Slice Sampling 546
  11.5 The Hybrid Monte Carlo Algorithm 548
    11.5.1 Dynamical systems 548
    11.5.2 Hybrid Monte Carlo 552
  11.6 Estimating the Partition Function 554
  Exercises 556

12 Continuous Latent Variables 559
  12.1 Principal Component Analysis 561
    12.1.1 Maximum variance formulation 561
    12.1.2 Minimum-error formulation 563
    12.1.3 Applications of PCA 565
    12.1.4 PCA for high-dimensional data 569
  12.2 Probabilistic PCA 570
    12.2.1 Maximum likelihood PCA 574
    12.2.2 EM algorithm for PCA 577
    12.2.3 Bayesian PCA 580
    12.2.4 Factor analysis 583
  12.3 Kernel PCA 586
  12.4 Nonlinear Latent Variable Models 591
    12.4.1 Independent component analysis 591
    12.4.2 Autoassociative neural networks 592
    12.4.3 Modelling nonlinear manifolds 595
  Exercises 599

13 Sequential Data 605
  13.1 Markov Models 607
  13.2 Hidden Markov Models 610
    13.2.1 Maximum likelihood for the HMM 615
    13.2.2 The forward-backward algorithm 618
    13.2.3 The sum-product algorithm for the HMM 625
    13.2.4 Scaling factors 627
    13.2.5 The Viterbi algorithm 629
    13.2.6 Extensions of the hidden Markov model 631
  13.3 Linear Dynamical Systems 635
    13.3.1 Inference in LDS 638
    13.3.2 Learning in LDS 642
    13.3.3 Extensions of LDS 644
    13.3.4 Particle filters 645
  Exercises 646

14 Combining Models 653
  14.1 Bayesian Model Averaging 654
  14.2 Committees 655
  14.3 Boosting 657
    14.3.1 Minimizing exponential error 659
    14.3.2 Error functions for boosting 661
  14.4 Tree-based Models 663
  14.5 Conditional Mixture Models 666
    14.5.1 Mixtures of linear regression models 667
    14.5.2 Mixtures of logistic models 670
    14.5.3 Mixtures of experts 672
  Exercises 674

Appendix A Data Sets 677
Appendix B Probability Distributions 685
Appendix C Properties of Matrices 695
Appendix D Calculus of Variations 703
Appendix E Lagrange Multipliers 707
References 711

© Christopher M. Bishop (2002-2006). Springer, 2006. First printing. Further information available at http://research.microsoft.com/~cmbishop/PRML |
. . . . 280 5.7.3 BAYESIAN NEURAL NETWORKS FOR CLASSIFICATION . . . . .
. . . . 281 EXERCISES . . . . . . . . . . . . . . . . . . . . . . . . .
. . . . . . . . . 284 6 KERNEL METHODS 291 6.1 DUAL REPRESENTATIONS . .
. . . . . . . . . . . . . . . . . . . . . . . 293 6.2 CONSTRUCTING
KERNELS . . . . . . . . . . . . . . . . . . . . . . . . . 294 6.3 RADIAL
BASIS FUNCTION NETWORKS . . . . . . . . . . . . . . . . . . . 299 6.3.1
NADARAYA-WATSON MODEL . . . . . . . . . . . . . . . . . . . 301 6.4
GAUSSIAN PROCESSES . . . . . . . . . . . . . . . . . . . . . . . . . .
303 6.4.1 LINEAR REGRESSION REVISITED . . . . . . . . . . . . . . . . .
. 304 6.4.2 GAUSSIAN PROCESSES FOR REGRESSION . . . . . . . . . . . . .
. 306 6.4.3 LEARNING THE HYPERPARAMETERS . . . . . . . . . . . . . . . .
311 6.4.4 AUTOMATIC RELEVANCE DETERMINATION . . . . . . . . . . . . .
312 6.4.5 GAUSSIAN PROCESSES FOR CLASSIFICATION . . . . . . . . . . . .
. 313 6.4.6 LAPLACE APPROXIMATION . . . . . . . . . . . . . . . . . . .
. 315 6.4.7 CONNECTION TO NEURAL NETWORKS . . . . . . . . . . . . . . .
. 319 EXERCISES . . . . . . . . . . . . . . . . . . . . . . . . . . . .
. . . . . . 320 7 SPARSE KERNEL MACHINES 325 7.1 MAXIMUM MARGIN
CLASSIFIERS . . . . . . . . . . . . . . . . . . . . 326 7.1.1
OVERLAPPING CLASS DISTRIBUTIONS . . . . . . . . . . . . . . . . 331
7.1.2 RELATION TO LOGISTIC REGRESSION . . . . . . . . . . . . . . . .
336 7.1.3 MULTICLASS SVMS . . . . . . . . . . . . . . . . . . . . . . .
338 7.1.4 SVMS FOR REGRESSION . . . . . . . . . . . . . . . . . . . . .
339 7.1.5 COMPUTATIONAL LEARNING THEORY . . . . . . . . . . . . . . . .
344 7.2 RELEVANCE VECTOR MACHINES . . . . . . . . . . . . . . . . . . .
. . 345 7.2.1 RVM FOR REGRESSION . . . . . . . . . . . . . . . . . . . .
. . 345 7.2.2 ANALYSIS OF SPARSITY . . . . . . . . . . . . . . . . . . .
. . . 349 7.2.3 RVM FOR CLASSIFICATION . . . . . . . . . . . . . . . . .
. . . 353 EXERCISES . . . . . . . . . . . . . . . . . . . . . . . . . .
. . . . . . . . 357 C * CHRISTPHER M. BISHOP (2002*2006). SPRINGER,
2006. FIRST PRINTING. FURTHER INFORMATION AVAILABLE AT
HTTP://RESEARCH.MICROSOFT.COM/ * CMBISHOP/PRML CONTENTS XVII 8 GRAPHICAL
MODELS 359 8.1 BAYESIAN NETWORKS . . . . . . . . . . . . . . . . . . . .
. . . . . . 360 8.1.1 EXAMPLE: POLYNOMIAL REGRESSION . . . . . . . . . .
. . . . . 362 8.1.2 GENERATIVE MODELS . . . . . . . . . . . . . . . . .
. . . . . 365 8.1.3 DISCRETE VARIABLES . . . . . . . . . . . . . . . . .
. . . . . . 366 8.1.4 LINEAR-GAUSSIAN MODELS . . . . . . . . . . . . . .
. . . . . 370 8.2 CONDITIONAL INDEPENDENCE . . . . . . . . . . . . . . .
. . . . . . . 372 8.2.1 THREE EXAMPLE GRAPHS . . . . . . . . . . . . . .
. . . . . . 373 8.2.2 D-SEPARATION . . . . . . . . . . . . . . . . . . .
. . . . . . 378 8.3 MARKOV RANDOM FIELDS . . . . . . . . . . . . . . . .
. . . . . . . 383 8.3.1 CONDITIONAL INDEPENDENCE PROPERTIES . . . . . .
. . . . . . . 383 8.3.2 FACTORIZATION PROPERTIES . . . . . . . . . . . .
. . . . . . . 384 8.3.3 ILLUSTRATION: IMAGE DE-NOISING . . . . . . . . .
. . . . . . . 387 8.3.4 RELATION TO DIRECTED GRAPHS . . . . . . . . . .
. . . . . . . . 390 8.4 INFERENCE IN GRAPHICAL MODELS . . . . . . . . .
. . . . . . . . . . . 393 8.4.1 INFERENCE ON A CHAIN . . . . . . . . . .
. . . . . . . . . . . 394 8.4.2 TREES . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . 398 8.4.3 FACTOR GRAPHS . . . . . . . . . . . .
. . . . . . . . . . . . . 399 8.4.4 THE SUM-PRODUCT ALGORITHM . . . . .
. . . . . . . . . . . . . 402 8.4.5 THE MAX-SUM ALGORITHM . . . . . . .
. . . . . . . . . . . . 411 8.4.6 EXACT INFERENCE IN GENERAL GRAPHS . .
. . . . . . . . . . . . 416 8.4.7 LOOPY BELIEF PROPAGATION . . . . . . .
. . . . . . . . . . . . 417 8.4.8 LEARNING THE GRAPH STRUCTURE . . . . .
. . . . . . . . . . . . 418 EXERCISES . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . 418 9 MIXTURE MODELS AND EM 423 9.1 K
-MEANS CLUSTERING . . . . . . . . . . . . . . . . . . . . . . . . . 424
9.1.1 IMAGE SEGMENTATION AND COMPRESSION . . . . . . . . . . . . 428 9.2
MIXTURES OF GAUSSIANS . . . . . . . . . . . . . . . . . . . . . . . .
430 9.2.1 MAXIMUM LIKELIHOOD . . . . . . . . . . . . . . . . . . . . .
432 9.2.2 EM FOR GAUSSIAN MIXTURES . . . . . . . . . . . . . . . . . .
435 9.3 AN ALTERNATIVE VIEW OF EM . . . . . . . . . . . . . . . . . . .
. . 439 9.3.1 GAUSSIAN MIXTURES REVISITED . . . . . . . . . . . . . . .
. . 441 9.3.2 RELATION TO K -MEANS . . . . . . . . . . . . . . . . . . .
. . 443 9.3.3 MIXTURES OF BERNOULLI DISTRIBUTIONS . . . . . . . . . . .
. . . 444 9.3.4 EM FOR BAYESIAN LINEAR REGRESSION . . . . . . . . . . .
. . . 448 9.4 THE EM ALGORITHM IN GENERAL . . . . . . . . . . . . . . .
. . . . . 450 EXERCISES . . . . . . . . . . . . . . . . . . . . . . . .
. . . . . . . . . . 455 10 APPROXIMATE INFERENCE 461 10.1 VARIATIONAL
INFERENCE . . . . . . . . . . . . . . . . . . . . . . . . . 462 10.1.1
FACTORIZED DISTRIBUTIONS . . . . . . . . . . . . . . . . . . . . 464
10.1.2 PROPERTIES OF FACTORIZED APPROXIMATIONS . . . . . . . . . . . 466
10.1.3 EXAMPLE: THE UNIVARIATE GAUSSIAN . . . . . . . . . . . . . . 470
10.1.4 MODEL COMPARISON . . . . . . . . . . . . . . . . . . . . . . 473
10.2 ILLUSTRATION: VARIATIONAL MIXTURE OF GAUSSIANS . . . . . . . . . .
. . 474 C * CHRISTPHER M. BISHOP (2002*2006). SPRINGER, 2006. FIRST
PRINTING. FURTHER INFORMATION AVAILABLE AT
HTTP://RESEARCH.MICROSOFT.COM/ * CMBISHOP/PRML XVIII CONTENTS 10.2.1
VARIATIONAL DISTRIBUTION . . . . . . . . . . . . . . . . . . . . 475
10.2.2 VARIATIONAL LOWER BOUND . . . . . . . . . . . . . . . . . . . 481
10.2.3 PREDICTIVE DENSITY . . . . . . . . . . . . . . . . . . . . . . .
482 10.2.4 DETERMINING THE NUMBER OF COMPONENTS . . . . . . . . . . .
483 10.2.5 INDUCED FACTORIZATIONS . . . . . . . . . . . . . . . . . . .
. 485 10.3 VARIATIONAL LINEAR REGRESSION . . . . . . . . . . . . . . . .
. . . . 486 10.3.1 VARIATIONAL DISTRIBUTION . . . . . . . . . . . . . .
. . . . . . 486 10.3.2 PREDICTIVE DISTRIBUTION . . . . . . . . . . . . .
. . . . . . . 488 10.3.3 LOWER BOUND . . . . . . . . . . . . . . . . . .
. . . . . . . 489 10.4 EXPONENTIAL FAMILY DISTRIBUTIONS . . . . . . . .
. . . . . . . . . . 490 10.4.1 VARIATIONAL MESSAGE PASSING . . . . . . .
. . . . . . . . . . 491 10.5 LOCAL VARIATIONAL METHODS . . . . . . . . .
. . . . . . . . . . . . . 493 10.6 VARIATIONAL LOGISTIC REGRESSION . . .
. . . . . . . . . . . . . . . . 498 10.6.1 VARIATIONAL POSTERIOR
DISTRIBUTION . . . . . . . . . . . . . . . 498 10.6.2 OPTIMIZING THE
VARIATIONAL PARAMETERS . . . . . . . . . . . . 500 10.6.3 INFERENCE OF
HYPERPARAMETERS . . . . . . . . . . . . . . . . 502 10.7 EXPECTATION
PROPAGATION . . . . . . . . . . . . . . . . . . . . . . . 505 10.7.1
EXAMPLE: THE CLUTTER PROBLEM . . . . . . . . . . . . . . . . 511 10.7.2
EXPECTATION PROPAGATION ON GRAPHS . . . . . . . . . . . . . . 513
EXERCISES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
. . . 517 11 SAMPLING METHODS 523 11.1 BASIC SAMPLING ALGORITHMS . . . .
. . . . . . . . . . . . . . . . . 526 11.1.1 STANDARD DISTRIBUTIONS . .
. . . . . . . . . . . . . . . . . . 526 11.1.2 REJECTION SAMPLING . . .
. . . . . . . . . . . . . . . . . . . 528 11.1.3 ADAPTIVE REJECTION
SAMPLING . . . . . . . . . . . . . . . . . 530 11.1.4 IMPORTANCE
SAMPLING . . . . . . . . . . . . . . . . . . . . . 532 11.1.5
SAMPLING-IMPORTANCE-RESAMPLING . . . . . . . . . . . . . . 534 11.1.6
SAMPLING AND THE EM ALGORITHM . . . . . . . . . . . . . . . 536 11.2
MARKOV CHAIN MONTE CARLO . . . . . . . . . . . . . . . . . . . . . 537
11.2.1 MARKOV CHAINS . . . . . . . . . . . . . . . . . . . . . . . . 539
11.2.2 THE METROPOLIS-HASTINGS ALGORITHM . . . . . . . . . . . . . 541
11.3 GIBBS SAMPLING . . . . . . . . . . . . . . . . . . . . . . . . . .
. 542 11.4 SLICE SAMPLING . . . . . . . . . . . . . . . . . . . . . . .
. . . . . 546 11.5 THE HYBRID MONTE CARLO ALGORITHM . . . . . . . . . .
. . . . . . . 548 11.5.1 DYNAMICAL SYSTEMS . . . . . . . . . . . . . . .
. . . . . . . 548 11.5.2 HYBRID MONTE CARLO . . . . . . . . . . . . . .
. . . . . . . 552 11.6 ESTIMATING THE PARTITION FUNCTION . . . . . . . .
. . . . . . . . . . 554 EXERCISES . . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . 556 12 CONTINUOUS LATENT VARIABLES 559
12.1 PRINCIPAL COMPONENT ANALYSIS . . . . . . . . . . . . . . . . . . .
. 561 12.1.1 MAXIMUM VARIANCE FORMULATION . . . . . . . . . . . . . . .
561 12.1.2 MINIMUM-ERROR FORMULATION . . . . . . . . . . . . . . . . .
563 12.1.3 APPLICATIONS OF PCA . . . . . . . . . . . . . . . . . . . . .
565 12.1.4 PCA FOR HIGH-DIMENSIONAL DATA . . . . . . . . . . . . . . .
569 C * CHRISTPHER M. BISHOP (2002*2006). SPRINGER, 2006. FIRST
PRINTING. FURTHER INFORMATION AVAILABLE AT
HTTP://RESEARCH.MICROSOFT.COM/ * CMBISHOP/PRML CONTENTS XIX 12.2
PROBABILISTIC PCA . . . . . . . . . . . . . . . . . . . . . . . . . .
570 12.2.1 MAXIMUM LIKELIHOOD PCA . . . . . . . . . . . . . . . . . .
574 12.2.2 EM ALGORITHM FOR PCA . . . . . . . . . . . . . . . . . . . .
577 12.2.3 BAYESIAN PCA . . . . . . . . . . . . . . . . . . . . . . . .
580 12.2.4 FACTOR ANALYSIS . . . . . . . . . . . . . . . . . . . . . . .
. 583 12.3 KERNEL PCA . . . . . . . . . . . . . . . . . . . . . . . . .
. . . . . 586 12.4 NONLINEAR LATENT VARIABLE MODELS . . . . . . . . . .
. . . . . . . . 591 12.4.1 INDEPENDENT COMPONENT ANALYSIS . . . . . . .
. . . . . . . . 591 12.4.2 AUTOASSOCIATIVE NEURAL NETWORKS . . . . . . .
. . . . . . . . 592 12.4.3 MODELLING NONLINEAR MANIFOLDS . . . . . . . .
. . . . . . . . 595 EXERCISES . . . . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . 599 13 SEQUENTIAL DATA 605 13.1 MARKOV MODELS
. . . . . . . . . . . . . . . . . . . . . . . . . . . . 607 13.2 HIDDEN
MARKOV MODELS . . . . . . . . . . . . . . . . . . . . . . . 610 13.2.1
MAXIMUM LIKELIHOOD FOR THE HMM . . . . . . . . . . . . . 615 13.2.2 THE
FORWARD-BACKWARD ALGORITHM . . . . . . . . . . . . . . 618 13.2.3 THE
SUM-PRODUCT ALGORITHM FOR THE HMM . . . . . . . . . . 625 13.2.4 SCALING
FACTORS . . . . . . . . . . . . . . . . . . . . . . . . 627 13.2.5 THE
VITERBI ALGORITHM . . . . . . . . . . . . . . . . . . . . . 629 13.2.6
EXTENSIONS OF THE HIDDEN MARKOV MODEL . . . . . . . . . . . 631 13.3
LINEAR DYNAMICAL SYSTEMS . . . . . . . . . . . . . . . . . . . . . . 635
13.3.1 INFERENCE IN LDS . . . . . . . . . . . . . . . . . . . . . . .
638 13.3.2 LEARNING IN LDS . . . . . . . . . . . . . . . . . . . . . . .
642 13.3.3 EXTENSIONS OF LDS . . . . . . . . . . . . . . . . . . . . . .
644 13.3.4 PARTICLE FILTERS . . . . . . . . . . . . . . . . . . . . . .
. . . 645 EXERCISES . . . . . . . . . . . . . . . . . . . . . . . . . .
. . . . . . . . 646 14 COMBINING MODELS 653 14.1 BAYESIAN MODEL
AVERAGING . . . . . . . . . . . . . . . . . . . . . . 654 14.2
COMMITTEES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
655 14.3 BOOSTING . . . . . . . . . . . . . . . . . . . . . . . . . . .
. . . . 657 14.3.1 MINIMIZING EXPONENTIAL ERROR . . . . . . . . . . . .
. . . . 659 14.3.2 ERROR FUNCTIONS FOR BOOSTING . . . . . . . . . . . .
. . . . . 661 14.4 TREE-BASED MODELS . . . . . . . . . . . . . . . . . .
. . . . . . . . 663 14.5 CONDITIONAL MIXTURE MODELS . . . . . . . . . .
. . . . . . . . . . . 666 14.5.1 MIXTURES OF LINEAR REGRESSION MODELS .
. . . . . . . . . . . . 667 14.5.2 MIXTURES OF LOGISTIC MODELS . . . . .
. . . . . . . . . . . . 670 14.5.3 MIXTURES OF EXPERTS . . . . . . . . .
. . . . . . . . . . . . . 672 EXERCISES . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . . 674 APPENDIX A DATA SETS 677
APPENDIX B PROBABILITY DISTRIBUTIONS 685 APPENDIX C PROPERTIES OF
MATRICES 695 C * CHRISTPHER M. BISHOP (2002*2006). SPRINGER, 2006. FIRST
PRINTING. FURTHER INFORMATION AVAILABLE AT
HTTP://RESEARCH.MICROSOFT.COM/ * CMBISHOP/PRML XX CONTENTS APPENDIX D
CALCULUS OF VARIATIONS 703 APPENDIX E LAGRANGE MULTIPLIERS 707
REFERENCES 711 C * CHRISTPHER M. BISHOP (2002*2006). SPRINGER, 2006.
FIRST PRINTING. FURTHER INFORMATION AVAILABLE AT
HTTP://RESEARCH.MICROSOFT.COM/ * CMBISHOP/PRML |
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | Bishop, Christopher M. 1959- |
author_GND | (DE-588)120454165 |
author_facet | Bishop, Christopher M. 1959- |
author_role | aut |
author_sort | Bishop, Christopher M. 1959- |
author_variant | c m b cm cmb |
building | Verbundindex |
bvnumber | BV021648269 |
callnumber-first | Q - Science |
callnumber-label | Q327 |
callnumber-raw | Q327 |
callnumber-search | Q327 |
callnumber-sort | Q 3327 |
callnumber-subject | Q - General Science |
classification_rvk | QH 234 ST 330 |
classification_tum | DAT 709f DAT 770f |
ctrlnum | (OCoLC)254896146 (DE-599)BVBBV021648269 |
dewey-full | 006.4 |
dewey-hundreds | 000 - Computer science, information, general works |
dewey-ones | 006 - Special computer methods |
dewey-raw | 006.4 |
dewey-search | 006.4 |
dewey-sort | 16.4 |
dewey-tens | 000 - Computer science, information, general works |
discipline | Informatik Wirtschaftswissenschaften |
discipline_str_mv | Informatik Wirtschaftswissenschaften |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>00000nam a2200000 c 4500</leader><controlfield tag="001">BV021648269</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20230620</controlfield><controlfield tag="007">t</controlfield><controlfield tag="008">060707s2006 xxuad|| |||| 00||| eng d</controlfield><datafield tag="016" ind1="7" ind2=" "><subfield code="a">977347575</subfield><subfield code="2">DE-101</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">0387310738</subfield><subfield code="c">Gb. (Pr. in Vorb.)</subfield><subfield code="9">0-387-31073-8</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9780387310732</subfield><subfield code="9">978-0-387-31073-2</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)254896146</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)BVBBV021648269</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">rakddb</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="044" ind1=" " ind2=" "><subfield code="a">xxu</subfield><subfield code="c">XD-US</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-91G</subfield><subfield code="a">DE-703</subfield><subfield code="a">DE-20</subfield><subfield code="a">DE-473</subfield><subfield code="a">DE-355</subfield><subfield code="a">DE-739</subfield><subfield code="a">DE-860</subfield><subfield code="a">DE-384</subfield><subfield code="a">DE-706</subfield><subfield code="a">DE-573</subfield><subfield code="a">DE-91</subfield><subfield code="a">DE-19</subfield><subfield code="a">DE-945</subfield><subfield code="a">DE-898</subfield><subfield 
code="a">DE-861</subfield><subfield code="a">DE-29T</subfield><subfield code="a">DE-521</subfield><subfield code="a">DE-83</subfield><subfield code="a">DE-634</subfield><subfield code="a">DE-11</subfield><subfield code="a">DE-525</subfield><subfield code="a">DE-188</subfield><subfield code="a">DE-M382</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">Q327</subfield></datafield><datafield tag="082" ind1="0" ind2=" "><subfield code="a">006.4</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">QH 234</subfield><subfield code="0">(DE-625)141549:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 330</subfield><subfield code="0">(DE-625)143663:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">004</subfield><subfield code="2">sdnb</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">DAT 709f</subfield><subfield code="2">stub</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">DAT 770f</subfield><subfield code="2">stub</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Bishop, Christopher M.</subfield><subfield code="d">1959-</subfield><subfield code="e">Verfasser</subfield><subfield code="0">(DE-588)120454165</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Pattern recognition and machine learning</subfield><subfield code="c">Christopher M. Bishop</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">New York, NY</subfield><subfield code="b">Springer</subfield><subfield code="c">2006</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">XX, 738 S.</subfield><subfield code="b">Ill., graph. 
Darst.</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="490" ind1="0" ind2=" "><subfield code="a">Information science and statistics</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Literaturverzeichnis Seite 711 - 728</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Hier auch später erschienene, unveränderte Nachdrucke</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Maschinelles Lernen</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Mustererkennung</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Machine learning</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Pattern perception</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Mustererkennung</subfield><subfield code="0">(DE-588)4040936-3</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Maschinelles Lernen</subfield><subfield code="0">(DE-588)4193754-5</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Mustererkennung</subfield><subfield code="0">(DE-588)4040936-3</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="1"><subfield code="a">Maschinelles Lernen</subfield><subfield code="0">(DE-588)4193754-5</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield 
code="5">DE-604</subfield></datafield><datafield tag="775" ind1="0" ind2="8"><subfield code="i">Äquivalent</subfield><subfield code="n">Druck-Ausgabe, Paperback</subfield><subfield code="d">2016</subfield><subfield code="z">978-1-4939-3843-8</subfield><subfield code="w">(DE-604)BV044802275</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="q">text/html</subfield><subfield code="u">http://deposit.dnb.de/cgi-bin/dokserv?id=2718189&prov=M&dok_var=1&dok_ext=htm</subfield><subfield code="3">Beschreibung</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="q">text/html</subfield><subfield code="u">http://www.loc.gov/catdir/enhancements/fy0818/2006922522-d.html</subfield><subfield code="3">Publisher description</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="q">text/html</subfield><subfield code="u">http://www.loc.gov/catdir/enhancements/fy0818/2006922522-t.html</subfield><subfield code="3">Table of contents only</subfield></datafield><datafield tag="856" ind1="4" ind2=" "><subfield code="u">http://www3.ub.tu-berlin.de/ihv/001716289.pdf</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">SWBplus Fremddatenuebernahme</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014862953&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="943" ind1="1" ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-014862953</subfield></datafield></record></collection> |
id | DE-604.BV021648269 |
illustrated | Illustrated |
index_date | 2024-07-02T15:02:12Z |
indexdate | 2024-07-20T09:06:57Z |
institution | BVB |
isbn | 0387310738 9780387310732 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-014862953 |
oclc_num | 254896146 |
open_access_boolean | |
owner | DE-91G DE-BY-TUM DE-703 DE-20 DE-473 DE-BY-UBG DE-355 DE-BY-UBR DE-739 DE-860 DE-384 DE-706 DE-573 DE-91 DE-BY-TUM DE-19 DE-BY-UBM DE-945 DE-898 DE-BY-UBR DE-861 DE-29T DE-521 DE-83 DE-634 DE-11 DE-525 DE-188 DE-M382 |
owner_facet | DE-91G DE-BY-TUM DE-703 DE-20 DE-473 DE-BY-UBG DE-355 DE-BY-UBR DE-739 DE-860 DE-384 DE-706 DE-573 DE-91 DE-BY-TUM DE-19 DE-BY-UBM DE-945 DE-898 DE-BY-UBR DE-861 DE-29T DE-521 DE-83 DE-634 DE-11 DE-525 DE-188 DE-M382 |
physical | XX, 738 S. Ill., graph. Darst. |
publishDate | 2006 |
publishDateSearch | 2006 |
publishDateSort | 2006 |
publisher | Springer |
record_format | marc |
series2 | Information science and statistics |
spelling | Bishop, Christopher M. 1959- Verfasser (DE-588)120454165 aut Pattern recognition and machine learning Christopher M. Bishop New York, NY Springer 2006 XX, 738 S. Ill., graph. Darst. txt rdacontent n rdamedia nc rdacarrier Information science and statistics Literaturverzeichnis Seite 711 - 728 Hier auch später erschienene, unveränderte Nachdrucke Maschinelles Lernen Mustererkennung Machine learning Pattern perception Mustererkennung (DE-588)4040936-3 gnd rswk-swf Maschinelles Lernen (DE-588)4193754-5 gnd rswk-swf Mustererkennung (DE-588)4040936-3 s Maschinelles Lernen (DE-588)4193754-5 s DE-604 Äquivalent Druck-Ausgabe, Paperback 2016 978-1-4939-3843-8 (DE-604)BV044802275 text/html http://deposit.dnb.de/cgi-bin/dokserv?id=2718189&prov=M&dok_var=1&dok_ext=htm Beschreibung text/html http://www.loc.gov/catdir/enhancements/fy0818/2006922522-d.html Publisher description text/html http://www.loc.gov/catdir/enhancements/fy0818/2006922522-t.html Table of contents only http://www3.ub.tu-berlin.de/ihv/001716289.pdf Inhaltsverzeichnis SWBplus Fremddatenuebernahme application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014862953&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Bishop, Christopher M. 1959- Pattern recognition and machine learning Maschinelles Lernen Mustererkennung Machine learning Pattern perception Mustererkennung (DE-588)4040936-3 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
subject_GND | (DE-588)4040936-3 (DE-588)4193754-5 |
title | Pattern recognition and machine learning |
title_auth | Pattern recognition and machine learning |
title_exact_search | Pattern recognition and machine learning |
title_exact_search_txtP | Pattern recognition and machine learning |
title_full | Pattern recognition and machine learning Christopher M. Bishop |
title_fullStr | Pattern recognition and machine learning Christopher M. Bishop |
title_full_unstemmed | Pattern recognition and machine learning Christopher M. Bishop |
title_short | Pattern recognition and machine learning |
title_sort | pattern recognition and machine learning |
topic | Maschinelles Lernen Mustererkennung Machine learning Pattern perception Mustererkennung (DE-588)4040936-3 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
topic_facet | Maschinelles Lernen Mustererkennung Machine learning Pattern perception |
url | http://deposit.dnb.de/cgi-bin/dokserv?id=2718189&prov=M&dok_var=1&dok_ext=htm http://www.loc.gov/catdir/enhancements/fy0818/2006922522-d.html http://www.loc.gov/catdir/enhancements/fy0818/2006922522-t.html http://www3.ub.tu-berlin.de/ihv/001716289.pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=014862953&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT bishopchristopherm patternrecognitionandmachinelearning |
No print copy is available.
Table of contents