The EM algorithm and extensions
Saved in:
Main authors: | McLachlan, Geoffrey J. 1946-; Krishnan, Thriyambakam 1938- |
---|---|
Format: | Book |
Language: | English |
Published: | Hoboken, NJ : Wiley-Interscience, 2008 |
Edition: | 2nd ed. |
Series: | Wiley series in probability and statistics |
Subjects: | Expectation-maximization algorithms; Estimation theory; Missing observations (Statistics); Fehlende Daten; Maximum-Likelihood-Schätzung; EM-Algorithmus |
Online access: | Table of contents only; Table of contents (PDF) |
Description: | XXVII, 359 pp. |
ISBN: | 9780471201700 0471201707 |
Internal format
MARC
LEADER | 00000nam a2200000zc 4500 | ||
---|---|---|---|
001 | BV023115621 | ||
003 | DE-604 | ||
005 | 20241210 | ||
007 | t| | ||
008 | 080206s2008 xxu |||| 00||| eng d | ||
010 | |a 2007017908 | ||
020 | |a 9780471201700 |c cl |9 978-0-471-20170-0 | ||
020 | |a 0471201707 |9 0-471-20170-7 | ||
035 | |a (OCoLC)137325058 | ||
035 | |a (DE-599)DNB 2007017908 | ||
040 | |a DE-604 |b ger |e aacr | ||
041 | 0 | |a eng | |
044 | |a xxu |c US | ||
049 | |a DE-91G |a DE-473 |a DE-355 |a DE-11 |a DE-578 |a DE-19 |a DE-20 |a DE-29T |a DE-634 | ||
050 | 0 | |a QA276.8 | |
082 | 0 | |a 519.5/44 | |
084 | |a QH 233 |0 (DE-625)141548: |2 rvk | ||
084 | |a SK 830 |0 (DE-625)143259: |2 rvk | ||
084 | |a MAT 620f |2 stub | ||
100 | 1 | |a McLachlan, Geoffrey J. |d 1946- |e Verfasser |0 (DE-588)128823348 |4 aut | |
245 | 1 | 0 | |a The EM algorithm and extensions |c Geoffrey J. McLachlan ; Thriyambakam Krishnan |
250 | |a 2. ed. | ||
264 | 1 | |a Hoboken, NJ |b Wiley-Interscience |c 2008 | |
300 | |a XXVII, 359 S. | ||
336 | |b txt |2 rdacontent | ||
337 | |b n |2 rdamedia | ||
338 | |b nc |2 rdacarrier | ||
490 | 0 | |a Wiley series in probability and statistics | |
650 | 4 | |a Expectation-maximization algorithms | |
650 | 4 | |a Estimation theory | |
650 | 4 | |a Missing observations (Statistics) | |
650 | 0 | 7 | |a Fehlende Daten |0 (DE-588)4264715-0 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Maximum-Likelihood-Schätzung |0 (DE-588)4194624-8 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a EM-Algorithmus |0 (DE-588)4659559-4 |2 gnd |9 rswk-swf |
689 | 0 | 0 | |a Maximum-Likelihood-Schätzung |0 (DE-588)4194624-8 |D s |
689 | 0 | 1 | |a Fehlende Daten |0 (DE-588)4264715-0 |D s |
689 | 0 | |8 1\p |5 DE-604 | |
689 | 1 | 0 | |a EM-Algorithmus |0 (DE-588)4659559-4 |D s |
689 | 1 | |5 DE-604 | |
700 | 1 | |a Krishnan, Thriyambakam |d 1938- |e Verfasser |0 (DE-588)104344466 |4 aut | |
856 | 4 | |u http://www.loc.gov/catdir/toc/ecip0716/2007017908.html |3 Table of contents only | |
856 | 4 | 2 | |m HBZ Datenaustausch |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=016318148&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis |
883 | 1 | |8 1\p |a cgwrk |d 20201028 |q DE-101 |u https://d-nb.info/provenance/plan#cgwrk | |
943 | 1 | |a oai:aleph.bib-bvb.de:BVB01-016318148 |
Record in the search index
_version_ | 1818065992514797568 |
---|---|
adam_text |
CONTENTS
PREFACE TO THE SECOND EDITION xix
PREFACE TO THE FIRST EDITION xxi
LIST OF EXAMPLES xxv
1 GENERAL INTRODUCTION 1
1.1 Introduction 1
1.2 Maximum Likelihood Estimation 3
1.3 Newton-Type Methods 5
1.3.1 Introduction 5
1.3.2 Newton-Raphson Method 5
1.3.3 Quasi-Newton Methods 6
1.3.4 Modified Newton Methods 6
1.4 Introductory Examples 8
1.4.1 Introduction 8
1.4.2 Example 1.1: A Multinomial Example 8
1.4.3 Example 1.2: Estimation of Mixing Proportions 13
1.5 Formulation of the EM Algorithm 18
1.5.1 EM Algorithm 18
1.5.2 Example 1.3: Censored Exponentially Distributed Survival
Times 20
1.5.3 E- and M-Steps for the Regular Exponential Family 22
1.5.4 Example 1.4: Censored Exponentially Distributed Survival
Times (Example 1.3 Continued) 23
1.5.5 Generalized EM Algorithm 24
1.5.6 GEM Algorithm Based on One Newton-Raphson Step 24
1.5.7 EM Gradient Algorithm 25
1.5.8 EM Mapping 26
1.6 EM Algorithm for MAP and MPL Estimation 26
1.6.1 Maximum a Posteriori Estimation 26
1.6.2 Example 1.5: A Multinomial Example (Example 1.1
Continued) 27
1.6.3 Maximum Penalized Estimation 27
1.7 Brief Summary of the Properties of the EM Algorithm 28
1.8 History of the EM Algorithm 29
1.8.1 Early EM History 29
1.8.2 Work Before Dempster, Laird, and Rubin (1977) 29
1.8.3 EM Examples and Applications Since Dempster, Laird, and
Rubin (1977) 31
1.8.4 Two Interpretations of EM 32
1.8.5 Developments in EM Theory, Methodology, and Applications
33
1.9 Overview of the Book 36
1.10 Notations 37
2 EXAMPLES OF THE EM ALGORITHM 41
2.1 Introduction 41
2.2 Multivariate Data with Missing Values 42
2.2.1 Example 2.1: Bivariate Normal Data with Missing Values 42
2.2.2 Numerical Illustration 45
2.2.3 Multivariate Data: Buck's Method 45
2.3 Least Squares with Missing Data 47
2.3.1 Healy-Westmacott Procedure 47
2.3.2 Example 2.2: Linear Regression with Missing Dependent
Values 47
2.3.3 Example 2.3: Missing Values in a Latin Square Design 49
2.3.4 Healy-Westmacott Procedure as an EM Algorithm 49
2.4 Example 2.4: Multinomial with Complex Cell Structure 51
2.5 Example 2.5: Analysis of PET and SPECT Data 54
2.6 Example 2.6: Multivariate t-Distribution (Known D.F.) 58
2.6.1 ML Estimation of Multivariate t-Distribution 58
2.6.2 Numerical Example: Stack Loss Data 61
2.7 Finite Normal Mixtures 61
2.7.1 Example 2.7: Univariate Component Densities 61
2.7.2 Example 2.8: Multivariate Component Densities 64
2.7.3 Numerical Example: Red Blood Cell Volume Data 65
2.8 Example 2.9: Grouped and Truncated Data 66
2.8.1 Introduction 66
2.8.2 Specification of Complete Data 66
2.8.3 E-Step 69
2.8.4 M-Step 70
2.8.5 Confirmation of Incomplete-Data Score Statistic 70
2.8.6 M-Step for Grouped Normal Data 71
2.8.7 Numerical Example: Grouped Log Normal Data 72
2.9 Example 2.10: A Hidden Markov AR(1) Model 73
3 BASIC THEORY OF THE EM ALGORITHM 77
3.1 Introduction 77
3.2 Monotonicity of the EM Algorithm 78
3.3 Monotonicity of a Generalized EM Algorithm 79
3.4 Convergence of an EM Sequence to a Stationary Value 79
3.4.1 Introduction 79
3.4.2 Regularity Conditions of Wu (1983) 80
3.4.3 Main Convergence Theorem for a Generalized EM Sequence 81
3.4.4 A Convergence Theorem for an EM Sequence 82
3.5 Convergence of an EM Sequence of Iterates 83
3.5.1 Introduction 83
3.5.2 Two Convergence Theorems of Wu (1983) 83
3.5.3 Convergence of an EM Sequence to a Unique Maximum
Likelihood Estimate 84
3.5.4 Constrained Parameter Spaces 84
3.6 Examples of Nontypical Behavior of an EM (GEM) Sequence 85
3.6.1 Example 3.1: Convergence to a Saddle Point 85
3.6.2 Example 3.2: Convergence to a Local Minimum 88
3.6.3 Example 3.3: Nonconvergence of a Generalized EM
Sequence 90
3.6.4 Example 3.4: Some E-Step Pathologies 93
3.7 Score Statistic 95
3.8 Missing Information 95
3.8.1 Missing Information Principle 95
3.8.2 Example 3.5: Censored Exponentially Distributed Survival
Times (Example 1.3 Continued) 96
3.9 Rate of Convergence of the EM Algorithm 99
3.9.1 Rate Matrix for Linear Convergence 99
3.9.2 Measuring the Linear Rate of Convergence 100
3.9.3 Rate Matrix in Terms of Information Matrices 101
3.9.4 Rate Matrix for Maximum a Posteriori Estimation 102
3.9.5 Derivation of Rate Matrix in Terms of Information Matrices 102
3.9.6 Example 3.6: Censored Exponentially Distributed Survival
Times (Example 1.3 Continued) 103
4 STANDARD ERRORS AND SPEEDING UP CONVERGENCE 105
4.1 Introduction 105
4.2 Observed Information Matrix 106
4.2.1 Direct Evaluation 106
4.2.2 Extraction of Observed Information Matrix in Terms of the
Complete-Data Log Likelihood 106
4.2.3 Regular Case 108
4.2.4 Evaluation of the Conditional Expected Complete-Data
Information Matrix 108
4.2.5 Examples 109
4.3 Approximations to Observed Information Matrix: i.i.d. Case 114
4.4 Observed Information Matrix for Grouped Data 116
4.4.1 Approximation Based on Empirical Information 116
4.4.2 Example 4.3: Grouped Data from an Exponential
Distribution 117
4.5 Supplemented EM Algorithm 120
4.5.1 Definition 120
4.5.2 Calculation of J(Ψ̂) via Numerical Differentiation 122
4.5.3 Stability 123
4.5.4 Monitoring Convergence 124
4.5.5 Difficulties of the SEM Algorithm 124
4.5.6 Example 4.4: Univariate Contaminated Normal Data 125
4.5.7 Example 4.5: Bivariate Normal Data with Missing Values 128
4.6 Bootstrap Approach to Standard Error Approximation 130
4.7 Baker's, Louis', and Oakes' Methods for Standard Error Computation 131
4.7.1 Baker's Method for Standard Error Computation 131
4.7.2 Louis' Method of Standard Error Computation 132
4.7.3 Oakes' Formula for Standard Error Computation 133
4.7.4 Example 4.6: Oakes' Standard Error for Example 1.1 134
4.7.5 Example 4.7: Louis' Method for Example 2.4 134
4.7.6 Baker's Method for Standard Error for Categorical Data 135
4.7.7 Example 4.8: Baker's Method for Example 2.4 136
4.8 Acceleration of the EM Algorithm via Aitken's Method 137
4.8.1 Aitken's Acceleration Method 137
4.8.2 Louis' Method 137
4.8.3 Example 4.9: Multinomial Data 138
4.8.4 Example 4.10: Geometric Mixture 139
4.8.5 Example 4.11: Grouped and Truncated Data (Example 2.8
Continued) 142
4.9 An Aitken Acceleration-Based Stopping Criterion 142
4.10 Conjugate Gradient Acceleration of EM Algorithm 144
4.10.1 Conjugate Gradient Method 144
4.10.2 A Generalized Conjugate Gradient Algorithm 144
4.10.3 Accelerating the EM Algorithm 145
4.11 Hybrid Methods for Finding the MLE 146
4.11.1 Introduction 146
4.11.2 Combined EM and Modified Newton-Raphson Algorithm 146
4.12 A GEM Algorithm Based on One Newton-Raphson Step 148
4.12.1 Derivation of a Condition to be a Generalized EM Sequence 148
4.12.2 Simulation Experiment 149
4.13 EM Gradient Algorithm 149
4.14 A Quasi-Newton Acceleration of the EM Algorithm 151
4.14.1 The Method 151
4.14.2 Example 4.12: Dirichlet Distribution 153
4.15 Ikeda Acceleration 157
5 EXTENSIONS OF THE EM ALGORITHM 159
5.1 Introduction 159
5.2 ECM Algorithm 160
5.2.1 Motivation 160
5.2.2 Formal Definition 160
5.2.3 Convergence Properties 162
5.2.4 Speed of Convergence 162
5.2.5 Convergence Rates of EM and ECM 163
5.2.6 Example 5.1: ECM Algorithm for Hidden Markov AR(1)
Model 164
5.2.7 Discussion 164
5.3 Multicycle ECM Algorithm 165
5.4 Example 5.2: Normal Mixtures with Equal Correlations 166
5.4.1 Normal Components with Equal Correlations 166
5.4.2 Application of ECM Algorithm 166
5.4.3 Fisher's Iris Data 168
5.5 Example 5.3: Mixture Models for Survival Data 168
5.5.1 Competing Risks in Survival Analysis 168
5.5.2 A Two-Component Mixture Regression Model 169
5.5.3 Observed Data 169
5.5.4 Application of EM Algorithm 170
5.5.5 M-Step for Gompertz Components 171
5.5.6 Application of a Multicycle ECM Algorithm 172
5.5.7 Other Examples of EM Algorithm in Survival Analysis 173
5.6 Example 5.4: Contingency Tables with Incomplete Data 174
5.7 ECME Algorithm 175
5.8 Example 5.5: MLE of t-Distribution with Unknown D.F. 176
5.8.1 Application of the EM Algorithm 176
5.8.2 M-Step 177
5.8.3 Application of ECM Algorithm 177
5.8.4 Application of ECME Algorithm 178
5.8.5 Some Standard Results 178
5.8.6 Missing Data 179
5.8.7 Numerical Examples 181
5.8.8 Theoretical Results on the Rate of Convergence 181
5.9 Example 5.6: Variance Components 182
5.9.1 A Variance Components Model 182
5.9.2 E-Step 183
5.9.3 M-Step 184
5.9.4 Application of Two Versions of ECME Algorithm 185
5.9.5 Numerical Example 185
5.10 Linear Mixed Models 186
5.10.1 Introduction 186
5.10.2 General Form of Linear Mixed Model 187
5.10.3 REML Estimation 188
5.10.4 Example 5.7: REML Estimation in a Hierarchical Random
Effects Model 188
5.10.5 Some Other EM-Related Approaches to Mixed Model
Estimation 191
5.10.6 Generalized Linear Mixed Models 191
5.11 Example 5.8: Factor Analysis 193
5.11.1 EM Algorithm for Factor Analysis 193
5.11.2 ECME Algorithm for Factor Analysis 196
5.11.3 Numerical Example 196
5.11.4 EM Algorithm in Principal Component Analysis 196
5.12 Efficient Data Augmentation 198
5.12.1 Motivation 198
5.12.2 Maximum Likelihood Estimation of t-Distribution 198
5.12.3 Variance Components Model 202
5.13 Alternating ECM Algorithm 202
5.14 Example 5.9: Mixtures of Factor Analyzers 204
5.14.1 Normal Component Factor Analyzers 205
5.14.2 E-step 205
5.14.3 CM-steps 206
5.14.4 t-Component Factor Analyzers 207
5.14.5 E-step 210
5.14.6 CM-steps 211
5.15 Parameter-Expanded EM (PX-EM) Algorithm 212
5.16 EMS Algorithm 213
5.17 One-Step-Late Algorithm 213
5.18 Variance Estimation for Penalized EM and OSL Algorithms 214
5.18.1 Penalized EM Algorithm 214
5.18.2 OSL Algorithm 215
5.18.3 Example 5.10: Variance of MPLE for the Multinomial (Examples
1.1 and 4.1 Continued) 215
5.19 Incremental EM 216
5.20 Linear Inverse Problems 217
6 MONTE CARLO VERSIONS OF THE EM ALGORITHM 219
6.1 Introduction 219
6.2 Monte Carlo Techniques 220
6.2.1 Integration and Optimization 220
6.2.2 Example 6.1: Monte Carlo Integration 221
6.3 Monte Carlo EM 221
6.3.1 Introduction 221
6.3.2 Example 6.2: Monte Carlo EM for Censored Data from
Normal 223
6.3.3 Example 6.3: MCEM for a Two-Parameter Multinomial
(Example 2.4 Continued) 224
6.3.4 MCEM in Generalized Linear Mixed Models 224
6.3.5 Estimation of Standard Error with MCEM 225
6.3.6 Example 6.4: MCEM Estimate of Standard Error for
One-Parameter Multinomial (Example 1.1 Continued) 226
6.3.7 Stochastic EM Algorithm 227
6.4 Data Augmentation 228
6.4.1 The Algorithm 228
6.4.2 Example 6.5: Data Augmentation in the Multinomial (Examples
1.1, 1.5 Continued) 229
6.5 Bayesian EM 230
6.5.1 Posterior Mode by EM 230
6.5.2 Example 6.6: Bayesian EM for Normal with Semi-Conjugate
Prior 231
6.6 I.I.D. Monte Carlo Algorithms 232
6.6.1 Introduction 232
6.6.2 Rejection Sampling Methods 233
6.6.3 Importance Sampling 234
6.7 Markov Chain Monte Carlo Algorithms 236
6.7.1 Introduction 236
6.7.2 Essence of MCMC 238
6.7.3 Metropolis-Hastings Algorithms 239
6.8 Gibbs Sampling 241
6.8.1 Introduction 241
6.8.2 Rao-Blackwellized Estimates with Gibbs Samples 242
6.8.3 Example 6.7: Why Does Gibbs Sampling Work? 243
6.9 Examples of MCMC Algorithms 245
6.9.1 Example 6.8: M-H Algorithm for Bayesian Probit
Regression 245
6.9.2 Monte Carlo EM with MCMC 246
6.9.3 Example 6.9: Gibbs Sampling for the Mixture Problem 249
6.9.4 Example 6.10: Bayesian Probit Analysis with Data
Augmentation 250
6.9.5 Example 6.11: Gibbs Sampling for Censored Normal 251
6.10 Relationship of EM to Gibbs Sampling 254
6.10.1 EM-Gibbs Sampling Connection 254
6.10.2 Example 6.12: EM-Gibbs Connection for Censored Data from
Normal (Example 6.11 Continued) 256
6.10.3 Example 6.13: EM-Gibbs Connection for Normal Mixtures 257
6.10.4 Rate of Convergence of Gibbs Sampling and EM 257
6.11 Data Augmentation and Gibbs Sampling 258
6.11.1 Introduction 258
6.11.2 Example 6.14: Data Augmentation and Gibbs Sampling for
Censored Normal (Example 6.12 Continued) 259
6.11.3 Example 6.15: Gibbs Sampling for a Complex Multinomial
(Example 2.4 Continued) 260
6.11.4 Gibbs Sampling Analogs of ECM and ECME Algorithms 261
6.12 Empirical Bayes and EM 263
6.13 Multiple Imputation 264
6.14 Missing-Data Mechanism, Ignorability, and EM Algorithm 265
7 SOME GENERALIZATIONS OF THE EM ALGORITHM 269
7.1 Introduction 269
7.2 Estimating Equations and Estimating Functions 270
7.3 Quasi-Score and the Projection-Solution Algorithm 270
7.4 Expectation-Solution (ES) Algorithm 273
7.4.1 Introduction 273
7.4.2 Computational and Asymptotic Properties of the ES
Algorithm 274
7.4.3 Example 7.1: Multinomial Example by ES Algorithm (Example
1.1 Continued) 274
7.5 Other Generalizations 275
7.6 Variational Bayesian EM Algorithm 276
7.7 MM Algorithm 278
7.7.1 Introduction 278
7.7.2 Methods for Constructing Majorizing/Minorizing Functions 279
7.7.3 Example 7.2: MM Algorithm for the Complex Multinomial
(Example 1.1 Continued) 280
7.8 Lower Bound Maximization 281
7.9 Interval EM Algorithm 283
7.9.1 The Algorithm 283
7.9.2 Example 7.3: Interval-EM Algorithm for the Complex
Multinomial (Example 2.4 Continued) 283
7.10 Competing Methods and Some Comparisons with EM 284
7.10.1 Introduction 284
7.10.2 Simulated Annealing 284
7.10.3 Comparison of SA and EM Algorithm for Normal Mixtures 285
7.11 The Delta Algorithm 286
7.12 Image Space Reconstruction Algorithm 287
8 FURTHER APPLICATIONS OF THE EM ALGORITHM 289
8.1 Introduction 289
8.2 Hidden Markov Models 290
8.3 AIDS Epidemiology 293
8.4 Neural Networks 295
8.4.1 Introduction 295
8.4.2 EM Framework for NNs 296
8.4.3 Training Multi-Layer Perceptron Networks 297
8.4.4 Intractability of the Exact E-Step for MLPs 300
8.4.5 An Integration of the Methodology Related to EM Training of
RBF Networks 300
8.4.6 Mixture of Experts 301
8.4.7 Simulation Experiment 305
8.4.8 Normalized Mixtures of Experts 306
8.4.9 Hierarchical Mixture of Experts 307
8.4.10 Boltzmann Machine 308
8.5 Data Mining 309
8.6 Bioinformatics 310
REFERENCES 311
AUTHOR INDEX 339
SUBJECT INDEX 347 |
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | McLachlan, Geoffrey J. 1946- Krishnan, Thriyambakam 1938- |
author_GND | (DE-588)128823348 (DE-588)104344466 |
author_facet | McLachlan, Geoffrey J. 1946- Krishnan, Thriyambakam 1938- |
author_role | aut aut |
author_sort | McLachlan, Geoffrey J. 1946- |
author_variant | g j m gj gjm t k tk |
building | Verbundindex |
bvnumber | BV023115621 |
callnumber-first | Q - Science |
callnumber-label | QA276 |
callnumber-raw | QA276.8 |
callnumber-search | QA276.8 |
callnumber-sort | QA 3276.8 |
callnumber-subject | QA - Mathematics |
classification_rvk | QH 233 SK 830 |
classification_tum | MAT 620f |
ctrlnum | (OCoLC)137325058 (DE-599)DNB 2007017908 |
dewey-full | 519.5/44 |
dewey-hundreds | 500 - Natural sciences and mathematics |
dewey-ones | 519 - Probabilities and applied mathematics |
dewey-raw | 519.5/44 |
dewey-search | 519.5/44 |
dewey-sort | 3519.5 244 |
dewey-tens | 510 - Mathematics |
discipline | Mathematik Wirtschaftswissenschaften |
discipline_str_mv | Mathematik Wirtschaftswissenschaften |
edition | 2. ed. |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>00000nam a2200000zc 4500</leader><controlfield tag="001">BV023115621</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20241210</controlfield><controlfield tag="007">t|</controlfield><controlfield tag="008">080206s2008 xxu |||| 00||| eng d</controlfield><datafield tag="010" ind1=" " ind2=" "><subfield code="a">2007017908</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9780471201700</subfield><subfield code="c">cl</subfield><subfield code="9">978-0-471-20170-0</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">0471201707</subfield><subfield code="9">0-471-20170-7</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)137325058</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)DNB 2007017908</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">aacr</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="044" ind1=" " ind2=" "><subfield code="a">xxu</subfield><subfield code="c">US</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-91G</subfield><subfield code="a">DE-473</subfield><subfield code="a">DE-355</subfield><subfield code="a">DE-11</subfield><subfield code="a">DE-578</subfield><subfield code="a">DE-19</subfield><subfield code="a">DE-20</subfield><subfield code="a">DE-29T</subfield><subfield code="a">DE-634</subfield></datafield><datafield tag="050" ind1=" " ind2="0"><subfield code="a">QA276.8</subfield></datafield><datafield tag="082" ind1="0" ind2=" "><subfield code="a">519.5/44</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">QH 233</subfield><subfield code="0">(DE-625)141548:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">SK 830</subfield><subfield code="0">(DE-625)143259:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">MAT 620f</subfield><subfield code="2">stub</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">McLachlan, Geoffrey J.</subfield><subfield code="d">1946-</subfield><subfield code="e">Verfasser</subfield><subfield code="0">(DE-588)128823348</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">The EM algorithm and extensions</subfield><subfield code="c">Geoffrey J. McLachlan ; Thriyambakam Krishnan</subfield></datafield><datafield tag="250" ind1=" " ind2=" "><subfield code="a">2. 
ed.</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Hoboken, NJ</subfield><subfield code="b">Wiley-Interscience</subfield><subfield code="c">2008</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">XXVII, 359 S.</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="490" ind1="0" ind2=" "><subfield code="a">Wiley series in probability and statistics</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Expectation-maximization algorithms</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Estimation theory</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Missing observations (Statistics)</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Fehlende Daten</subfield><subfield code="0">(DE-588)4264715-0</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Maximum-Likelihood-Schätzung</subfield><subfield code="0">(DE-588)4194624-8</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">EM-Algorithmus</subfield><subfield code="0">(DE-588)4659559-4</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Maximum-Likelihood-Schätzung</subfield><subfield code="0">(DE-588)4194624-8</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="1"><subfield code="a">Fehlende Daten</subfield><subfield code="0">(DE-588)4264715-0</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="8">1\p</subfield><subfield code="5">DE-604</subfield></datafield><datafield tag="689" ind1="1" ind2="0"><subfield code="a">EM-Algorithmus</subfield><subfield code="0">(DE-588)4659559-4</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="1" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Krishnan, Thriyambakam</subfield><subfield code="d">1938-</subfield><subfield code="e">Verfasser</subfield><subfield code="0">(DE-588)104344466</subfield><subfield code="4">aut</subfield></datafield><datafield tag="856" ind1="4" ind2=" "><subfield code="u">http://www.loc.gov/catdir/toc/ecip0716/2007017908.html</subfield><subfield code="3">Table of contents only</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">HBZ Datenaustausch</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=016318148&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="883" ind1="1" ind2=" "><subfield code="8">1\p</subfield><subfield code="a">cgwrk</subfield><subfield code="d">20201028</subfield><subfield code="q">DE-101</subfield><subfield 
code="u">https://d-nb.info/provenance/plan#cgwrk</subfield></datafield><datafield tag="943" ind1="1" ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-016318148</subfield></datafield></record></collection> |
id | DE-604.BV023115621 |
illustrated | Not Illustrated |
index_date | 2024-07-02T19:49:54Z |
indexdate | 2024-12-10T15:00:42Z |
institution | BVB |
isbn | 9780471201700 0471201707 |
language | English |
lccn | 2007017908 |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-016318148 |
oclc_num | 137325058 |
open_access_boolean | |
owner | DE-91G DE-BY-TUM DE-473 DE-BY-UBG DE-355 DE-BY-UBR DE-11 DE-578 DE-19 DE-BY-UBM DE-20 DE-29T DE-634 |
owner_facet | DE-91G DE-BY-TUM DE-473 DE-BY-UBG DE-355 DE-BY-UBR DE-11 DE-578 DE-19 DE-BY-UBM DE-20 DE-29T DE-634 |
physical | XXVII, 359 S. |
publishDate | 2008 |
publishDateSearch | 2008 |
publishDateSort | 2008 |
publisher | Wiley-Interscience |
record_format | marc |
series2 | Wiley series in probability and statistics |
spelling | McLachlan, Geoffrey J. 1946- Verfasser (DE-588)128823348 aut The EM algorithm and extensions Geoffrey J. McLachlan ; Thriyambakam Krishnan 2. ed. Hoboken, NJ Wiley-Interscience 2008 XXVII, 359 S. txt rdacontent n rdamedia nc rdacarrier Wiley series in probability and statistics Expectation-maximization algorithms Estimation theory Missing observations (Statistics) Fehlende Daten (DE-588)4264715-0 gnd rswk-swf Maximum-Likelihood-Schätzung (DE-588)4194624-8 gnd rswk-swf EM-Algorithmus (DE-588)4659559-4 gnd rswk-swf Maximum-Likelihood-Schätzung (DE-588)4194624-8 s Fehlende Daten (DE-588)4264715-0 s 1\p DE-604 EM-Algorithmus (DE-588)4659559-4 s DE-604 Krishnan, Thriyambakam 1938- Verfasser (DE-588)104344466 aut http://www.loc.gov/catdir/toc/ecip0716/2007017908.html Table of contents only HBZ Datenaustausch application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=016318148&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis 1\p cgwrk 20201028 DE-101 https://d-nb.info/provenance/plan#cgwrk |
spellingShingle | McLachlan, Geoffrey J. 1946- Krishnan, Thriyambakam 1938- The EM algorithm and extensions Expectation-maximization algorithms Estimation theory Missing observations (Statistics) Fehlende Daten (DE-588)4264715-0 gnd Maximum-Likelihood-Schätzung (DE-588)4194624-8 gnd EM-Algorithmus (DE-588)4659559-4 gnd |
subject_GND | (DE-588)4264715-0 (DE-588)4194624-8 (DE-588)4659559-4 |
title | The EM algorithm and extensions |
title_auth | The EM algorithm and extensions |
title_exact_search | The EM algorithm and extensions |
title_exact_search_txtP | The EM algorithm and extensions |
title_full | The EM algorithm and extensions Geoffrey J. McLachlan ; Thriyambakam Krishnan |
title_fullStr | The EM algorithm and extensions Geoffrey J. McLachlan ; Thriyambakam Krishnan |
title_full_unstemmed | The EM algorithm and extensions Geoffrey J. McLachlan ; Thriyambakam Krishnan |
title_short | The EM algorithm and extensions |
title_sort | the em algorithm and extensions |
topic | Expectation-maximization algorithms Estimation theory Missing observations (Statistics) Fehlende Daten (DE-588)4264715-0 gnd Maximum-Likelihood-Schätzung (DE-588)4194624-8 gnd EM-Algorithmus (DE-588)4659559-4 gnd |
topic_facet | Expectation-maximization algorithms Estimation theory Missing observations (Statistics) Fehlende Daten Maximum-Likelihood-Schätzung EM-Algorithmus |
url | http://www.loc.gov/catdir/toc/ecip0716/2007017908.html http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=016318148&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT mclachlangeoffreyj theemalgorithmandextensions AT krishnanthriyambakam theemalgorithmandextensions |