Introduction to environmental data science
Saved in:

Main Author: Hsieh, William Wei, 1955-
Format: Book
Language: English
Published: Cambridge, United Kingdom; New York, USA: Cambridge University Press, 2023
Subjects: Machine learning; Deep learning; Statistics; Data analysis; Environmental sciences; Climate change; Data science
Online Access: Table of Contents
Summary: "Statistical and machine learning methods have many applications in the environmental sciences, including prediction and data analysis in meteorology, hydrology and oceanography; pattern recognition for satellite images from remote sensing; management of agriculture and forests; assessment of climate change; and much more. With rapid advances in machine learning in the last decade, this book provides an urgently needed, comprehensive guide to machine learning and statistics for students and researchers interested in environmental data science. It includes intuitive explanations covering the relevant background mathematics, with examples drawn from the environmental sciences. A broad range of topics are covered, including correlation, regression, classification, clustering, neural networks, random forests, boosting, kernel methods, evolutionary algorithms and deep learning, as well as the recent merging of machine learning and physics. End-of-chapter exercises allow readers to develop their problem-solving skills, and online datasets allow readers to practise analysis of real data. William W. Hsieh is a professor emeritus in the Department of Earth, Ocean and Atmospheric Sciences at the University of British Columbia. Known as a pioneer in introducing machine learning to environmental science, he has written over 100 peer-reviewed journal papers on climate variability, machine learning, atmospheric science, oceanography, hydrology and agricultural science. He is the author of the book Machine Learning Methods in the Environmental Sciences (2009, Cambridge University Press), the first single-authored textbook on machine learning for environmental scientists. Currently retired in Victoria, British Columbia, he enjoys growing organic vegetables" -- A comprehensive guide to machine learning and statistics for students and researchers of environmental data science.
Description: xx, 627 pages: illustrations, diagrams
ISBN: 9781107065550; 1107065550

Internal format (MARC)
LEADER | 00000nam a2200000 c 4500 | ||
001 | BV048860216 | ||
003 | DE-604 | ||
005 | 20240409 | ||
007 | t| | ||
008 | 230315s2023 xx a||| b||| 00||| eng d | ||
020 | |a 9781107065550 |c hardback |9 978-1-107-06555-0 | ||
020 | |a 1107065550 |c hardback |9 1-107-06555-0 | ||
035 | |a (OCoLC)1370400344 | ||
035 | |a (DE-599)BVBBV048860216 | ||
040 | |a DE-604 |b ger |e rda | ||
041 | 0 | |a eng | |
049 | |a DE-355 |a DE-188 |a DE-1050 |a DE-11 | ||
084 | |a WC 7786 |0 (DE-625)164864: |2 rvk | ||
084 | |a ST 630 |0 (DE-625)143685: |2 rvk | ||
084 | |a WC 7700 |0 (DE-625)148144: |2 rvk | ||
084 | |a WI 1500 |0 (DE-625)148757: |2 rvk | ||
100 | 1 | |a Hsieh, William Wei |d 1955- |e Verfasser |0 (DE-588)139904557 |4 aut | |
245 | 1 | 0 | |a Introduction to environmental data science |c William W. Hsieh, University of British Columbia |
264 | 1 | |a Cambridge, United Kingdom ; New York, USA |b Cambridge University Press |c 2023 | |
300 | |a xx, 627 Seiten |b Illustrationen, Diagramme | ||
336 | |b txt |2 rdacontent | ||
337 | |b n |2 rdamedia | ||
338 | |b nc |2 rdacarrier | ||
520 | 3 | |a "Statistical and machine learning methods have many applications in the environmental sciences, including prediction and data analysis in meteorology, hydrology and oceanography; pattern recognition for satellite images from remote sensing; management of agriculture and forests; assessment of climate change; and much more. With rapid advances in machine learning in the last decade, this book provides an urgently needed, comprehensive guide to machine learning and statistics for students and researchers interested in environmental data science. It includes intuitive explanations covering the relevant background mathematics, with examples drawn from the environmental sciences. A broad range of topics are covered, including correlation, regression, classification, clustering, neural networks, random forests, boosting, kernel methods, evolutionary algorithms and deep learning, as well as the recent merging of machine learning and physics. End-of-chapter exercises allow readers to develop their problem-solving skills, and online datasets allow readers to practise analysis of real data. William W. Hsieh is a professor emeritus in the Department of Earth, Ocean and Atmospheric Sciences at the University of British Columbia. Known as a pioneer in introducing machine learning to environmental science, he has written over 100 peer-reviewed journal papers on climate variability, machine learning, atmospheric science, oceanography, hydrology and agricultural science. He is the author of the book Machine Learning Methods in the Environmental Sciences (2009, Cambridge University Press), the first single-authored textbook on machine learning for environmental scientists. Currently retired in Victoria, British Columbia, he enjoys growing organic vegetables"-- | |
520 | |a A comprehensive guide to machine learning and statistics for students and researchers of environmental data science. | ||
650 | 0 | 7 | |a Klimaänderung |0 (DE-588)4164199-1 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Deep Learning |0 (DE-588)1135597375 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Datenanalyse |0 (DE-588)4123037-1 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Data Science |0 (DE-588)1140936166 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Umweltwissenschaften |0 (DE-588)4137364-9 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Statistik |0 (DE-588)4056995-0 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |2 gnd |9 rswk-swf |
653 | |a Neuronale Netzwerke | ||
653 | 0 | |a Environmental sciences / Data processing | |
653 | 0 | |a Environmental protection / Data processing | |
653 | 0 | |a Environmental management / Data processing | |
653 | |a Mustererkennung für Fernerkundungssatellitenbilder | ||
653 | |a Korrelation und Regression in den Umweltwissenschaften | ||
689 | 0 | 0 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |D s |
689 | 0 | 1 | |a Deep Learning |0 (DE-588)1135597375 |D s |
689 | 0 | 2 | |a Statistik |0 (DE-588)4056995-0 |D s |
689 | 0 | 3 | |a Datenanalyse |0 (DE-588)4123037-1 |D s |
689 | 0 | 4 | |a Umweltwissenschaften |0 (DE-588)4137364-9 |D s |
689 | 0 | 5 | |a Klimaänderung |0 (DE-588)4164199-1 |D s |
689 | 0 | |5 DE-604 | |
689 | 1 | 0 | |a Data Science |0 (DE-588)1140936166 |D s |
689 | 1 | 1 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |D s |
689 | 1 | 2 | |a Umweltwissenschaften |0 (DE-588)4137364-9 |D s |
689 | 1 | |5 DE-604 | |
776 | 0 | 8 | |i Erscheint auch als |n Online-Ausgabe |z 978-1-107-58849-3 |
856 | 4 | 2 | |m Digitalisierung UB Regensburg - ADAM Catalogue Enrichment |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=034125345&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis |
943 | 1 | |a oai:aleph.bib-bvb.de:BVB01-034125345 |
Contents

Preface
Notation Used
List of Abbreviations

1 Introduction
  1.1 Statistics and Machine Learning
  1.2 Environmental Data Science
  1.3 A Simple Example of Curve Fitting
  1.4 Main Types of Data Problems
    1.4.1 Supervised Learning
    1.4.2 Unsupervised Learning
    1.4.3 Reinforced Learning
  1.5 Curse of Dimensionality

2 Basics
  2.1 Data Types
  2.2 Probability
  2.3 Probability Density
  2.4 Expectation and Mean
  2.5 Variance and Standard Deviation
  2.6 Covariance
  2.7 Online Algorithms for Mean, Variance and Covariance
  2.8 Median and Median Absolute Deviation
  2.9 Quantiles
  2.10 Skewness and Kurtosis
  2.11 Correlation
    2.11.1 Pearson Correlation
    2.11.2 Serial Correlation
    2.11.3 Spearman Rank Correlation
    2.11.4 Kendall Rank Correlation
    2.11.5 Biweight Midcorrelation
  2.12 Exploratory Data Analysis
    2.12.1 Histograms
    2.12.2 Quantile-Quantile (Q-Q) Plots
    2.12.3 Boxplots
  2.13 Mahalanobis Distance
    2.13.1 Mahalanobis Distance and Principal Component Analysis
  2.14 Bayes Theorem
  2.15 Classification
  2.16 Clustering
  2.17 Information Theory
    2.17.1 Entropy
    2.17.2 Joint Entropy and Conditional Entropy
    2.17.3 Relative Entropy
    2.17.4 Mutual Information
  Exercises

3 Probability Distributions
  3.1 Binomial Distribution
  3.2 Poisson Distribution
  3.3 Multinomial Distribution
  3.4 Gaussian Distribution
  3.5 Maximum Likelihood Estimation
  3.6 Multivariate Gaussian Distribution
  3.7 Conditional and Marginal Gaussian Distributions
  3.8 Gamma Distribution
  3.9 Beta Distribution
  3.10 Von Mises Distribution
  3.11 Extreme Value Distributions
  3.12 Gaussian Mixture Model
    3.12.1 Expectation-Maximization (EM) Algorithm
  3.13 Kernel Density Estimation
  3.14 Re-expressing Data
  3.15 Student t-distribution
  3.16 Chi-squared Distribution
  Exercises

4 Statistical Inference
  4.1 Hypothesis Testing
  4.2 Student t-test
    4.2.1 One-Sample t-test
    4.2.2 Independent Two-Sample t-test
    4.2.3 Dependent t-test for Paired Samples
    4.2.4 Serial Correlation
    4.2.5 Significance Test for Correlation
  4.3 Non-parametric Alternatives to t-test
    4.3.1 Wilcoxon-Mann-Whitney Test
    4.3.2 Wilcoxon Signed-Rank Test
  4.4 Confidence Interval
    4.4.1 Confidence Interval for Population Mean
    4.4.2 Confidence Interval for Correlation
  4.5 Goodness-of-Fit Tests
    4.5.1 One-Sample Goodness-of-Fit Tests
    4.5.2 Two-Sample Goodness-of-Fit Tests
  4.6 Test of Variances
  4.7 Mann-Kendall Trend Test
  4.8 Bootstrapping
  4.9 Field Significance
  Exercises

5 Linear Regression
  5.1 Simple Linear Regression
    5.1.1 Partition of Sums of Squares
    5.1.2 Confidence Interval for Regression Parameters
    5.1.3 Confidence Interval and Prediction Interval for the Response Variable
    5.1.4 Serial Correlation
  5.2 Multiple Linear Regression
    5.2.1 Gauss-Markov Theorem
    5.2.2 Partition of Sums of Squares
    5.2.3 Standardized Predictors
    5.2.4 Analysis of Variance (ANOVA)
    5.2.5 Confidence and Prediction Intervals
  5.3 Multivariate Linear Regression
  5.4 Online Learning with Linear Regression
  5.5 Circular and Categorical Data
  5.6 Predictor Selection
  5.7 Ridge Regression
  5.8 Lasso
  5.9 Quantile Regression
  5.10 Generalized Least Squares
    5.10.1 Optimal Fingerprinting in Climate Change
  Exercises

6 Neural Networks
  6.1 McCulloch and Pitts Model
  6.2 Perceptrons
    6.2.1 Limitation of Perceptrons
  6.3 Multi-layer Perceptrons
    6.3.1 Comparison with Polynomials
    6.3.2 Hidden Neurons
    6.3.3 Monotonic Multi-layer Perceptron Model
  6.4 Extreme Learning Machines
    6.4.1 Online Learning
    6.4.2 Random Vector Functional Link
  6.5 Radial Basis Functions
  6.6 Modelling Conditional Distributions
    6.6.1 Mixture Density Network
  6.7 Quantile Regression
  6.8 Historical Development of NN in Environmental Science
    6.8.1 Remote Sensing
    6.8.2 Hydrology
    6.8.3 Atmospheric Science
    6.8.4 Oceanography
  Exercises

7 Non-linear Optimization
  7.1 Extrema and Saddle Points
  7.2 Gradient Vector in Optimization
  7.3 Back-Propagation
  7.4 Training Protocol
  7.5 Gradient Descent Method
  7.6 Stochastic Gradient Descent
  7.7 Conjugate Gradient Method
  7.8 Quasi-Newton Methods
  7.9 Non-linear Least Squares Methods
  7.10 Evolutionary Algorithms
  7.11 Hill Climbing
  7.12 Genetic Algorithm
  7.13 Differential Evolution
  Exercises

8 Learning and Generalization
  8.1 Mean Squared Error and Maximum Likelihood
  8.2 Objective Functions and Robustness
  8.3 Variance and Bias Errors
  8.4 Regularization
    8.4.1 Weight Penalty
    8.4.2 Early Stopping
  8.5 Cross-Validation
  8.6 Hyperparameter Tuning
  8.7 Ensemble Methods
    8.7.1 Bagging
    8.7.2 Error of Ensemble
    8.7.3 Unequal Weighting of Ensemble Members
    8.7.4 Stacking
  8.8 Dropout
  8.9 Maximum Norm Constraint
  8.10 Bayesian Model Selection
  8.11 Information Criterion
    8.11.1 Bayesian Information Criterion
    8.11.2 Akaike Information Criterion
  8.12 Bayesian Model Averaging
  8.13 Interpretable ML
  Exercises

9 Principal Components and Canonical Correlation
  9.1 Principal Component Analysis (PCA)
    9.1.1 Geometric Approach to PCA
    9.1.2 Eigenvector Approach to PCA
    9.1.3 Real and Complex Data
    9.1.4 Orthogonality Relations
    9.1.5 PCA of the Tropical Pacific Climate Variability
    9.1.6 Scaling the PCs and Eigenvectors
    9.1.7 Degeneracy of Eigenvalues
    9.1.8 A Smaller Covariance Matrix
    9.1.9 How Many Modes to Retain
    9.1.10 Temporal and Spatial Mean Removal
    9.1.11 Singular Value Decomposition
    9.1.12 Missing Data
  9.2 Rotated PCA
    9.2.1 E-frame Rotation
    9.2.2 A-frame Rotation
    9.2.3 Advantages and Disadvantages of Rotation
  9.3 PCA for Two-Dimensional Vectors
  9.4 Canonical Correlation Analysis (CCA)
    9.4.1 CCA Theory
    9.4.2 Pre-filter with PCA
  9.5 Maximum Covariance Analysis
  Exercises

10 Unsupervised Learning
  10.1 Clustering
    10.1.1 Distance Measure
    10.1.2 Model Evaluation
  10.2 Non-hierarchical Clustering
    10.2.1 K-means Clustering
    10.2.2 Nucleated Agglomerative Clustering
    10.2.3 Gaussian Mixture Model
  10.3 Hierarchical Clustering
  10.4 Self-Organizing Map
    10.4.1 Applications of SOM
  10.5 Autoencoder
  10.6 Non-linear Principal Component Analysis
    10.6.1 Overfitting
    10.6.2 Closed Curves
  10.7 Other Non-linear Dimensionality Reduction Methods
  10.8 Non-linear Canonical Correlation Analysis
  Exercises

11 Time Series
  11.1 Fourier Analysis
    11.1.1 Fourier Series
    11.1.2 Discrete Fourier Transform
    11.1.3 Continuous-Time Fourier Transform
    11.1.4 Discrete-Time Fourier Transform
  11.2 Windows
  11.3 Spectrum
    11.3.1 Effects of Window Functions
    11.3.2 Trend Removal
    11.3.3 Nyquist Frequency and Aliasing
    11.3.4 Smoothing the Spectrum
    11.3.5 Confidence Interval
    11.3.6 Examples
    11.3.7 Fast Fourier Transform
    11.3.8 Relation with Auto-covariance
    11.3.9 Rotary Spectrum for 2-D Vectors
    11.3.10 Wavelets
  11.4 Cross-Spectrum
  11.5 Filtering
    11.5.1 Periodic Signals
    11.5.2 Ideal Filters
    11.5.3 Finite Impulse Response Filters
  11.6 Averaging
    11.6.1 Moving Average Filters
    11.6.2 Grid-Scale Noise
    11.6.3 Linearization from Time-Averaging
  11.7 Singular Spectrum Analysis
    11.7.1 Multichannel Singular Spectrum Analysis
  11.8 Auto-regressive Process
    11.8.1 AR(p) Process
    11.8.2 AR(1) Process
    11.8.3 AR(2) Process
  11.9 Box-Jenkins Models
    11.9.1 Moving Average (MA) Process
    11.9.2 Auto-regressive Moving Average (ARMA) Model
    11.9.3 Auto-regressive Integrated Moving Average (ARIMA) Model
  Exercises

12 Classification
  12.1 Linear Discriminant Analysis
    12.1.1 Fisher Linear Discriminant
  12.2 Logistic Regression
    12.2.1 Multiclass Logistic Regression
  12.3 Naive Bayes Classifier
  12.4 K-nearest Neighbours
  12.5 Extreme Learning Machine Classifier
  12.6 Cross-Entropy
  12.7 Multi-layer Perceptron Classifier
  12.8 Class Imbalance
  Exercises

13 Kernel Methods
  13.1 From Neural Networks to Kernel Methods
  13.2 Primal and Dual Solutions for Linear Regression
  13.3 Kernels
  13.4 Kernel Ridge Regression
  13.5 Advantages and Disadvantages
  13.6 Pre-image Problem
  13.7 Support Vector Machines (SVM)
    13.7.1 Linearly Separable Case
    13.7.2 Linearly Non-separable Case
    13.7.3 Non-linear Classification by SVM
    13.7.4 Multi-class Classification by SVM
    13.7.5 Support Vector Regression
  13.8 Gaussian Processes
    13.8.1 Learning the Hyperparameters
    13.8.2 Other Common Kernels
  13.9 Kernel Principal Component Analysis
  Exercises

14 Decision Trees, Random Forests and Boosting
  14.1 Classification and Regression Trees (CART)
    14.1.1 Relative Importance of Predictors
    14.1.2 Surrogate Splits
  14.2 Random Forests
    14.2.1 Extremely Randomized Trees (Extra Trees)
  14.3 Boosting
    14.3.1 Gradient Boosting
  Exercises

15 Deep Learning
  15.1 Transfer Learning
  15.2 Convolutional Neural Network
    15.2.1 Convolution Operation
    15.2.2 Pooling
    15.2.3 AlexNet
    15.2.4 Residual Neural Network (ResNet)
    15.2.5 Data Augmentation
    15.2.6 Applications in Environmental Science
  15.3 Encoder-Decoder Network
    15.3.1 U-net
  15.4 Time Series
    15.4.1 Long Short-Term Memory (LSTM) Network
    15.4.2 Temporal Convolutional Network
  15.5 Generative Adversarial Network
  Exercises

16 Forecast Verification and Post-processing
  16.1 Binary Classes
    16.1.1 Skill Scores for Binary Classes
  16.2 Multiple Classes
  16.3 Probabilistic Forecasts for Binary Classes
    16.3.1 Reliability Diagram
  16.4 Probabilistic Forecasts for Multiple Classes
  16.5 Continuous Variables
    16.5.1 Forecast Scores
    16.5.2 Skill Scores
  16.6 Probabilistic Forecasts for Continuous Variables
  16.7 Minimizing Loss
  16.8 Spurious Skills
  16.9 Extrapolation
  16.10 Post-processing
  16.11 Downscaling
    16.11.1 Reduced Variance
  Exercises

17 Merging of Machine Learning and Physics
  17.1 Physics Emulation and Hybrid Models
    17.1.1 Radiation in Atmospheric Models
    17.1.2 Clouds
    17.1.3 Turbulent Fluxes
    17.1.4 Hybrid Coupled Atmosphere-Ocean Modelling
    17.1.5 Wind Wave Modelling
  17.2 Physics-Informed Machine Learning
    17.2.1 Soft Constraint
    17.2.2 Hard Constraint
  17.3 Data Assimilation and ML
    17.3.1 3D-Var
    17.3.2 4D-Var
    17.3.3 Neural Networks in 4D-Var
  Exercises

Appendices
  A Trends in Terminology
  B Lagrange Multipliers

References
Index
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | Hsieh, William Wei 1955- |
author_GND | (DE-588)139904557 |
author_facet | Hsieh, William Wei 1955- |
author_role | aut |
author_sort | Hsieh, William Wei 1955- |
author_variant | w w h ww wwh |
building | Verbundindex |
bvnumber | BV048860216 |
classification_rvk | WC 7786 ST 630 WC 7700 WI 1500 |
ctrlnum | (OCoLC)1370400344 (DE-599)BVBBV048860216 |
discipline | Biologie Informatik |
discipline_str_mv | Biologie Informatik |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>00000nam a2200000 c 4500</leader><controlfield tag="001">BV048860216</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20240409</controlfield><controlfield tag="007">t|</controlfield><controlfield tag="008">230315s2023 xx a||| b||| 00||| eng d</controlfield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9781107065550</subfield><subfield code="c">hardback</subfield><subfield code="9">978-1-107-06555-0</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">1107065550</subfield><subfield code="c">hardback</subfield><subfield code="9">1-107-06555-0</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)1370400344</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)BVBBV048860216</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">rda</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-355</subfield><subfield code="a">DE-188</subfield><subfield code="a">DE-1050</subfield><subfield code="a">DE-11</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">WC 7786</subfield><subfield code="0">(DE-625)164864:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 630</subfield><subfield code="0">(DE-625)143685:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">WC 7700</subfield><subfield code="0">(DE-625)148144:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">WI 1500</subfield><subfield 
code="0">(DE-625)148757:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Hsieh, William Wei</subfield><subfield code="d">1955-</subfield><subfield code="e">Verfasser</subfield><subfield code="0">(DE-588)139904557</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Introduction to environmental data science</subfield><subfield code="c">William W. Hsieh, University of British Columbia</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Cambridge, United Kingdom ; New York, USA</subfield><subfield code="b">Cambridge University Press</subfield><subfield code="c">2023</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">xx, 627 Seiten</subfield><subfield code="b">Illustrationen, Diagramme</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1="3" ind2=" "><subfield code="a">"Statistical and machine learning methods have many applications in the environmental sciences, including prediction and data analysis in meteorology, hydrology and oceanography; pattern recognition for satellite images from remote sensing; management ofagriculture and forests; assessment of climate change; and much more. With rapid advances in machine learning in the last decade, this book provides an urgently needed, comprehensive guide to machine learning and statistics for students and researchers interested in environmental data science. 
It includes intuitive explanations covering the relevant background mathematics, with examples drawn from the environmental sciences. A broad range of topics are covered, including correlation, regression, classification, clustering, neural networks, random forests, boosting, kernel methods, evolutionary algorithms and deep learning, as well as the recent merging of machine learning and physics. End-of-chapter exercises allow readers to develop their problem-solving skills, and online datasets allow readers to practise analysis of real data. William W. Hsieh is a professor emeritus in the Department of Earth, Ocean and Atmospheric Sciences at the University of British Columbia. Known as a pioneer in introducing machine learning to environmental science, he has written over 100 peer-reviewed journal papers on climate variability, machine learning, atmospheric science, oceanography, hydrology and agricultural science. He is the author of the book Machine Learning Methods in the Environmental Sciences (2009, Cambridge University Press), the first single-authored textbook on machine learning for environmental scientists. 
Currently retired in Victoria, British Columbia, he enjoys growing organic vegetables"--</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">A comprehensive guide to machine learning and statistics for students and researchers of environmental data science.</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Klimaänderung</subfield><subfield code="0">(DE-588)4164199-1</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Deep Learning</subfield><subfield code="0">(DE-588)1135597375</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Datenanalyse</subfield><subfield code="0">(DE-588)4123037-1</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Data Science</subfield><subfield code="0">(DE-588)1140936166</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Umweltwissenschaften</subfield><subfield code="0">(DE-588)4137364-9</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Statistik</subfield><subfield code="0">(DE-588)4056995-0</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Maschinelles Lernen</subfield><subfield code="0">(DE-588)4193754-5</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Neuronale Netzwerke</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Environmental sciences / Data 
processing</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Environmental protection / Data processing</subfield></datafield><datafield tag="653" ind1=" " ind2="0"><subfield code="a">Environmental management / Data processing</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Mustererkennung für Fernerkundungssatellitenbilder</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Korrelation und Regression in den Umweltwissenschaften</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Maschinelles Lernen</subfield><subfield code="0">(DE-588)4193754-5</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="1"><subfield code="a">Deep Learning</subfield><subfield code="0">(DE-588)1135597375</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="2"><subfield code="a">Statistik</subfield><subfield code="0">(DE-588)4056995-0</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="3"><subfield code="a">Datenanalyse</subfield><subfield code="0">(DE-588)4123037-1</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="4"><subfield code="a">Umweltwissenschaften</subfield><subfield code="0">(DE-588)4137364-9</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="5"><subfield code="a">Klimaänderung</subfield><subfield code="0">(DE-588)4164199-1</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="689" ind1="1" ind2="0"><subfield code="a">Data Science</subfield><subfield code="0">(DE-588)1140936166</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="1" ind2="1"><subfield code="a">Maschinelles Lernen</subfield><subfield code="0">(DE-588)4193754-5</subfield><subfield 
code="D">s</subfield></datafield><datafield tag="689" ind1="1" ind2="2"><subfield code="a">Umweltwissenschaften</subfield><subfield code="0">(DE-588)4137364-9</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="1" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="776" ind1="0" ind2="8"><subfield code="i">Erscheint auch als</subfield><subfield code="n">Online-Ausgabe</subfield><subfield code="z">978-1-107-58849-3</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">Digitalisierung UB Regensburg - ADAM Catalogue Enrichment</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=034125345&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="943" ind1="1" ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-034125345</subfield></datafield></record></collection> |
id | DE-604.BV048860216 |
illustrated | Illustrated |
index_date | 2024-07-03T21:41:57Z |
indexdate | 2025-02-13T09:00:47Z |
institution | BVB |
isbn | 9781107065550 1107065550 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-034125345 |
oclc_num | 1370400344 |
open_access_boolean | |
owner | DE-355 DE-BY-UBR DE-188 DE-1050 DE-11 |
owner_facet | DE-355 DE-BY-UBR DE-188 DE-1050 DE-11 |
physical | xx, 627 Seiten Illustrationen, Diagramme |
publishDate | 2023 |
publishDateSearch | 2023 |
publishDateSort | 2023 |
publisher | Cambridge University Press |
record_format | marc |
spelling | Hsieh, William Wei 1955- Verfasser (DE-588)139904557 aut Introduction to environmental data science William W. Hsieh, University of British Columbia Cambridge, United Kingdom ; New York, USA Cambridge University Press 2023 xx, 627 Seiten Illustrationen, Diagramme txt rdacontent n rdamedia nc rdacarrier "Statistical and machine learning methods have many applications in the environmental sciences, including prediction and data analysis in meteorology, hydrology and oceanography; pattern recognition for satellite images from remote sensing; management of agriculture and forests; assessment of climate change; and much more. With rapid advances in machine learning in the last decade, this book provides an urgently needed, comprehensive guide to machine learning and statistics for students and researchers interested in environmental data science. It includes intuitive explanations covering the relevant background mathematics, with examples drawn from the environmental sciences. A broad range of topics are covered, including correlation, regression, classification, clustering, neural networks, random forests, boosting, kernel methods, evolutionary algorithms and deep learning, as well as the recent merging of machine learning and physics. End-of-chapter exercises allow readers to develop their problem-solving skills, and online datasets allow readers to practise analysis of real data. William W. Hsieh is a professor emeritus in the Department of Earth, Ocean and Atmospheric Sciences at the University of British Columbia. Known as a pioneer in introducing machine learning to environmental science, he has written over 100 peer-reviewed journal papers on climate variability, machine learning, atmospheric science, oceanography, hydrology and agricultural science. He is the author of the book Machine Learning Methods in the Environmental Sciences (2009, Cambridge University Press), the first single-authored textbook on machine learning for environmental scientists. 
Currently retired in Victoria, British Columbia, he enjoys growing organic vegetables"-- A comprehensive guide to machine learning and statistics for students and researchers of environmental data science. Klimaänderung (DE-588)4164199-1 gnd rswk-swf Deep Learning (DE-588)1135597375 gnd rswk-swf Datenanalyse (DE-588)4123037-1 gnd rswk-swf Data Science (DE-588)1140936166 gnd rswk-swf Umweltwissenschaften (DE-588)4137364-9 gnd rswk-swf Statistik (DE-588)4056995-0 gnd rswk-swf Maschinelles Lernen (DE-588)4193754-5 gnd rswk-swf Neuronale Netzwerke Environmental sciences / Data processing Environmental protection / Data processing Environmental management / Data processing Mustererkennung für Fernerkundungssatellitenbilder Korrelation und Regression in den Umweltwissenschaften Maschinelles Lernen (DE-588)4193754-5 s Deep Learning (DE-588)1135597375 s Statistik (DE-588)4056995-0 s Datenanalyse (DE-588)4123037-1 s Umweltwissenschaften (DE-588)4137364-9 s Klimaänderung (DE-588)4164199-1 s DE-604 Data Science (DE-588)1140936166 s Erscheint auch als Online-Ausgabe 978-1-107-58849-3 Digitalisierung UB Regensburg - ADAM Catalogue Enrichment application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=034125345&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Hsieh, William Wei 1955- Introduction to environmental data science Klimaänderung (DE-588)4164199-1 gnd Deep Learning (DE-588)1135597375 gnd Datenanalyse (DE-588)4123037-1 gnd Data Science (DE-588)1140936166 gnd Umweltwissenschaften (DE-588)4137364-9 gnd Statistik (DE-588)4056995-0 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
subject_GND | (DE-588)4164199-1 (DE-588)1135597375 (DE-588)4123037-1 (DE-588)1140936166 (DE-588)4137364-9 (DE-588)4056995-0 (DE-588)4193754-5 |
title | Introduction to environmental data science |
title_auth | Introduction to environmental data science |
title_exact_search | Introduction to environmental data science |
title_exact_search_txtP | Introduction to environmental data science |
title_full | Introduction to environmental data science William W. Hsieh, University of British Columbia |
title_fullStr | Introduction to environmental data science William W. Hsieh, University of British Columbia |
title_full_unstemmed | Introduction to environmental data science William W. Hsieh, University of British Columbia |
title_short | Introduction to environmental data science |
title_sort | introduction to environmental data science |
topic | Klimaänderung (DE-588)4164199-1 gnd Deep Learning (DE-588)1135597375 gnd Datenanalyse (DE-588)4123037-1 gnd Data Science (DE-588)1140936166 gnd Umweltwissenschaften (DE-588)4137364-9 gnd Statistik (DE-588)4056995-0 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
topic_facet | Klimaänderung Deep Learning Datenanalyse Data Science Umweltwissenschaften Statistik Maschinelles Lernen |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=034125345&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT hsiehwilliamwei introductiontoenvironmentaldatascience |