Python for probability, statistics, and machine learning
Saved in:
Author: | Unpingco, José 1969- |
---|---|
Format: | Book |
Language: | English |
Published: | Cham : Springer, [2019] |
Edition: | Second edition |
Subjects: | Stochastik; Maschinelles Lernen; Python (Programmiersprache) |
Online access: | Table of contents |
Summary: | Introduction -- Part 1 Getting Started with Scientific Python -- Installation and Setup -- Numpy -- Matplotlib -- Ipython -- Jupyter Notebook -- Scipy -- Pandas -- Sympy -- Interfacing with Compiled Libraries -- Integrated Development Environments -- Quick Guide to Performance and Parallel Programming -- Other Resources -- Part 2 Probability -- Introduction -- Projection Methods -- Conditional Expectation as Projection -- Conditional Expectation and Mean Squared Error -- Worked Examples of Conditional Expectation and Mean Square Error Optimization -- Useful Distributions -- Information Entropy -- Moment Generating Functions -- Monte Carlo Sampling Methods -- Useful Inequalities -- Part 3 Statistics -- Python Modules for Statistics -- Types of Convergence -- Estimation Using Maximum Likelihood -- Hypothesis Testing and P-Values -- Confidence Intervals -- Linear Regression -- Maximum A-Posteriori -- Robust Statistics -- Bootstrapping -- Gauss Markov -- Nonparametric Methods -- Survival Analysis -- Part 4 Machine Learning -- Introduction -- Python Machine Learning Modules -- Theory of Learning -- Decision Trees -- Boosting Trees -- Logistic Regression -- Generalized Linear Models -- Regularization -- Support Vector Machines -- Dimensionality Reduction -- Clustering -- Ensemble Methods -- Deep Learning -- Notation -- References -- Index. This book, fully updated for Python version 3.6+, covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. All the figures and numerical results are reproducible using the Python codes provided. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Detailed proofs for certain important results are also provided. Modern Python modules like Pandas, Sympy, Scikit-learn, Tensorflow, and Keras are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This updated edition now includes the Fisher Exact Test and the Mann-Whitney-Wilcoxon Test. A new section on survival analysis has been included as well as substantial development of Generalized Linear Models. The new deep learning section for image processing includes an in-depth discussion of gradient descent methods that underpin all deep learning algorithms. As with the prior edition, there are new and updated *Programming Tips* that illustrate effective Python modules and methods for scientific programming and machine learning. There are 445 runnable code blocks with corresponding outputs that have been tested for accuracy. Over 158 graphical visualizations (almost all generated using Python) illustrate the concepts that are developed both in code and in mathematics. We also discuss and use key Python modules such as Numpy, Scikit-learn, Sympy, Scipy, Lifelines, CvxPy, Theano, Matplotlib, Pandas, Tensorflow, Statsmodels, and Keras. (A minimal code sketch of the cross-validation and regularization workflow described here follows this table.) |
Physical description: | xiv, 384 pages : illustrations, diagrams ; 25 cm |
ISBN: | 9783030185442 |
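The summary above notes that scikit-learn and related modules are used to simulate and visualize cross-validation, regularization, and the bias/variance trade-off. The sketch below is not taken from the book; it is a minimal illustration of that kind of experiment, assuming synthetic data from `make_regression` and arbitrarily chosen regularization strengths.

```python
# Minimal sketch (not from the book): cross-validated error of ridge
# regression at several regularization strengths on synthetic data.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Synthetic data with many uninformative features, so regularization matters.
X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=10.0, random_state=0)

for alpha in (0.01, 1.0, 100.0):
    # 5-fold cross-validation; scikit-learn reports negative MSE for scoring.
    scores = cross_val_score(Ridge(alpha=alpha), X, y, cv=5,
                             scoring="neg_mean_squared_error")
    print(f"alpha={alpha:>6}: mean CV MSE = {-scores.mean():.1f}")
```

Sweeping `alpha` across several orders of magnitude is a standard way to expose the bias/variance trade-off: very small values fit the noise (high variance), very large values over-smooth (high bias).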
Internal format
MARC
LEADER | 00000nam a2200000 c 4500 | ||
---|---|---|---|
001 | BV047581782 | ||
003 | DE-604 | ||
005 | 20211209 | ||
007 | t | ||
008 | 211110s2019 a||| |||| 00||| eng d | ||
020 | |a 9783030185442 |9 978-3-030-18544-2 | ||
035 | |a (OCoLC)1117697334 | ||
035 | |a (DE-599)KXP167693930X | ||
040 | |a DE-604 |b ger |e rda | ||
041 | 0 | |a eng | |
049 | |a DE-355 | ||
050 | 0 | |a TK1-9971 | |
084 | |a ST 250 |0 (DE-625)143626: |2 rvk | ||
084 | |a ST 250 |0 (DE-625)143626: |2 rvk | ||
084 | |a 54.53 |2 bkl | ||
100 | 1 | |a Unpingco, José |d 1969- |e Verfasser |0 (DE-588)1113037261 |4 aut | |
245 | 1 | 0 | |a Python for probability, statistics, and machine learning |c José Unpingco |
250 | |a Second edition | ||
264 | 1 | |a Cham |b Springer |c [2019] | |
264 | 4 | |c © 2019 | |
300 | |a xiv, 384 Seiten |b Illustrationen, Diagramme |c 25 cm | ||
336 | |b txt |2 rdacontent | ||
337 | |b n |2 rdamedia | ||
338 | |b nc |2 rdacarrier | ||
520 | 3 | |a Introduction -- Part 1 Getting Started with Scientific Python -- Installation and Setup -- Numpy -- Matplotlib -- Ipython -- Jupyter Notebook -- Scipy -- Pandas -- Sympy -- Interfacing with Compiled Libraries -- Integrated Development Environments -- Quick Guide to Performance and Parallel Programming -- Other Resources -- Part 2 Probability -- Introduction -- Projection Methods -- Conditional Expectation as Projection -- Conditional Expectation and Mean Squared Error -- Worked Examples of Conditional Expectation and Mean Square Error Optimization -- Useful Distributions -- Information Entropy -- Moment Generating Functions -- Monte Carlo Sampling Methods -- Useful Inequalities -- Part 3 Statistics -- Python Modules for Statistics -- Types of Convergence -- Estimation Using Maximum Likelihood -- Hypothesis Testing and P-Values -- Confidence Intervals -- Linear Regression -- Maximum A-Posteriori -- Robust Statistics -- Bootstrapping -- Gauss Markov -- Nonparametric Methods -- Survival Analysis -- Part 4 Machine Learning -- Introduction -- Python Machine Learning Modules -- Theory of Learning -- Decision Trees -- Boosting Trees -- Logistic Regression -- Generalized Linear Models -- Regularization -- Support Vector Machines -- Dimensionality Reduction -- Clustering -- Ensemble Methods -- Deep Learning -- Notation -- References -- Index | |
520 | 3 | |a This book, fully updated for Python version 3.6+, covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. All the figures and numerical results are reproducible using the Python codes provided. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Detailed proofs for certain important results are also provided. Modern Python modules like Pandas, Sympy, Scikit-learn, Tensorflow, and Keras are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This updated edition now includes the Fisher Exact Test and the Mann-Whitney-Wilcoxon Test. | |
520 | 3 | |a A new section on survival analysis has been included as well as substantial development of Generalized Linear Models. The new deep learning section for image processing includes an in-depth discussion of gradient descent methods that underpin all deep learning algorithms. As with the prior edition, there are new and updated *Programming Tips* that illustrate effective Python modules and methods for scientific programming and machine learning. There are 445 runnable code blocks with corresponding outputs that have been tested for accuracy. Over 158 graphical visualizations (almost all generated using Python) illustrate the concepts that are developed both in code and in mathematics. We also discuss and use key Python modules such as Numpy, Scikit-learn, Sympy, Scipy, Lifelines, CvxPy, Theano, Matplotlib, Pandas, Tensorflow, Statsmodels, and Keras. | |
650 | 0 | 7 | |a Stochastik |0 (DE-588)4121729-9 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Python |g Programmiersprache |0 (DE-588)4434275-5 |2 gnd |9 rswk-swf |
653 | 0 | |a Telecommunication | |
653 | 0 | |a Computer science | |
653 | 0 | |a Engineering mathematics | |
653 | 0 | |a Statistics | |
653 | 0 | |a Data mining | |
653 | 0 | |a Communications Engineering, Networks | |
689 | 0 | 0 | |a Python |g Programmiersprache |0 (DE-588)4434275-5 |D s |
689 | 0 | 1 | |a Stochastik |0 (DE-588)4121729-9 |D s |
689 | 0 | 2 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |D s |
689 | 0 | |5 DE-604 | |
776 | 0 | 8 | |i Erscheint auch als |n Online-Ausgabe |z 978-3-030-18545-9 |
856 | 4 | 2 | |m Digitalisierung UB Regensburg - ADAM Catalogue Enrichment |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=032967158&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis |
999 | |a oai:aleph.bib-bvb.de:BVB01-032967158 |
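The abstract in the 520 fields above singles out the Fisher Exact Test and the Mann-Whitney-Wilcoxon Test as additions in this edition. The sketch below is not code from the book; it only shows how the two tests are typically invoked from scipy.stats, using a made-up 2x2 contingency table and two small made-up samples.

```python
# Minimal sketch (illustrative data): the two hypothesis tests highlighted
# as new in this edition, called through scipy.stats.
from scipy import stats

# Fisher exact test on a 2x2 contingency table of counts.
table = [[8, 2],
         [1, 5]]
odds_ratio, p_fisher = stats.fisher_exact(table)
print(f"Fisher exact test: odds ratio = {odds_ratio:.2f}, p = {p_fisher:.4f}")

# Mann-Whitney-Wilcoxon test on two independent samples.
x = [1.83, 0.50, 1.62, 2.48, 1.68, 1.88, 1.55, 3.06, 1.30]
y = [0.88, 0.65, 0.60, 2.05, 1.06, 1.29, 1.12, 3.14, 1.29]
u_stat, p_mw = stats.mannwhitneyu(x, y, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_mw:.4f}")
```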
Record in the search index
_version_ | 1804182935821090816 |
---|---|
adam_text | Contents
1 Getting Started with Scientific Python ... 1
  1.1 Installation and Setup ... 2
  1.2 Numpy ... 4
    1.2.1 Numpy Arrays and Memory ... 6
    1.2.2 Numpy Matrices ... 9
    1.2.3 Numpy Broadcasting ... 10
    1.2.4 Numpy Masked Arrays ... 13
    1.2.5 Floating-Point Numbers ... 13
    1.2.6 Numpy Optimizations and Prospectus ... 17
  1.3 Matplotlib ... 17
    1.3.1 Alternatives to Matplotlib ... 19
    1.3.2 Extensions to Matplotlib ... 20
  1.4 IPython ... 20
  1.5 Jupyter Notebook ... 22
  1.6 Scipy ... 24
  1.7 Pandas ... 25
    1.7.1 Series ... 25
    1.7.2 Dataframe ... 27
  1.8 Sympy ... 30
  1.9 Interfacing with Compiled Libraries ... 32
  1.10 Integrated Development Environments ... 33
  1.11 Quick Guide to Performance and Parallel Programming ... 34
  1.12 Other Resources ... 37
  References ... 38
2 Probability ... 39
  2.1 Introduction ... 39
    2.1.1 Understanding Probability Density ... 40
    2.1.2 Random Variables ... 41
    2.1.3 Continuous Random Variables ... 46
    2.1.4 Transformation of Variables Beyond Calculus ... 49
    2.1.5 Independent Random Variables ... 51
    2.1.6 Classic Broken Rod Example ... 53
  2.2 Projection Methods ... 55
    2.2.1 Weighted Distance ... 57
  2.3 Conditional Expectation as Projection ... 58
    2.3.1 Appendix ... 64
  2.4 Conditional Expectation and Mean Squared Error ... 65
  2.5 Worked Examples of Conditional Expectation and Mean Square Error Optimization ... 68
    2.5.1 Example ... 69
    2.5.2 Example ... 72
    2.5.3 Example ... 75
    2.5.4 Example ... 78
    2.5.5 Example ... 79
    2.5.6 Example ... 82
  2.6 Useful Distributions ... 83
    2.6.1 Normal Distribution ... 83
    2.6.2 Multinomial Distribution ... 84
    2.6.3 Chi-square Distribution ... 86
    2.6.4 Poisson and Exponential Distributions ... 89
    2.6.5 Gamma Distribution ... 90
    2.6.6 Beta Distribution ... 91
    2.6.7 Dirichlet-Multinomial Distribution ... 93
  2.7 Information Entropy ... 95
    2.7.1 Information Theory Concepts ... 96
    2.7.2 Properties of Information Entropy ... 98
    2.7.3 Kullback-Leibler Divergence ... 99
    2.7.4 Cross-Entropy as Maximum Likelihood ... 100
  2.8 Moment Generating Functions ... 101
  2.9 Monte Carlo Sampling Methods ... 104
    2.9.1 Inverse CDF Method for Discrete Variables ... 105
    2.9.2 Inverse CDF Method for Continuous Variables ... 107
    2.9.3 Rejection Method ... 108
  2.10 Sampling Importance Resampling ... 113
  2.11 Useful Inequalities ... 115
    2.11.1 Markov's Inequality ... 115
    2.11.2 Chebyshev's Inequality ... 116
    2.11.3 Hoeffding's Inequality ... 118
  References ... 120
3 Statistics ... 123
  3.1 Introduction ... 123
  3.2 Python Modules for Statistics ... 124
    3.2.1 Scipy Statistics Module ... 124
    3.2.2 Sympy Statistics Module ... 125
    3.2.3 Other Python Modules for Statistics ... 126
  3.3 Types of Convergence ... 126
    3.3.1 Almost Sure Convergence ... 126
    3.3.2 Convergence in Probability ... 129
    3.3.3 Convergence in Distribution ... 131
    3.3.4 Limit Theorems ... 132
  3.4 Estimation Using Maximum Likelihood ... 133
    3.4.1 Setting Up the Coin-Flipping Experiment ... 135
    3.4.2 Delta Method ... 145
  3.5 Hypothesis Testing and P-Values ... 147
    3.5.1 Back to the Coin-Flipping Example ... 149
    3.5.2 Receiver Operating Characteristic ... 152
    3.5.3 P-Values ... 154
    3.5.4 Test Statistics ... 155
    3.5.5 Testing Multiple Hypotheses ... 163
    3.5.6 Fisher Exact Test ... 163
  3.6 Confidence Intervals ... 166
  3.7 Linear Regression ... 169
    3.7.1 Extensions to Multiple Covariates ... 178
  3.8 Maximum A-Posteriori ... 183
  3.9 Robust Statistics ... 188
  3.10 Bootstrapping ... 195
    3.10.1 Parametric Bootstrap ... 200
  3.11 Gauss-Markov ... 201
  3.12 Nonparametric Methods ... 205
    3.12.1 Kernel Density Estimation ... 205
    3.12.2 Kernel Smoothing ... 207
    3.12.3 Nonparametric Regression Estimators ... 213
    3.12.4 Nearest Neighbors Regression ... 214
    3.12.5 Kernel Regression ... 218
    3.12.6 Curse of Dimensionality ... 219
    3.12.7 Nonparametric Tests ... 221
  3.13 Survival Analysis ... 228
    3.13.1 Example ... 231
  References ... 236
4 Machine Learning ... 237
  4.1 Introduction ... 237
  4.2 Python Machine Learning Modules ... 237
  4.3 Theory of Learning ... 241
    4.3.1 Introduction to Theory of Machine Learning ... 244
    4.3.2 Theory of Generalization ... 249
    4.3.3 Worked Example for Generalization/Approximation Complexity ... 250
    4.3.4 Cross-Validation ... 256
    4.3.5 Bias and Variance ... 260
    4.3.6 Learning Noise ... 265
  4.4 Decision Trees ... 268
    4.4.1 Random Forests ... 275
    4.4.2 Boosting Trees ... 277
  4.5 Boosting Trees ... 281
    4.5.1 Boosting Trees ... 281
  4.6 Logistic Regression ... 285
  4.7 Generalized Linear Models ... 295
  4.8 Regularization ... 300
    4.8.1 Ridge Regression ... 304
    4.8.2 Lasso Regression ... 309
  4.9 Support Vector Machines ... 311
    4.9.1 Kernel Tricks ... 315
  4.10 Dimensionality Reduction ... 317
    4.10.1 Independent Component Analysis ... 321
  4.11 Clustering ... 325
  4.12 Ensemble Methods ... 329
    4.12.1 Bagging ... 329
    4.12.2 Boosting ... 331
  4.13 Deep Learning ... 334
    4.13.1 Introduction to Tensorflow ... 343
    4.13.2 Understanding Gradient Descent ... 350
    4.13.3 Image Processing Using Convolutional Neural Networks ... 363
  References ... 379
Notation ... 381
Index ... 383
|
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | Unpingco, José 1969- |
author_GND | (DE-588)1113037261 |
author_facet | Unpingco, José 1969- |
author_role | aut |
author_sort | Unpingco, José 1969- |
author_variant | j u ju |
building | Verbundindex |
bvnumber | BV047581782 |
callnumber-first | T - Technology |
callnumber-label | TK1-9971 |
callnumber-raw | TK1-9971 |
callnumber-search | TK1-9971 |
callnumber-sort | TK 11 49971 |
callnumber-subject | TK - Electrical and Nuclear Engineering |
classification_rvk | ST 250 |
ctrlnum | (OCoLC)1117697334 (DE-599)KXP167693930X |
discipline | Informatik |
discipline_str_mv | Informatik |
edition | Second edition |
format | Book |
id | DE-604.BV047581782 |
illustrated | Illustrated |
index_date | 2024-07-03T18:33:24Z |
indexdate | 2024-07-10T09:15:27Z |
institution | BVB |
isbn | 9783030185442 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-032967158 |
oclc_num | 1117697334 |
open_access_boolean | |
owner | DE-355 DE-BY-UBR |
owner_facet | DE-355 DE-BY-UBR |
physical | xiv, 384 Seiten Illustrationen, Diagramme 25 cm |
publishDate | 2019 |
publishDateSearch | 2019 |
publishDateSort | 2019 |
publisher | Springer |
record_format | marc |
spellingShingle | Unpingco, José 1969- Python for probability, statistics, and machine learning Stochastik (DE-588)4121729-9 gnd Maschinelles Lernen (DE-588)4193754-5 gnd Python Programmiersprache (DE-588)4434275-5 gnd |
subject_GND | (DE-588)4121729-9 (DE-588)4193754-5 (DE-588)4434275-5 |
title | Python for probability, statistics, and machine learning |
title_auth | Python for probability, statistics, and machine learning |
title_exact_search | Python for probability, statistics, and machine learning |
title_exact_search_txtP | Python for probability, statistics, and machine learning |
title_full | Python for probability, statistics, and machine learning José Unpingco |
title_fullStr | Python for probability, statistics, and machine learning José Unpingco |
title_full_unstemmed | Python for probability, statistics, and machine learning José Unpingco |
title_short | Python for probability, statistics, and machine learning |
title_sort | python for probability statistics and machine learning |
topic | Stochastik (DE-588)4121729-9 gnd Maschinelles Lernen (DE-588)4193754-5 gnd Python Programmiersprache (DE-588)4434275-5 gnd |
topic_facet | Stochastik Maschinelles Lernen Python Programmiersprache |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=032967158&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT unpingcojose pythonforprobabilitystatisticsandmachinelearning |
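Both the abstract and the extracted table of contents above point to gradient descent (Sect. 4.13.2, "Understanding Gradient Descent") as the machinery behind the book's deep learning chapter. The closing sketch below is not the book's code; it is a minimal, self-contained illustration of batch gradient descent on a least-squares objective, with synthetic data and an arbitrarily chosen learning rate and iteration count.

```python
# Minimal sketch (synthetic data, arbitrary hyperparameters): batch gradient
# descent minimizing a mean-squared-error objective for a linear model.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

w = np.zeros(3)    # initial parameter guess
eta = 0.05         # learning rate (step size)
for _ in range(200):
    grad = 2.0 * X.T @ (X @ w - y) / len(y)   # gradient of the MSE
    w -= eta * grad

print("estimated weights:", np.round(w, 3))   # should approach true_w
```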