Training Systems Using Python Statistical Modeling : Explore Popular Techniques for Modeling Your Data in Python.
This book will acquaint you with various aspects of statistical analysis in Python. You will work with different types of prediction models, such as decision trees, random forests and neural networks. By the end of this book, you will be confident in using various Python packages to train your own models for effective machine learning.
Saved in:
Main author: | Miller, Curtis |
---|---|
Format: | Electronic eBook |
Language: | English |
Published: | Birmingham : Packt Publishing, Limited, 2019. |
Subjects: | Python (Computer program language); Graphical modeling (Statistics) |
Online access: | Full text |
Summary: | This book will acquaint you with various aspects of statistical analysis in Python. You will work with different types of prediction models, such as decision trees, random forests and neural networks. By the end of this book, you will be confident in using various Python packages to train your own models for effective machine learning. |
Description: | The silhouette method |
Description: | 1 online resource (284 pages) |
ISBN: | 1838820647 9781838820640 |
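The description note above singles out the silhouette method, one of the cluster-evaluation topics with which the book's contents close (see the Chapter 6 contents fields in the MARC record below). As a rough, hypothetical illustration of that kind of workflow, assuming scikit-learn and not taken from the book itself, a minimal sketch might look like this:

```python
# Hypothetical sketch: scoring k-means clusterings of the iris data with the
# silhouette coefficient, one of the evaluation methods named in the contents.
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.metrics import silhouette_score

X = load_iris().data

for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    # A mean silhouette closer to 1 indicates tighter, better-separated clusters.
    print(f"k={k}: silhouette={silhouette_score(X, labels):.3f}")
```

Higher average silhouette values point to a better choice of k; the contents list the elbow method alongside it for the same purpose.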
Internal format
MARC
LEADER | 00000cam a2200000Mi 4500 | ||
---|---|---|---|
001 | ZDB-4-EBA-on1102478382 | ||
003 | OCoLC | ||
005 | 20241004212047.0 | ||
006 | m o d | ||
007 | cr cnu---unuuu | ||
008 | 190622s2019 enk o 000 0 eng d | ||
040 | |a EBLCP |b eng |e pn |c EBLCP |d TEFOD |d N$T |d OCLCO |d OCLCF |d EBLCP |d OCLCQ |d YDX |d UKAHL |d OCLCQ |d NLW |d OCLCO |d NZAUC |d OCLCQ |d OCLCO |d TMA |d OCLCQ | ||
019 | |a 1102476476 | ||
020 | |a 1838820647 | ||
020 | |a 9781838820640 |q (electronic bk.) | ||
035 | |a (OCoLC)1102478382 |z (OCoLC)1102476476 | ||
037 | |a DE8DD0F1-F2A7-45ED-93BF-FAD5E156EC76 |b OverDrive, Inc. |n http://www.overdrive.com | ||
050 | 4 | |a QA76.73.P98 | |
072 | 7 | |a COM |x 051360 |2 bisacsh | |
082 | 7 | |a 005.133 |2 23 | |
049 | |a MAIN | ||
100 | 1 | |a Miller, Curtis. | |
245 | 1 | 0 | |a Training Systems Using Python Statistical Modeling : |b Explore Popular Techniques for Modeling Your Data in Python. |
260 | |a Birmingham : |b Packt Publishing, Limited, |c 2019. | ||
300 | |a 1 online resource (284 pages) | ||
336 | |a text |b txt |2 rdacontent | ||
337 | |a computer |b c |2 rdamedia | ||
338 | |a online resource |b cr |2 rdacarrier | ||
588 | 0 | |a Print version record. | |
505 | 0 | |a Cover; Title Page; Copyright and Credits; About Packt; Contributors; Table of Contents; Preface; Chapter 1: Classical Statistical Analysis; Technical requirements; Computing descriptive statistics; Preprocessing the data; Computing basic statistics; Classical inference for proportions; Computing confidence intervals for proportions; Hypothesis testing for proportions; Testing for common proportions; Classical inference for means; Computing confidence intervals for means; Hypothesis testing for means; Testing with two samples; One-way analysis of variance (ANOVA); Diving into Bayesian analysis | |
505 | 8 | |a How Bayesian analysis worksUsing Bayesian analysis to solve a hit-and-run; Bayesian analysis for proportions; Conjugate priors for proportions; Credible intervals for proportions; Bayesian hypothesis testing for proportions; Comparing two proportions; Bayesian analysis for means; Credible intervals for means; Bayesian hypothesis testing for means; Testing with two samples; Finding correlations; Testing for correlation; Summary; Chapter 2: Introduction to Supervised Learning; Principles of machine learning; Checking the variables using the iris dataset; The goal of supervised learning | |
505 | 8 | |a Training modelsIssues in training supervised learning models; Splitting data; Cross-validation; Evaluating models; Accuracy; Precision; Recall; F1 score; Classification report; Bayes factor; Summary; Chapter 3: Binary Prediction Models; K-nearest neighbors classifier; Training a kNN classifier; Hyperparameters in kNN classifiers; Decision trees; Fitting the decision tree; Visualizing the tree; Restricting tree depth; Random forests; Optimizing hyperparameters; Naive Bayes classifier; Preprocessing the data; Training the classifier; Support vector machines; Training a SVM; Logistic regression | |
505 | 8 | |a Fitting a logit modelExtending beyond binary classifiers; Multiple outcomes for decision trees; Multiple outcomes for random forests; Multiple outcomes for Naive Bayes; One-versus-all and one-versus-one classification; Summary; Chapter 4: Regression Analysis and How to Use It; Linear models; Fitting a linear model with OLS; Performing cross-validation; Evaluating linear models; Using AIC to pick models; Bayesian linear models; Choosing a polynomial; Performing Bayesian regression; Ridge regression; Finding the right alpha value; LASSO regression; Spline interpolation | |
505 | 8 | |a Using SciPy for interpolation2D interpolation; Summary; Chapter 5: Neural Networks; An introduction to perceptrons; Neural networks; The structure of a neural network; Types of neural networks; The MLP model; MLPs for classification; Optimization techniques; Training the network; Fitting an MLP to the iris dataset; Fitting an MLP to the digits dataset; MLP for regression; Summary; Chapter 6: Clustering Techniques; Introduction to clustering; Computing distances; Exploring the k-means algorithm; Clustering the iris dataset; Compressing images with k-means; Evaluating clusters; The elbow method | |
500 | |a The silhouette method | ||
520 | |a This book will acquaint you with various aspects of statistical analysis in Python. You will work with different types of prediction models, such as decision trees, random forests and neural networks. By the end of this book, you will be confident in using various Python packages to train your own models for effective machine learning. | ||
650 | 0 | |a Python (Computer program language) |0 http://id.loc.gov/authorities/subjects/sh96008834 | |
650 | 0 | |a Graphical modeling (Statistics) |0 http://id.loc.gov/authorities/subjects/sh95000210 | |
650 | 6 | |a Python (Langage de programmation) | |
650 | 6 | |a Modèles graphiques (Statistique) | |
650 | 7 | |a Programming & scripting languages: general. |2 bicssc | |
650 | 7 | |a Database design & theory. |2 bicssc | |
650 | 7 | |a Information architecture. |2 bicssc | |
650 | 7 | |a Data capture & analysis. |2 bicssc | |
650 | 7 | |a COMPUTERS |x Programming Languages |x Python. |2 bisacsh | |
650 | 7 | |a Graphical modeling (Statistics) |2 fast | |
650 | 7 | |a Python (Computer program language) |2 fast | |
776 | 0 | 8 | |i Print version: |a Miller, Curtis. |t Training Systems Using Python Statistical Modeling : Explore Popular Techniques for Modeling Your Data in Python. |d Birmingham : Packt Publishing, Limited, ©2019 |z 9781838823733 |
856 | 4 | 0 | |l FWS01 |p ZDB-4-EBA |q FWS_PDA_EBA |u https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=2142584 |3 Volltext |
938 | |a Askews and Holts Library Services |b ASKH |n AH36312575 | ||
938 | |a ProQuest Ebook Central |b EBLB |n EBL5778835 | ||
938 | |a EBSCOhost |b EBSC |n 2142584 | ||
938 | |a YBP Library Services |b YANK |n 300558430 | ||
994 | |a 92 |b GEBAY | ||
912 | |a ZDB-4-EBA | ||
049 | |a DE-863 |
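The contents fields above open with classical inference for proportions (Chapter 1). As a small, hypothetical sketch of that kind of computation, assuming statsmodels is available and using invented sample counts rather than figures from the book:

```python
# Hypothetical sketch: a classical 95% confidence interval for a proportion,
# one of the Chapter 1 topics in the contents above. The counts are made up.
from statsmodels.stats.proportion import proportion_confint

successes, trials = 58, 100
low, high = proportion_confint(successes, trials, alpha=0.05, method="wilson")
print(f"95% CI for the proportion: ({low:.3f}, {high:.3f})")
```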
Record in the search index
DE-BY-FWS_katkey | ZDB-4-EBA-on1102478382 |
---|---|
_version_ | 1816882493273407488 |
adam_text | |
any_adam_object | |
author | Miller, Curtis |
author_facet | Miller, Curtis |
author_role | |
author_sort | Miller, Curtis |
author_variant | c m cm |
building | Verbundindex |
bvnumber | localFWS |
callnumber-first | Q - Science |
callnumber-label | QA76 |
callnumber-raw | QA76.73.P98 |
callnumber-search | QA76.73.P98 |
callnumber-sort | QA 276.73 P98 |
callnumber-subject | QA - Mathematics |
collection | ZDB-4-EBA |
contents | Cover; Title Page; Copyright and Credits; About Packt; Contributors; Table of Contents; Preface; Chapter 1: Classical Statistical Analysis; Technical requirements; Computing descriptive statistics; Preprocessing the data; Computing basic statistics; Classical inference for proportions; Computing confidence intervals for proportions; Hypothesis testing for proportions; Testing for common proportions; Classical inference for means; Computing confidence intervals for means; Hypothesis testing for means; Testing with two samples; One-way analysis of variance (ANOVA); Diving into Bayesian analysis How Bayesian analysis worksUsing Bayesian analysis to solve a hit-and-run; Bayesian analysis for proportions; Conjugate priors for proportions; Credible intervals for proportions; Bayesian hypothesis testing for proportions; Comparing two proportions; Bayesian analysis for means; Credible intervals for means; Bayesian hypothesis testing for means; Testing with two samples; Finding correlations; Testing for correlation; Summary; Chapter 2: Introduction to Supervised Learning; Principles of machine learning; Checking the variables using the iris dataset; The goal of supervised learning Training modelsIssues in training supervised learning models; Splitting data; Cross-validation; Evaluating models; Accuracy; Precision; Recall; F1 score; Classification report; Bayes factor; Summary; Chapter 3: Binary Prediction Models; K-nearest neighbors classifier; Training a kNN classifier; Hyperparameters in kNN classifiers; Decision trees; Fitting the decision tree; Visualizing the tree; Restricting tree depth; Random forests; Optimizing hyperparameters; Naive Bayes classifier; Preprocessing the data; Training the classifier; Support vector machines; Training a SVM; Logistic regression Fitting a logit modelExtending beyond binary classifiers; Multiple outcomes for decision trees; Multiple outcomes for random forests; Multiple outcomes for Naive Bayes; One-versus-all and one-versus-one classification; Summary; Chapter 4: Regression Analysis and How to Use It; Linear models; Fitting a linear model with OLS; Performing cross-validation; Evaluating linear models; Using AIC to pick models; Bayesian linear models; Choosing a polynomial; Performing Bayesian regression; Ridge regression; Finding the right alpha value; LASSO regression; Spline interpolation Using SciPy for interpolation2D interpolation; Summary; Chapter 5: Neural Networks; An introduction to perceptrons; Neural networks; The structure of a neural network; Types of neural networks; The MLP model; MLPs for classification; Optimization techniques; Training the network; Fitting an MLP to the iris dataset; Fitting an MLP to the digits dataset; MLP for regression; Summary; Chapter 6: Clustering Techniques; Introduction to clustering; Computing distances; Exploring the k-means algorithm; Clustering the iris dataset; Compressing images with k-means; Evaluating clusters; The elbow method |
ctrlnum | (OCoLC)1102478382 |
dewey-full | 005.133 |
dewey-hundreds | 000 - Computer science, information, general works |
dewey-ones | 005 - Computer programming, programs, data, security |
dewey-raw | 005.133 |
dewey-search | 005.133 |
dewey-sort | 15.133 |
dewey-tens | 000 - Computer science, information, general works |
discipline | Informatik |
format | Electronic eBook |
id | ZDB-4-EBA-on1102478382 |
illustrated | Not Illustrated |
indexdate | 2024-11-27T13:29:29Z |
institution | BVB |
isbn | 1838820647 9781838820640 |
language | English |
oclc_num | 1102478382 |
open_access_boolean | |
owner | MAIN DE-863 DE-BY-FWS |
owner_facet | MAIN DE-863 DE-BY-FWS |
physical | 1 online resource (284 pages) |
psigel | ZDB-4-EBA |
publishDate | 2019 |
publishDateSearch | 2019 |
publishDateSort | 2019 |
publisher | Packt Publishing, Limited, |
record_format | marc |
subject_GND | http://id.loc.gov/authorities/subjects/sh96008834 http://id.loc.gov/authorities/subjects/sh95000210 |
title | Training Systems Using Python Statistical Modeling : Explore Popular Techniques for Modeling Your Data in Python. |
title_auth | Training Systems Using Python Statistical Modeling : Explore Popular Techniques for Modeling Your Data in Python. |
title_exact_search | Training Systems Using Python Statistical Modeling : Explore Popular Techniques for Modeling Your Data in Python. |
title_full | Training Systems Using Python Statistical Modeling : Explore Popular Techniques for Modeling Your Data in Python. |
title_fullStr | Training Systems Using Python Statistical Modeling : Explore Popular Techniques for Modeling Your Data in Python. |
title_full_unstemmed | Training Systems Using Python Statistical Modeling : Explore Popular Techniques for Modeling Your Data in Python. |
title_short | Training Systems Using Python Statistical Modeling : |
title_sort | training systems using python statistical modeling explore popular techniques for modeling your data in python |
title_sub | Explore Popular Techniques for Modeling Your Data in Python. |
topic | Python (Computer program language) http://id.loc.gov/authorities/subjects/sh96008834 Graphical modeling (Statistics) http://id.loc.gov/authorities/subjects/sh95000210 Python (Langage de programmation) Modèles graphiques (Statistique) Programming & scripting languages: general. bicssc Database design & theory. bicssc Information architecture. bicssc Data capture & analysis. bicssc COMPUTERS Programming Languages Python. bisacsh Graphical modeling (Statistics) fast Python (Computer program language) fast |
topic_facet | Python (Computer program language) Graphical modeling (Statistics) Python (Langage de programmation) Modèles graphiques (Statistique) Programming & scripting languages: general. Database design & theory. Information architecture. Data capture & analysis. COMPUTERS Programming Languages Python. |
url | https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=2142584 |
work_keys_str_mv | AT millercurtis trainingsystemsusingpythonstatisticalmodelingexplorepopulartechniquesformodelingyourdatainpython |