Hands-On Gradient Boosting with XGBoost and scikit-learn :: Perform accessible machine learning and extreme gradient boosting with Python. /
Saved in:
Main Author: Wade, Corey
Format: Electronic eBook
Language: English
Published: Packt Publishing, 2020.
Subjects: Machine learning; Python (Computer program language)
Online Access: Volltext (full text)
Summary: This practical XGBoost guide will put your Python and scikit-learn knowledge to work by showing you how to build powerful, fine-tuned XGBoost models with impressive speed and accuracy. This book will help you to apply XGBoost's alternative base learners, use unique transformers for model deployment, discover tips from Kaggle masters, and much more!
Description: 1 online resource
ISBN: 1839213809; 9781839213809
Internal format
MARC
LEADER | 00000cam a2200000 a 4500 | ||
---|---|---|---|
001 | ZDB-4-EBA-on1203953302 | ||
003 | OCoLC | ||
005 | 20241004212047.0 | ||
006 | m o d | ||
007 | cr ||||||||||| | ||
008 | 201016s2020 enk o 000 0 eng d | ||
040 | |a UKAHL |b eng |e pn |c UKAHL |d YDX |d N$T |d OCLCO |d OCLCF |d UKMGB |d EBLCP |d OCLCO |d OCLCQ |d KSU |d OCLCQ |d OCLCO |d OCLCL | ||
015 | |a GBC0G8283 |2 bnb | ||
016 | 7 | |a 019991769 |2 Uk | |
019 | |a 1201298605 | ||
020 | |a 1839213809 | ||
020 | |a 9781839213809 |q (electronic bk.) | ||
020 | |z 9781839218354 |q (pbk.) | ||
035 | |a (OCoLC)1203953302 |z (OCoLC)1201298605 | ||
037 | |a 9781839213809 |b Packt Publishing | ||
050 | 4 | |a Q325.5 | |
082 | 7 | |a 006.31 |2 23 | |
049 | |a MAIN | ||
100 | 1 | |a Wade, Corey. | |
245 | 1 | 0 | |a Hands-On Gradient Boosting with XGBoost and scikit-learn : |b Perform accessible machine learning and extreme gradient boosting with Python. / |c Corey Wade. |
260 | |b Packt Publishing, |c 2020. | ||
300 | |a 1 online resource | ||
336 | |a text |b txt |2 rdacontent | ||
337 | |a computer |b c |2 rdamedia | ||
338 | |a online resource |b cr |2 rdacarrier | ||
520 | |a This practical XGBoost guide will put your Python and scikit-learn knowledge to work by showing you how to build powerful, fine-tuned XGBoost models with impressive speed and accuracy. This book will help you to apply XGBoost's alternative base learners, use unique transformers for model deployment, discover tips from Kaggle masters, and much more! | ||
505 | 0 | |a Cover -- Copyright -- About PACKT -- Contributors -- Table of Contents -- Preface -- Section 1: Bagging and Boosting -- Chapter 1: Machine Learning Landscape -- Previewing XGBoost -- What is machine learning? -- Data wrangling -- Dataset 1 -- Bike rentals -- Understanding the data -- Correcting null values -- Predicting regression -- Predicting bike rentals -- Saving data for future use -- Declaring predictor and target columns -- Understanding regression -- Accessing scikit-learn -- Silencing warnings -- Modeling linear regression -- XGBoost -- XGBRegressor -- Cross-validation | |
505 | 8 | |a Predicting classification -- What is classification? -- Dataset 2 -- The census -- Data wrangling -- Logistic regression -- The XGBoost classifier -- Summary -- Chapter 2: Decision Trees in Depth -- Introducing decision trees with XGBoost -- Exploring decision trees -- First decision tree model -- Inside a decision tree -- Contrasting variance and bias -- Tuning decision tree hyperparameters -- Decision Tree regressor -- Hyperparameters in general -- Putting it all together -- Predicting heart disease -- a case study -- Heart Disease dataset -- Decision Tree classifier -- Choosing hyperparameters | |
505 | 8 | |a Narrowing the range -- feature_importances_ -- Summary -- Chapter 3: Bagging with Random Forests -- Technical requirements -- Bagging ensembles -- Ensemble methods -- Bootstrap aggregation -- Exploring random forests -- Random forest classifiers -- Random forest regressors -- Random forest hyperparameters -- oob_score -- n_estimators -- warm_start -- bootstrap -- Verbose -- Decision Tree hyperparameters -- Pushing random forest boundaries -- case study -- Preparing the dataset -- n_estimators -- cross_val_score -- Fine-tuning hyperparameters -- Random forest drawbacks -- Summary | |
505 | 8 | |a Chapter 4: From Gradient Boosting to XGBoost -- Technical requirements -- From bagging to boosting -- Introducing AdaBoost -- Distinguishing gradient boosting -- How gradient boosting works -- Residuals -- Learning how to build gradient boosting models from scratch -- Building a gradient boosting model in scikit-learn -- Modifying gradient boosting hyperparameters -- learning_rate -- Base learner -- subsample -- RandomizedSearchCV -- XGBoost -- Approaching big data -- gradient boosting versus XGBoost -- Introducing the exoplanet dataset -- Preprocessing the exoplanet dataset | |
505 | 8 | |a Building gradient boosting classifiers -- Timing models -- Comparing speed -- Summary -- Section 2: XGBoost -- Chapter 5: XGBoost Unveiled -- Designing XGBoost -- Historical narrative -- Design features -- Analyzing XGBoost parameters -- Learning objective -- Building XGBoost models -- The Iris dataset -- The Diabetes dataset -- Finding the Higgs boson -- case study -- Physics background -- Kaggle competitions -- XGBoost and the Higgs challenge -- Data -- Scoring -- Weights -- The model -- Summary -- Chapter 6: XGBoost Hyperparameters -- Technical requirements -- Preparing data and base models | |
650 | 0 | |a Machine learning. |0 http://id.loc.gov/authorities/subjects/sh85079324 | |
650 | 0 | |a Python (Computer program language) |0 http://id.loc.gov/authorities/subjects/sh96008834 | |
650 | 6 | |a Apprentissage automatique. | |
650 | 6 | |a Python (Langage de programmation) | |
650 | 7 | |a Machine learning |2 fast | |
650 | 7 | |a Python (Computer program language) |2 fast | |
758 | |i has work: |a Hands-On Gradient Boosting with XGBoost and scikit-learn (Text) |1 https://id.oclc.org/worldcat/entity/E39PD3qqgwpBPGRjCVjjPQXJ7w |4 https://id.oclc.org/worldcat/ontology/hasWork | ||
776 | 0 | 8 | |i Print version : |z 9781839218354 |
856 | 4 | 0 | |l FWS01 |p ZDB-4-EBA |q FWS_PDA_EBA |u https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=2655459 |3 Volltext |
938 | |a Askews and Holts Library Services |b ASKH |n AH37757154 | ||
938 | |a ProQuest Ebook Central |b EBLB |n EBL6414484 | ||
938 | |a EBSCOhost |b EBSC |n 2655459 | ||
938 | |a YBP Library Services |b YANK |n 301634247 | ||
994 | |a 92 |b GEBAY | ||
912 | |a ZDB-4-EBA | ||
049 | |a DE-863 |
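The table of contents (the 505 fields above) centers on building boosted models with scikit-learn and XGBoost. As a hypothetical illustration of that workflow (not code from the book itself), the sketch below pairs Chapter 4's topic, building a gradient boosting model in scikit-learn, with Chapter 1's cross-validation step; the synthetic data merely stands in for the book's bike-rentals dataset.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Synthetic regression data standing in for the book's bike-rentals dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.5, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=200)

# A scikit-learn gradient boosting model (Chapter 4) scored with
# 5-fold cross-validation (Chapter 1).
model = GradientBoostingRegressor(n_estimators=100, learning_rate=0.1, max_depth=3)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Mean 5-fold R^2: {scores.mean():.3f}")
```

The book itself works with `XGBRegressor` from the `xgboost` package, which is a drop-in replacement for the estimator in this sketch.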