R deep learning essentials : a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / Mark Hodnett, Joshua F. Wiley.
Saved in:

Main authors: Hodnett, Mark; Wiley, Joshua F.
Format: Electronic eBook
Language: English
Published: Birmingham : Packt Publishing Ltd, 2018.
Edition: Second edition.
Subjects: R (Computer program language); Artificial intelligence; Machine learning; Neural networks (Computer science)
Online access: Full text
Summary: This book demonstrates how to use deep learning in R for machine learning, image classification, and natural language processing. It covers topics such as convolutional networks, recurrent neural networks, transfer learning, and deep learning in the cloud. By the end of this book, you will be able to apply deep learning to real-world projects.
Description: Includes index. Document classification.
Description: 1 online resource
ISBN: 9781788997805; 1788997808
Internal format: MARC
LEADER 00000cam a2200000 i 4500
001    ZDB-4-EBA-on1051140715
003    OCoLC
005    20240705115654.0
006    m o d
007    cr |n|---|||||
008    180908s2018 enk o 001 0 eng d
040    |a EBLCP |b eng |e pn |c EBLCP |d YDX |d TEFOD |d MERUC |d IDB |d OCLCQ |d CHVBK |d N$T |d OCLCF |d ZCU |d OCLCQ |d K6U |d OCLCO |d OCLCQ |d OCLCO |d NZAUC |d OCLCQ |d PSYSI |d OCLCQ |d OCLCO
019    |a 1051075116 |a 1079363066
020    |a 9781788997805
020    |a 1788997808
020    |z 178899289X
020    |z 9781788992893
035    |a (OCoLC)1051140715 |z (OCoLC)1051075116 |z (OCoLC)1079363066
037    |a 1BBFB26C-0C51-4300-AD4D-8E976B58BE8E |b OverDrive, Inc. |n http://www.overdrive.com
050  4 |a QA276.45.R3 |b .H636 2018
072  7 |a MAT |x 003000 |2 bisacsh
072  7 |a MAT |x 029000 |2 bisacsh
082 7  |a 519.502855133 |2 23
049    |a MAIN
100 1  |a Hodnett, Mark, |e author. |0 http://id.loc.gov/authorities/names/nb2008013725
245 10 |a R deep learning essentials : |b a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / |c Mark Hodnett, Joshua F. Wiley.
250    |a Second edition.
260    |a Birmingham : |b Packt Publishing Ltd, |c 2018.
300    |a 1 online resource
336    |a text |b txt |2 rdacontent
337    |a computer |b c |2 rdamedia
338    |a online resource |b cr |2 rdacarrier
500    |a Includes index.
520    |a This book demonstrates how to use deep learning in R for machine learning, image classification, and natural language processing. It covers topics such as convolutional networks, recurrent neural networks, transfer learning, and deep learning in the cloud. By the end of this book, you will be able to apply deep learning to real-world projects.
505 0  |a Cover; Title Page; Copyright and Credits; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Getting Started with Deep Learning; What is deep learning?; A conceptual overview of neural networks; Neural networks as an extension of linear regression; Neural networks as a network of memory cells; Deep neural networks; Some common myths about deep learning; Setting up your R environment; Deep learning frameworks for R; MXNet; Keras; Do I need a GPU (and what is it, anyway)?; Setting up reproducible results; Summary; Chapter 2: Training a Prediction Model; Neural networks in R.
505 8  |a Building neural network models; Generating predictions from a neural network; The problem of overfitting data -- the consequences explained; Use case -- building and applying a neural network; Summary; Chapter 3: Deep Learning Fundamentals; Building neural networks from scratch in R; Neural network web application; Neural network code; Back to deep learning; The symbol, X, y, and ctx parameters; The num.round and begin.round parameters; The optimizer parameter; The initializer parameter; The eval.metric and eval.data parameters; The epoch.end.callback parameter; The array.batch.size parameter.
505 8  |a Using regularization to overcome overfitting; L1 penalty; L1 penalty in action; L2 penalty; L2 penalty in action; Weight decay (L2 penalty in neural networks); Ensembles and model-averaging; Use case -- improving out-of-sample model performance using dropout; Summary; Chapter 4: Training Deep Prediction Models; Getting started with deep feedforward neural networks; Activation functions; Introduction to the MXNet deep learning library; Deep learning layers; Building a deep learning model; Use case -- using MXNet for classification and regression; Data download and exploration.
505 8  |a Preparing the data for our models; The binary classification model; The regression model; Improving the binary classification model; The unreasonable effectiveness of data; Summary; Chapter 5: Image Classification Using Convolutional Neural Networks; CNNs; Convolutional layers; Pooling layers; Dropout; Flatten layers, dense layers, and softmax; Image classification using the MXNet library; Base model (no convolutional layers); LeNet; Classification using the fashion MNIST dataset; References/further reading; Summary; Chapter 6: Tuning and Optimizing Models.
505 8  |a Evaluation metrics and evaluating performance; Types of evaluation metric; Evaluating performance; Data preparation; Different data distributions; Data partition between training, test, and validation sets; Standardization; Data leakage; Data augmentation; Using data augmentation to increase the training data; Test time augmentation; Using data augmentation in deep learning libraries; Tuning hyperparameters; Grid search; Random search; Use case -- using LIME for interpretability; Model interpretability with LIME; Summary; Chapter 7: Natural Language Processing Using Deep Learning.
500    |a Document classification.
588 0  |a Print version record.
650  0 |a R (Computer program language) |0 http://id.loc.gov/authorities/subjects/sh2002004407
650  0 |a Artificial intelligence. |0 http://id.loc.gov/authorities/subjects/sh85008180
650  0 |a Machine learning. |0 http://id.loc.gov/authorities/subjects/sh85079324
650  0 |a Neural networks (Computer science) |0 http://id.loc.gov/authorities/subjects/sh90001937
650  2 |a Artificial Intelligence |0 https://id.nlm.nih.gov/mesh/D001185
650  2 |a Neural Networks, Computer |0 https://id.nlm.nih.gov/mesh/D016571
650  2 |a Machine Learning |0 https://id.nlm.nih.gov/mesh/D000069550
650  6 |a R (Langage de programmation)
650  6 |a Intelligence artificielle.
650  6 |a Apprentissage automatique.
650  6 |a Réseaux neuronaux (Informatique)
650  7 |a artificial intelligence. |2 aat
650  7 |a MATHEMATICS |x Applied. |2 bisacsh
650  7 |a MATHEMATICS |x Probability & Statistics |x General. |2 bisacsh
650  7 |a Artificial intelligence |2 fast
650  7 |a Machine learning |2 fast
650  7 |a Neural networks (Computer science) |2 fast
650  7 |a R (Computer program language) |2 fast
700 1  |a Wiley, Joshua F., |e author. |0 http://id.loc.gov/authorities/names/no2017062342
776 08 |i Print version: |a Hodnett, Mark. |t R Deep Learning Essentials : A Step-By-step Guide to Building Deep Learning Models Using TensorFlow, Keras, and MXNet, 2nd Edition. |d Birmingham : Packt Publishing Ltd, ©2018 |z 9781788992893
856 1  |l FWS01 |p ZDB-4-EBA |q FWS_PDA_EBA |u https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1879523 |3 Full text
856 1  |l CBO01 |p ZDB-4-EBA |q FWS_PDA_EBA |u https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1879523 |3 Full text
938    |a EBL - Ebook Library |b EBLB |n EBL5501083
938    |a EBSCOhost |b EBSC |n 1879523
938    |a YBP Library Services |b YANK |n 15687049
994    |a 92 |b GEBAY
912    |a ZDB-4-EBA
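For anyone who wants to work with the record above programmatically, each variable field in the pipe-delimited display can be split back into its tag, indicator characters, and subfield pairs with a few lines of plain Python. This is a minimal stdlib-only sketch of the display convention used here, not a general MARC parser (binary ISO 2709 or MARCXML records are better handled with a dedicated library such as pymarc); the function name `parse_marc_line` is our own.

```python
import re

def parse_marc_line(line: str):
    """Split one display line of a MARC variable field, written in the
    '|x value' pipe notation used above, into (tag, indicators, subfields)."""
    tag = line[:3]                       # MARC tags are always 3 characters
    rest = line[3:]
    cut = rest.find('|')
    # Whatever sits between the tag and the first subfield marker is the
    # indicator pair; blank indicators simply disappear here.
    indicators = rest[:cut].replace(' ', '') if cut != -1 else ''
    # Each subfield is a one-character code ('a', 'b', '0', '2', ...)
    # followed by its value, running up to the next '|' marker.
    subfields = [(code, value.strip())
                 for code, value in re.findall(r'\|([a-z0-9])\s+([^|]*)', rest)]
    return tag, indicators, subfields

tag, ind, subfields = parse_marc_line(
    "100 1  |a Hodnett, Mark, |e author. "
    "|0 http://id.loc.gov/authorities/names/nb2008013725")
```

Applied to the 100 field, this yields tag `100`, indicator `1`, and the `|a`/`|e`/`|0` subfields as (code, value) tuples.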
Datensatz im Suchindex
DE-BY-FWS_katkey | ZDB-4-EBA-on1051140715 |
---|---|
_version_ | 1813901627995717632 |
adam_text | |
any_adam_object | |
author | Hodnett, Mark Wiley, Joshua F. |
author_GND | http://id.loc.gov/authorities/names/nb2008013725 http://id.loc.gov/authorities/names/no2017062342 |
author_facet | Hodnett, Mark Wiley, Joshua F. |
author_role | aut aut |
author_sort | Hodnett, Mark |
author_variant | m h mh j f w jf jfw |
building | Verbundindex |
bvnumber | localFWS |
callnumber-first | Q - Science |
callnumber-label | QA276 |
callnumber-raw | QA276.45.R3 .H636 2018 |
callnumber-search | QA276.45.R3 .H636 2018 |
callnumber-sort | QA 3276.45 R3 H636 42018 |
callnumber-subject | QA - Mathematics |
collection | ZDB-4-EBA |
contents | Cover; Title Page; Copyright and Credits; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Getting Started with Deep Learning; What is deep learning?; A conceptual overview of neural networks; Neural networks as an extension of linear regression; Neural networks as a network of memory cells; Deep neural networks; Some common myths about deep learning; Setting up your R environment; Deep learning frameworks for R; MXNet; Keras; Do I need a GPU (and what is it, anyway)?; Setting up reproducible results; Summary; Chapter 2: Training a Prediction Model; Neural networks in R. Building neural network modelsGenerating predictions from a neural network; The problem of overfitting data -- the consequences explained; Use case -- building and applying a neural network; Summary; Chapter 3: Deep Learning Fundamentals; Building neural networks from scratch in R; Neural network web application; Neural network code; Back to deep learning; The symbol, X, y, and ctx parameters; The num.round and begin.round parameters; The optimizer parameter; The initializer parameter; The eval.metric and eval.data parameters; The epoch.end.callback parameter; The array.batch.size parameter. Using regularization to overcome overfittingL1 penalty; L1 penalty in action; L2 penalty; L2 penalty in action; Weight decay (L2 penalty in neural networks); Ensembles and model-averaging; Use case -- improving out-of-sample model performance using dropout; Summary; Chapter 4: Training Deep Prediction Models; Getting started with deep feedforward neural networks; Activation functions; Introduction to the MXNet deep learning library; Deep learning layers; Building a deep learning model; Use case -- using MXNet for classification and regression; Data download and exploration. 
Preparing the data for our modelsThe binary classification model; The regression model; Improving the binary classification model; The unreasonable effectiveness of data; Summary; Chapter 5: Image Classification Using Convolutional Neural Networks; CNNs; Convolutional layers; Pooling layers; Dropout; Flatten layers, dense layers, and softmax; Image classification using the MXNet library; Base model (no convolutional layers); LeNet; Classification using the fashion MNIST dataset; References/further reading; Summary; Chapter 6: Tuning and Optimizing Models. Evaluation metrics and evaluating performanceTypes of evaluation metric; Evaluating performance; Data preparation; Different data distributions; Data partition between training, test, and validation sets; Standardization; Data leakage; Data augmentation; Using data augmentation to increase the training data; Test time augmentation; Using data augmentation in deep learning libraries; Tuning hyperparameters; Grid search; Random search; Use case-using LIME for interpretability; Model interpretability with LIME; Summary; Chapter 7: Natural Language Processing Using Deep Learning. |
ctrlnum | (OCoLC)1051140715 |
dewey-full | 519.502855133 |
dewey-hundreds | 500 - Natural sciences and mathematics |
dewey-ones | 519 - Probabilities and applied mathematics |
dewey-raw | 519.502855133 |
dewey-search | 519.502855133 |
dewey-sort | 3519.502855133 |
dewey-tens | 510 - Mathematics |
discipline | Mathematik |
edition | Second edition. |
format | Electronic eBook |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>06594cam a2200757 i 4500</leader><controlfield tag="001">ZDB-4-EBA-on1051140715</controlfield><controlfield tag="003">OCoLC</controlfield><controlfield tag="005">20240705115654.0</controlfield><controlfield tag="006">m o d </controlfield><controlfield tag="007">cr |n|---|||||</controlfield><controlfield tag="008">180908s2018 enk o 001 0 eng d</controlfield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">EBLCP</subfield><subfield code="b">eng</subfield><subfield code="e">pn</subfield><subfield code="c">EBLCP</subfield><subfield code="d">YDX</subfield><subfield code="d">TEFOD</subfield><subfield code="d">MERUC</subfield><subfield code="d">IDB</subfield><subfield code="d">OCLCQ</subfield><subfield code="d">CHVBK</subfield><subfield code="d">N$T</subfield><subfield code="d">OCLCF</subfield><subfield code="d">ZCU</subfield><subfield code="d">OCLCQ</subfield><subfield code="d">K6U</subfield><subfield code="d">OCLCO</subfield><subfield code="d">OCLCQ</subfield><subfield code="d">OCLCO</subfield><subfield code="d">NZAUC</subfield><subfield code="d">OCLCQ</subfield><subfield code="d">PSYSI</subfield><subfield code="d">OCLCQ</subfield><subfield code="d">OCLCO</subfield></datafield><datafield tag="019" ind1=" " ind2=" "><subfield code="a">1051075116</subfield><subfield code="a">1079363066</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9781788997805</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">1788997808</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="z">178899289X</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="z">9781788992893</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)1051140715</subfield><subfield code="z">(OCoLC)1051075116</subfield><subfield 
code="z">(OCoLC)1079363066</subfield></datafield><datafield tag="037" ind1=" " ind2=" "><subfield code="a">1BBFB26C-0C51-4300-AD4D-8E976B58BE8E</subfield><subfield code="b">OverDrive, Inc.</subfield><subfield code="n">http://www.overdrive.com</subfield></datafield><datafield tag="050" ind1=" " ind2="4"><subfield code="a">QA276.45.R3</subfield><subfield code="b">.H636 2018</subfield></datafield><datafield tag="072" ind1=" " ind2="7"><subfield code="a">MAT</subfield><subfield code="x">003000</subfield><subfield code="2">bisacsh</subfield></datafield><datafield tag="072" ind1=" " ind2="7"><subfield code="a">MAT</subfield><subfield code="x">029000</subfield><subfield code="2">bisacsh</subfield></datafield><datafield tag="082" ind1="7" ind2=" "><subfield code="a">519.502855133</subfield><subfield code="2">23</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">MAIN</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Hodnett, Mark,</subfield><subfield code="e">author.</subfield><subfield code="0">http://id.loc.gov/authorities/names/nb2008013725</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">R deep learning essentials :</subfield><subfield code="b">a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet /</subfield><subfield code="c">Mark Hodnett, Joshua F. 
Wiley.</subfield></datafield><datafield tag="250" ind1=" " ind2=" "><subfield code="a">Second edition.</subfield></datafield><datafield tag="260" ind1=" " ind2=" "><subfield code="a">Birmingham :</subfield><subfield code="b">Packt Publishing Ltd,</subfield><subfield code="c">2018.</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">1 online resource</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">computer</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">online resource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Includes index.</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">This book demonstrates how to use deep Learning in R for machine learning, image classification, and natural language processing. It covers topics such as convolutional networks, recurrent neural networks, transfer learning and deep learning in the cloud. 
By the end of this book, you will be able to apply deep learning to real-world projects.</subfield></datafield><datafield tag="505" ind1="0" ind2=" "><subfield code="a">Cover; Title Page; Copyright and Credits; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Getting Started with Deep Learning; What is deep learning?; A conceptual overview of neural networks; Neural networks as an extension of linear regression; Neural networks as a network of memory cells; Deep neural networks; Some common myths about deep learning; Setting up your R environment; Deep learning frameworks for R; MXNet; Keras; Do I need a GPU (and what is it, anyway)?; Setting up reproducible results; Summary; Chapter 2: Training a Prediction Model; Neural networks in R.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">Building neural network modelsGenerating predictions from a neural network; The problem of overfitting data -- the consequences explained; Use case -- building and applying a neural network; Summary; Chapter 3: Deep Learning Fundamentals; Building neural networks from scratch in R; Neural network web application; Neural network code; Back to deep learning; The symbol, X, y, and ctx parameters; The num.round and begin.round parameters; The optimizer parameter; The initializer parameter; The eval.metric and eval.data parameters; The epoch.end.callback parameter; The array.batch.size parameter.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">Using regularization to overcome overfittingL1 penalty; L1 penalty in action; L2 penalty; L2 penalty in action; Weight decay (L2 penalty in neural networks); Ensembles and model-averaging; Use case -- improving out-of-sample model performance using dropout; Summary; Chapter 4: Training Deep Prediction Models; Getting started with deep feedforward neural networks; Activation functions; Introduction to the MXNet deep learning library; Deep learning layers; Building a deep 
learning model; Use case -- using MXNet for classification and regression; Data download and exploration.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">Preparing the data for our modelsThe binary classification model; The regression model; Improving the binary classification model; The unreasonable effectiveness of data; Summary; Chapter 5: Image Classification Using Convolutional Neural Networks; CNNs; Convolutional layers; Pooling layers; Dropout; Flatten layers, dense layers, and softmax; Image classification using the MXNet library; Base model (no convolutional layers); LeNet; Classification using the fashion MNIST dataset; References/further reading; Summary; Chapter 6: Tuning and Optimizing Models.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">Evaluation metrics and evaluating performanceTypes of evaluation metric; Evaluating performance; Data preparation; Different data distributions; Data partition between training, test, and validation sets; Standardization; Data leakage; Data augmentation; Using data augmentation to increase the training data; Test time augmentation; Using data augmentation in deep learning libraries; Tuning hyperparameters; Grid search; Random search; Use case-using LIME for interpretability; Model interpretability with LIME; Summary; Chapter 7: Natural Language Processing Using Deep Learning.</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Document classification.</subfield></datafield><datafield tag="588" ind1="0" ind2=" "><subfield code="a">Print version record.</subfield></datafield><datafield tag="650" ind1=" " ind2="0"><subfield code="a">R (Computer program language)</subfield><subfield code="0">http://id.loc.gov/authorities/subjects/sh2002004407</subfield></datafield><datafield tag="650" ind1=" " ind2="0"><subfield code="a">Artificial intelligence.</subfield><subfield 
code="0">http://id.loc.gov/authorities/subjects/sh85008180</subfield></datafield><datafield tag="650" ind1=" " ind2="0"><subfield code="a">Machine learning.</subfield><subfield code="0">http://id.loc.gov/authorities/subjects/sh85079324</subfield></datafield><datafield tag="650" ind1=" " ind2="0"><subfield code="a">Neural networks (Computer science)</subfield><subfield code="0">http://id.loc.gov/authorities/subjects/sh90001937</subfield></datafield><datafield tag="650" ind1=" " ind2="2"><subfield code="a">Artificial Intelligence</subfield><subfield code="0">https://id.nlm.nih.gov/mesh/D001185</subfield></datafield><datafield tag="650" ind1=" " ind2="2"><subfield code="a">Neural Networks, Computer</subfield><subfield code="0">https://id.nlm.nih.gov/mesh/D016571</subfield></datafield><datafield tag="650" ind1=" " ind2="2"><subfield code="a">Machine Learning</subfield><subfield code="0">https://id.nlm.nih.gov/mesh/D000069550</subfield></datafield><datafield tag="650" ind1=" " ind2="6"><subfield code="a">R (Langage de programmation)</subfield></datafield><datafield tag="650" ind1=" " ind2="6"><subfield code="a">Intelligence artificielle.</subfield></datafield><datafield tag="650" ind1=" " ind2="6"><subfield code="a">Apprentissage automatique.</subfield></datafield><datafield tag="650" ind1=" " ind2="6"><subfield code="a">Réseaux neuronaux (Informatique)</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">artificial intelligence.</subfield><subfield code="2">aat</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">MATHEMATICS</subfield><subfield code="x">Applied.</subfield><subfield code="2">bisacsh</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">MATHEMATICS</subfield><subfield code="x">Probability & Statistics</subfield><subfield code="x">General.</subfield><subfield code="2">bisacsh</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">Artificial 
intelligence</subfield><subfield code="2">fast</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">Machine learning</subfield><subfield code="2">fast</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">Neural networks (Computer science)</subfield><subfield code="2">fast</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">R (Computer program language)</subfield><subfield code="2">fast</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Wiley, Joshua F.,</subfield><subfield code="e">author.</subfield><subfield code="0">http://id.loc.gov/authorities/names/no2017062342</subfield></datafield><datafield tag="776" ind1="0" ind2="8"><subfield code="i">Print version:</subfield><subfield code="a">Hodnett, Mark.</subfield><subfield code="t">R Deep Learning Essentials : A Step-By-step Guide to Building Deep Learning Models Using TensorFlow, Keras, and MXNet, 2nd Edition.</subfield><subfield code="d">Birmingham : Packt Publishing Ltd, ©2018</subfield><subfield code="z">9781788992893</subfield></datafield><datafield tag="856" ind1="1" ind2=" "><subfield code="l">FWS01</subfield><subfield code="p">ZDB-4-EBA</subfield><subfield code="q">FWS_PDA_EBA</subfield><subfield code="u">https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1879523</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="856" ind1="1" ind2=" "><subfield code="l">CBO01</subfield><subfield code="p">ZDB-4-EBA</subfield><subfield code="q">FWS_PDA_EBA</subfield><subfield code="u">https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1879523</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="938" ind1=" " ind2=" "><subfield code="a">EBL - Ebook Library</subfield><subfield code="b">EBLB</subfield><subfield code="n">EBL5501083</subfield></datafield><datafield tag="938" ind1=" " ind2=" "><subfield 
code="a">EBSCOhost</subfield><subfield code="b">EBSC</subfield><subfield code="n">1879523</subfield></datafield><datafield tag="938" ind1=" " ind2=" "><subfield code="a">YBP Library Services</subfield><subfield code="b">YANK</subfield><subfield code="n">15687049</subfield></datafield><datafield tag="994" ind1=" " ind2=" "><subfield code="a">92</subfield><subfield code="b">GEBAY</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">ZDB-4-EBA</subfield></datafield></record></collection> |
id | ZDB-4-EBA-on1051140715 |
illustrated | Not Illustrated |
indexdate | 2024-10-25T15:49:55Z |
institution | BVB |
isbn | 9781788997805 1788997808 |
language | English |
oclc_num | 1051140715 |
open_access_boolean | |
owner | MAIN |
owner_facet | MAIN |
physical | 1 online resource |
psigel | ZDB-4-EBA |
publishDate | 2018 |
publishDateSearch | 2018 |
publishDateSort | 2018 |
publisher | Packt Publishing Ltd, |
record_format | marc |
spelling | Hodnett, Mark, author. http://id.loc.gov/authorities/names/nb2008013725 R deep learning essentials : a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / Mark Hodnett, Joshua F. Wiley. Second edition. Birmingham : Packt Publishing Ltd, 2018. 1 online resource text txt rdacontent computer c rdamedia online resource cr rdacarrier Includes index. This book demonstrates how to use deep Learning in R for machine learning, image classification, and natural language processing. It covers topics such as convolutional networks, recurrent neural networks, transfer learning and deep learning in the cloud. By the end of this book, you will be able to apply deep learning to real-world projects. Cover; Title Page; Copyright and Credits; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Getting Started with Deep Learning; What is deep learning?; A conceptual overview of neural networks; Neural networks as an extension of linear regression; Neural networks as a network of memory cells; Deep neural networks; Some common myths about deep learning; Setting up your R environment; Deep learning frameworks for R; MXNet; Keras; Do I need a GPU (and what is it, anyway)?; Setting up reproducible results; Summary; Chapter 2: Training a Prediction Model; Neural networks in R. Building neural network modelsGenerating predictions from a neural network; The problem of overfitting data -- the consequences explained; Use case -- building and applying a neural network; Summary; Chapter 3: Deep Learning Fundamentals; Building neural networks from scratch in R; Neural network web application; Neural network code; Back to deep learning; The symbol, X, y, and ctx parameters; The num.round and begin.round parameters; The optimizer parameter; The initializer parameter; The eval.metric and eval.data parameters; The epoch.end.callback parameter; The array.batch.size parameter. 
Using regularization to overcome overfittingL1 penalty; L1 penalty in action; L2 penalty; L2 penalty in action; Weight decay (L2 penalty in neural networks); Ensembles and model-averaging; Use case -- improving out-of-sample model performance using dropout; Summary; Chapter 4: Training Deep Prediction Models; Getting started with deep feedforward neural networks; Activation functions; Introduction to the MXNet deep learning library; Deep learning layers; Building a deep learning model; Use case -- using MXNet for classification and regression; Data download and exploration. Preparing the data for our modelsThe binary classification model; The regression model; Improving the binary classification model; The unreasonable effectiveness of data; Summary; Chapter 5: Image Classification Using Convolutional Neural Networks; CNNs; Convolutional layers; Pooling layers; Dropout; Flatten layers, dense layers, and softmax; Image classification using the MXNet library; Base model (no convolutional layers); LeNet; Classification using the fashion MNIST dataset; References/further reading; Summary; Chapter 6: Tuning and Optimizing Models. Evaluation metrics and evaluating performanceTypes of evaluation metric; Evaluating performance; Data preparation; Different data distributions; Data partition between training, test, and validation sets; Standardization; Data leakage; Data augmentation; Using data augmentation to increase the training data; Test time augmentation; Using data augmentation in deep learning libraries; Tuning hyperparameters; Grid search; Random search; Use case-using LIME for interpretability; Model interpretability with LIME; Summary; Chapter 7: Natural Language Processing Using Deep Learning. Document classification. Print version record. R (Computer program language) http://id.loc.gov/authorities/subjects/sh2002004407 Artificial intelligence. http://id.loc.gov/authorities/subjects/sh85008180 Machine learning. 
http://id.loc.gov/authorities/subjects/sh85079324 Neural networks (Computer science) http://id.loc.gov/authorities/subjects/sh90001937 Artificial Intelligence https://id.nlm.nih.gov/mesh/D001185 Neural Networks, Computer https://id.nlm.nih.gov/mesh/D016571 Machine Learning https://id.nlm.nih.gov/mesh/D000069550 R (Langage de programmation) Intelligence artificielle. Apprentissage automatique. Réseaux neuronaux (Informatique) artificial intelligence. aat MATHEMATICS Applied. bisacsh MATHEMATICS Probability & Statistics General. bisacsh Artificial intelligence fast Machine learning fast Neural networks (Computer science) fast R (Computer program language) fast Wiley, Joshua F., author. http://id.loc.gov/authorities/names/no2017062342 Print version: Hodnett, Mark. R Deep Learning Essentials : A Step-By-step Guide to Building Deep Learning Models Using TensorFlow, Keras, and MXNet, 2nd Edition. Birmingham : Packt Publishing Ltd, ©2018 9781788992893 FWS01 ZDB-4-EBA FWS_PDA_EBA https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1879523 Volltext CBO01 ZDB-4-EBA FWS_PDA_EBA https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1879523 Volltext |
spellingShingle | Hodnett, Mark Wiley, Joshua F. R deep learning essentials : a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / Cover; Title Page; Copyright and Credits; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Getting Started with Deep Learning; What is deep learning?; A conceptual overview of neural networks; Neural networks as an extension of linear regression; Neural networks as a network of memory cells; Deep neural networks; Some common myths about deep learning; Setting up your R environment; Deep learning frameworks for R; MXNet; Keras; Do I need a GPU (and what is it, anyway)?; Setting up reproducible results; Summary; Chapter 2: Training a Prediction Model; Neural networks in R. Building neural network modelsGenerating predictions from a neural network; The problem of overfitting data -- the consequences explained; Use case -- building and applying a neural network; Summary; Chapter 3: Deep Learning Fundamentals; Building neural networks from scratch in R; Neural network web application; Neural network code; Back to deep learning; The symbol, X, y, and ctx parameters; The num.round and begin.round parameters; The optimizer parameter; The initializer parameter; The eval.metric and eval.data parameters; The epoch.end.callback parameter; The array.batch.size parameter. Using regularization to overcome overfittingL1 penalty; L1 penalty in action; L2 penalty; L2 penalty in action; Weight decay (L2 penalty in neural networks); Ensembles and model-averaging; Use case -- improving out-of-sample model performance using dropout; Summary; Chapter 4: Training Deep Prediction Models; Getting started with deep feedforward neural networks; Activation functions; Introduction to the MXNet deep learning library; Deep learning layers; Building a deep learning model; Use case -- using MXNet for classification and regression; Data download and exploration. 
Preparing the data for our models; The binary classification model; The regression model; Improving the binary classification model; The unreasonable effectiveness of data; Summary; Chapter 5: Image Classification Using Convolutional Neural Networks; CNNs; Convolutional layers; Pooling layers; Dropout; Flatten layers, dense layers, and softmax; Image classification using the MXNet library; Base model (no convolutional layers); LeNet; Classification using the fashion MNIST dataset; References/further reading; Summary; Chapter 6: Tuning and Optimizing Models; Evaluation metrics and evaluating performance; Types of evaluation metric; Evaluating performance; Data preparation; Different data distributions; Data partition between training, test, and validation sets; Standardization; Data leakage; Data augmentation; Using data augmentation to increase the training data; Test time augmentation; Using data augmentation in deep learning libraries; Tuning hyperparameters; Grid search; Random search; Use case -- using LIME for interpretability; Model interpretability with LIME; Summary; Chapter 7: Natural Language Processing Using Deep Learning. R (Computer program language) http://id.loc.gov/authorities/subjects/sh2002004407 Artificial intelligence. http://id.loc.gov/authorities/subjects/sh85008180 Machine learning. http://id.loc.gov/authorities/subjects/sh85079324 Neural networks (Computer science) http://id.loc.gov/authorities/subjects/sh90001937 Artificial Intelligence https://id.nlm.nih.gov/mesh/D001185 Neural Networks, Computer https://id.nlm.nih.gov/mesh/D016571 Machine Learning https://id.nlm.nih.gov/mesh/D000069550 R (Langage de programmation) Intelligence artificielle. Apprentissage automatique. Réseaux neuronaux (Informatique) artificial intelligence. aat MATHEMATICS Applied. bisacsh MATHEMATICS Probability & Statistics General. bisacsh Artificial intelligence fast Machine learning fast Neural networks (Computer science) fast R (Computer program language) fast |
subject_GND | http://id.loc.gov/authorities/subjects/sh2002004407 http://id.loc.gov/authorities/subjects/sh85008180 http://id.loc.gov/authorities/subjects/sh85079324 http://id.loc.gov/authorities/subjects/sh90001937 https://id.nlm.nih.gov/mesh/D001185 https://id.nlm.nih.gov/mesh/D016571 https://id.nlm.nih.gov/mesh/D000069550 |
title | R deep learning essentials : a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / |
title_auth | R deep learning essentials : a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / |
title_exact_search | R deep learning essentials : a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / |
title_full | R deep learning essentials : a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / Mark Hodnett, Joshua F. Wiley. |
title_fullStr | R deep learning essentials : a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / Mark Hodnett, Joshua F. Wiley. |
title_full_unstemmed | R deep learning essentials : a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / Mark Hodnett, Joshua F. Wiley. |
title_short | R deep learning essentials : |
title_sort | r deep learning essentials a step by step guide to building deep learning models using tensorflow keras and mxnet |
title_sub | a step-by-step guide to building deep learning models using TensorFlow, Keras, and MXNet / |
topic | R (Computer program language) http://id.loc.gov/authorities/subjects/sh2002004407 Artificial intelligence. http://id.loc.gov/authorities/subjects/sh85008180 Machine learning. http://id.loc.gov/authorities/subjects/sh85079324 Neural networks (Computer science) http://id.loc.gov/authorities/subjects/sh90001937 Artificial Intelligence https://id.nlm.nih.gov/mesh/D001185 Neural Networks, Computer https://id.nlm.nih.gov/mesh/D016571 Machine Learning https://id.nlm.nih.gov/mesh/D000069550 R (Langage de programmation) Intelligence artificielle. Apprentissage automatique. Réseaux neuronaux (Informatique) artificial intelligence. aat MATHEMATICS Applied. bisacsh MATHEMATICS Probability & Statistics General. bisacsh Artificial intelligence fast Machine learning fast Neural networks (Computer science) fast R (Computer program language) fast |
topic_facet | R (Computer program language) Artificial intelligence. Machine learning. Neural networks (Computer science) Artificial Intelligence Neural Networks, Computer Machine Learning R (Langage de programmation) Intelligence artificielle. Apprentissage automatique. Réseaux neuronaux (Informatique) artificial intelligence. MATHEMATICS Applied. MATHEMATICS Probability & Statistics General. Artificial intelligence Machine learning |
url | https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1879523 |
work_keys_str_mv | AT hodnettmark rdeeplearningessentialsastepbystepguidetobuildingdeeplearningmodelsusingtensorflowkerasandmxnet AT wileyjoshuaf rdeeplearningessentialsastepbystepguidetobuildingdeeplearningmodelsusingtensorflowkerasandmxnet |