Deep learning for coders with fastai and PyTorch: AI applications without a PhD
Saved in:
Main authors: | Howard, Jeremy ; Gugger, Sylvain |
---|---|
Format: | Book |
Language: | English |
Published: | Beijing ; Boston ; Farnham ; Sebastopol ; Tokyo : O'Reilly, 2021-11-05 |
Edition: | First edition, fifth release |
Subjects: | Deep learning ; Python (programming language) |
Online access: | Table of contents |
Note: | "Revision" of the 2020 "First Edition"; changes/"release details" are documented at: https://www.oreilly.com/catalog/errata.csp?isbn=9781492045526 |
Physical description: | xxiv, 594 pages : illustrations |
ISBN: | 9781492045526 ; 1492045527 |
Internal format
MARC
LEADER | 00000nam a2200000 c 4500 | ||
---|---|---|---|
001 | BV048235306 | ||
003 | DE-604 | ||
005 | 20220807 | ||
007 | t | ||
008 | 220519s2021 a||| |||| 00||| eng d | ||
020 | |a 9781492045526 |9 978-1-492-04552-6 | ||
020 | |a 1492045527 |9 1-492-04552-7 | ||
035 | |a (DE-599)BVBBV048235306 | ||
040 | |a DE-604 |b ger |e rda | ||
041 | 0 | |a eng | |
049 | |a DE-473 |a DE-706 | ||
084 | |a ST 250 |0 (DE-625)143626: |2 rvk | ||
084 | |a ST 300 |0 (DE-625)143650: |2 rvk | ||
084 | |a ST 302 |0 (DE-625)143652: |2 rvk | ||
084 | |a DAT 708 |2 stub | ||
084 | |a DAT 366 |2 stub | ||
100 | 1 | |a Howard, Jeremy |e Verfasser |0 (DE-588)1220638293 |4 aut | |
245 | 1 | 0 | |a Deep learning for coders with fastai and PyTorch |b AI applications without a PhD |c Jeremy Howard and Sylvain Gugger |
250 | |a First edition, fifth release | ||
264 | 1 | |a Beijing ; Boston ; Farnham ; Sebastopol ; Tokyo |b O'Reilly |c 2021-11-05 | |
264 | 4 | |c © 2020 | |
300 | |a xxiv, 594 Seiten |b Illustrationen | ||
336 | |b txt |2 rdacontent | ||
337 | |b n |2 rdamedia | ||
338 | |b nc |2 rdacarrier | ||
500 | |a "Revision" der "First Edition" von 2020; Änderungen/"Release details" nachgewiesen unter: https://www.oreilly.com/catalog/errata.csp?isbn=9781492045526 | ||
650 | 0 | 7 | |a Deep learning |0 (DE-588)1135597375 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Python |g Programmiersprache |0 (DE-588)4434275-5 |2 gnd |9 rswk-swf |
689 | 0 | 0 | |a Deep learning |0 (DE-588)1135597375 |D s |
689 | 0 | 1 | |a Python |g Programmiersprache |0 (DE-588)4434275-5 |D s |
689 | 0 | |5 DE-604 | |
700 | 1 | |a Gugger, Sylvain |e Verfasser |0 (DE-588)1220638439 |4 aut | |
856 | 4 | 2 | |m Digitalisierung UB Bamberg - ADAM Catalogue Enrichment |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=033615929&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis |
999 | |a oai:aleph.bib-bvb.de:BVB01-033615929 |
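The same bibliographic data shown in the MARC table above is also serialized as MARCXML in the `fullrecord` index field below. As a minimal sketch of how such a record can be read programmatically, the following uses only Python's standard library; the `MARCXML` string is an illustrative excerpt modeled on this record, not the full serialization, and the helper `subfields` is a hypothetical convenience function, not part of any MARC library.

```python
import xml.etree.ElementTree as ET

# Illustrative MARCXML fragment modeled on the record above (excerpt only).
MARCXML = """<record xmlns="http://www.loc.gov/MARC21/slim">
  <datafield tag="245" ind1="1" ind2="0">
    <subfield code="a">Deep learning for coders with fastai and PyTorch</subfield>
    <subfield code="b">AI applications without a PhD</subfield>
  </datafield>
  <datafield tag="020" ind1=" " ind2=" ">
    <subfield code="a">9781492045526</subfield>
  </datafield>
</record>"""

NS = {"m": "http://www.loc.gov/MARC21/slim"}

def subfields(root, tag, code):
    """Return all values of a given subfield code within datafields of a MARC tag."""
    return [
        sf.text
        for df in root.findall(f"m:datafield[@tag='{tag}']", NS)
        for sf in df.findall(f"m:subfield[@code='{code}']", NS)
    ]

root = ET.fromstring(MARCXML)
title = subfields(root, "245", "a")[0]  # main title (field 245, subfield a)
isbn = subfields(root, "020", "a")[0]   # ISBN (field 020, subfield a)
print(title)  # Deep learning for coders with fastai and PyTorch
print(isbn)   # 9781492045526
```

For production use, a dedicated MARC library would also handle the leader, control fields, and indicator semantics, which this sketch ignores.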
Record in the search index
_version_ | 1804184020989247488 |
---|---|
adam_text | Table of Contents

Preface  xix
Foreword  xxiii

Part I. Deep Learning in Practice

1. Your Deep Learning Journey  3
    Deep Learning Is for Everyone  3
    Neural Networks: A Brief History  5
    Who We Are  7
    How to Learn Deep Learning  9
    Your Projects and Your Mindset  11
    The Software: PyTorch, fastai, and Jupyter (And Why It Doesn't Matter)  12
    Your First Model  13
    Getting a GPU Deep Learning Server  14
    Running Your First Notebook  15
    What Is Machine Learning?  20
    What Is a Neural Network?  23
    A Bit of Deep Learning Jargon  24
    Limitations Inherent to Machine Learning  25
    How Our Image Recognizer Works  26
    What Our Image Recognizer Learned  33
    Image Recognizers Can Tackle Non-Image Tasks  36
    Jargon Recap  40
    Deep Learning Is Not Just for Image Classification  41
    Validation Sets and Test Sets  48
    Use Judgment in Defining Test Sets  50
    A Choose Your Own Adventure Moment  54
    Questionnaire  54
    Further Research  56

2. From Model to Production  57
    The Practice of Deep Learning  57
    Starting Your Project  58
    The State of Deep Learning  60
    The Drivetrain Approach  63
    Gathering Data  65
    From Data to DataLoaders  70
    Data Augmentation  74
    Training Your Model, and Using It to Clean Your Data  75
    Turning Your Model into an Online Application  78
    Using the Model for Inference  78
    Creating a Notebook App from the Model  80
    Turning Your Notebook into a Real App  82
    Deploying Your App  83
    How to Avoid Disaster  86
    Unforeseen Consequences and Feedback Loops  89
    Get Writing!  90
    Questionnaire  91
    Further Research  92

3. Data Ethics  93
    Key Examples for Data Ethics  94
    Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits  95
    Feedback Loops: YouTube's Recommendation System  95
    Bias: Professor Latanya Sweeney "Arrested"  95
    Why Does This Matter?  96
    Integrating Machine Learning with Product Design  99
    Topics in Data Ethics  101
    Recourse and Accountability  101
    Feedback Loops  102
    Bias  105
    Disinformation  116
    Identifying and Addressing Ethical Issues  118
    Analyze a Project You Are Working On  118
    Processes to Implement  119
    The Power of Diversity  121
    Fairness, Accountability, and Transparency  122
    Role of Policy  123
    The Effectiveness of Regulation  124
    Rights and Policy  125
    Cars: A Historical Precedent  125
    Conclusion  126
    Questionnaire  127
    Further Research  128
    Deep Learning in Practice: That's a Wrap!  128

Part II. Understanding fastai's Applications

4. Under the Hood: Training a Digit Classifier  133
    Pixels: The Foundations of Computer Vision  133
    First Try: Pixel Similarity  137
    NumPy Arrays and PyTorch Tensors  143
    Computing Metrics Using Broadcasting  145
    Stochastic Gradient Descent  149
    Calculating Gradients  153
    Stepping with a Learning Rate  156
    An End-to-End SGD Example  157
    Summarizing Gradient Descent  162
    The MNIST Loss Function  163
    Sigmoid  168
    SGD and Mini-Batches  170
    Putting It All Together  171
    Creating an Optimizer  174
    Adding a Nonlinearity  176
    Going Deeper  180
    Jargon Recap  181
    Questionnaire  182
    Further Research  184

5. Image Classification  185
    From Dogs and Cats to Pet Breeds  186
    Presizing  189
    Checking and Debugging a DataBlock  191
    Cross-Entropy Loss  194
    Viewing Activations and Labels  194
    Softmax  195
    Log Likelihood  198
    Taking the log  200
    Model Interpretation  203
    Improving Our Model  205
    The Learning Rate Finder  205
    Unfreezing and Transfer Learning  207
    Discriminative Learning Rates  210
    Selecting the Number of Epochs  212
    Deeper Architectures  213
    Conclusion  215
    Questionnaire  216
    Further Research  217

6. Other Computer Vision Problems  219
    Multi-Label Classification  219
    The Data  220
    Constructing a DataBlock  222
    Binary Cross Entropy  226
    Regression  231
    Assembling the Data  232
    Training a Model  235
    Conclusion  237
    Questionnaire  238
    Further Research  238

7. Training a State-of-the-Art Model  239
    Imagenette  239
    Normalization  241
    Progressive Resizing  243
    Test Time Augmentation  245
    Mixup  246
    Label Smoothing  249
    Conclusion  251
    Questionnaire  251
    Further Research  252

8. Collaborative Filtering Deep Dive  253
    A First Look at the Data  254
    Learning the Latent Factors  256
    Creating the DataLoaders  257
    Collaborative Filtering from Scratch  260
    Weight Decay  264
    Creating Our Own Embedding Module  265
    Interpreting Embeddings and Biases  267
    Using fastai.collab  269
    Embedding Distance  270
    Bootstrapping a Collaborative Filtering Model  270
    Deep Learning for Collaborative Filtering  272
    Conclusion  274
    Questionnaire  274
    Further Research  276

9. Tabular Modeling Deep Dive  277
    Categorical Embeddings  277
    Beyond Deep Learning  282
    The Dataset  284
    Kaggle Competitions  284
    Look at the Data  285
    Decision Trees  287
    Handling Dates  289
    Using TabularPandas and TabularProc  290
    Creating the Decision Tree  292
    Categorical Variables  297
    Random Forests  298
    Creating a Random Forest  299
    Out-of-Bag Error  301
    Model Interpretation  302
    Tree Variance for Prediction Confidence  302
    Feature Importance  303
    Removing Low-Importance Variables  305
    Removing Redundant Features  306
    Partial Dependence  308
    Data Leakage  311
    Tree Interpreter  312
    Extrapolation and Neural Networks  314
    The Extrapolation Problem  315
    Finding Out-of-Domain Data  316
    Using a Neural Network  318
    Ensembling  322
    Boosting  323
    Combining Embeddings with Other Methods  324
    Conclusion  325
    Questionnaire  326
    Further Research  327

10. NLP Deep Dive: RNNs  329
    Text Preprocessing  331
    Tokenization  332
    Word Tokenization with fastai  333
    Subword Tokenization  336
    Numericalization with fastai  338
    Putting Our Texts into Batches for a Language Model  339
    Training a Text Classifier  342
    Language Model Using DataBlock  342
    Fine-Tuning the Language Model  343
    Saving and Loading Models  345
    Text Generation  346
    Creating the Classifier DataLoaders  346
    Fine-Tuning the Classifier  349
    Disinformation and Language Models  350
    Conclusion  352
    Questionnaire  353
    Further Research  354

11. Data Munging with fastai's Mid-Level API  355
    Going Deeper into fastai's Layered API  355
    Transforms  356
    Writing Your Own Transform  358
    Pipeline  359
    TfmdLists and Datasets: Transformed Collections  359
    TfmdLists  360
    Datasets  362
    Applying the Mid-Level Data API: SiamesePair  364
    Conclusion  368
    Questionnaire  368
    Further Research  369
    Understanding fastai's Applications: Wrap Up  369

Part III. Foundations of Deep Learning

12. A Language Model from Scratch  373
    The Data  373
    Our First Language Model from Scratch  375
    Our Language Model in PyTorch  376
    Our First Recurrent Neural Network  379
    Improving the RNN  381
    Maintaining the State of an RNN  381
    Creating More Signal  384
    Multilayer RNNs  386
    The Model  388
    Exploding or Disappearing Activations  389
    LSTM  390
    Building an LSTM from Scratch  390
    Training a Language Model Using LSTMs  393
    Regularizing an LSTM  394
    Dropout  395
    Activation Regularization and Temporal Activation Regularization  397
    Training a Weight-Tied Regularized LSTM  398
    Conclusion  399
    Questionnaire  400
    Further Research  402

13. Convolutional Neural Networks  403
    The Magic of Convolutions  403
    Mapping a Convolutional Kernel  407
    Convolutions in PyTorch  408
    Strides and Padding  411
    Understanding the Convolution Equations  412
    Our First Convolutional Neural Network  414
    Creating the CNN  415
    Understanding Convolution Arithmetic  418
    Receptive Fields  419
    A Note About Twitter  421
    Color Images  423
    Improving Training Stability  426
    A Simple Baseline  427
    Increase Batch Size  429
    1cycle Training  430
    Batch Normalization  435
    Conclusion  438
    Questionnaire  439
    Further Research  440

14. ResNets  441
    Going Back to Imagenette  441
    Building a Modern CNN: ResNet  445
    Skip Connections  445
    A State-of-the-Art ResNet  451
    Bottleneck Layers  454
    Conclusion  456
    Questionnaire  456
    Further Research  457

15. Application Architectures Deep Dive  459
    Computer Vision  459
    cnn_learner  459
    unet_learner  461
    A Siamese Network  463
    Natural Language Processing  465
    Tabular  466
    Conclusion  467
    Questionnaire  469
    Further Research  470

16. The Training Process  471
    Establishing a Baseline  471
    A Generic Optimizer  473
    Momentum  474
    RMSProp  477
    Adam  479
    Decoupled Weight Decay  480
    Callbacks  480
    Creating a Callback  483
    Callback Ordering and Exceptions  487
    Conclusion  488
    Questionnaire  489
    Further Research  490
    Foundations of Deep Learning: Wrap Up  490

Part IV. Deep Learning from Scratch

17. A Neural Net from the Foundations  493
    Building a Neural Net Layer from Scratch  493
    Modeling a Neuron  493
    Matrix Multiplication from Scratch  495
    Elementwise Arithmetic  496
    Broadcasting  497
    Einstein Summation  502
    The Forward and Backward Passes  503
    Defining and Initializing a Layer  503
    Gradients and the Backward Pass  508
    Refactoring the Model  511
    Going to PyTorch  512
    Conclusion  515
    Questionnaire  515
    Further Research  517

18. CNN Interpretation with CAM  519
    CAM and Hooks  519
    Gradient CAM  522
    Conclusion  525
    Questionnaire  525
    Further Research  525

19. A fastai Learner from Scratch  527
    Data  527
    Dataset  529
    Module and Parameter  531
    Simple CNN  534
    Loss  536
    Learner  537
    Callbacks  539
    Scheduling the Learning Rate  540
    Conclusion  542
    Questionnaire  542
    Further Research  544

20. Concluding Thoughts  545

A. Creating a Blog  549
B. Data Project Checklist  559

Index  567
|
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | Howard, Jeremy Gugger, Sylvain |
author_GND | (DE-588)1220638293 (DE-588)1220638439 |
author_facet | Howard, Jeremy Gugger, Sylvain |
author_role | aut aut |
author_sort | Howard, Jeremy |
author_variant | j h jh s g sg |
building | Verbundindex |
bvnumber | BV048235306 |
classification_rvk | ST 250 ST 300 ST 302 |
classification_tum | DAT 708 DAT 366 |
ctrlnum | (DE-599)BVBBV048235306 |
discipline | Informatik |
discipline_str_mv | Informatik |
edition | First edition, fifth release |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01918nam a2200433 c 4500</leader><controlfield tag="001">BV048235306</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20220807 </controlfield><controlfield tag="007">t</controlfield><controlfield tag="008">220519s2021 a||| |||| 00||| eng d</controlfield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9781492045526</subfield><subfield code="9">978-1-492-04552-6</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">1492045527</subfield><subfield code="9">1-492-04552-7</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)BVBBV048235306</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">rda</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-473</subfield><subfield code="a">DE-706</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 250</subfield><subfield code="0">(DE-625)143626:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 300</subfield><subfield code="0">(DE-625)143650:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 302</subfield><subfield code="0">(DE-625)143652:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">DAT 708</subfield><subfield code="2">stub</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">DAT 366</subfield><subfield code="2">stub</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Howard, Jeremy</subfield><subfield 
code="e">Verfasser</subfield><subfield code="0">(DE-588)1220638293</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Deep learning for coders with fastai and PyTorch</subfield><subfield code="b">AI applications without a PhD</subfield><subfield code="c">Jeremy Howard and Sylvain Gugger</subfield></datafield><datafield tag="250" ind1=" " ind2=" "><subfield code="a">First edition, fifth release</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Beijing ; Boston ; Farnham ; Sebastopol ; Tokyo</subfield><subfield code="b">O'Reilly</subfield><subfield code="c">2021-11-05</subfield></datafield><datafield tag="264" ind1=" " ind2="4"><subfield code="c">© 2020</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">xxiv, 594 Seiten</subfield><subfield code="b">Illustrationen</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">"Revision" der "First Edition" von 2020; Änderungen/"Release details" nagewiesen unter: https://www.oreilly.com/catalog/errata.csp?isbn=9781492045526</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Deep learning</subfield><subfield code="0">(DE-588)1135597375</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Python</subfield><subfield code="g">Programmiersprache</subfield><subfield code="0">(DE-588)4434275-5</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="689" 
ind1="0" ind2="0"><subfield code="a">Deep learning</subfield><subfield code="0">(DE-588)1135597375</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="1"><subfield code="a">Python</subfield><subfield code="g">Programmiersprache</subfield><subfield code="0">(DE-588)4434275-5</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Gugger, Sylvain</subfield><subfield code="e">Verfasser</subfield><subfield code="0">(DE-588)1220638439</subfield><subfield code="4">aut</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">Digitalisierung UB Bamberg - ADAM Catalogue Enrichment</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=033615929&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="999" ind1=" " ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-033615929</subfield></datafield></record></collection> |
id | DE-604.BV048235306 |
illustrated | Illustrated |
index_date | 2024-07-03T19:52:20Z |
indexdate | 2024-07-10T09:32:42Z |
institution | BVB |
isbn | 9781492045526 1492045527 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-033615929 |
open_access_boolean | |
owner | DE-473 DE-BY-UBG DE-706 |
owner_facet | DE-473 DE-BY-UBG DE-706 |
physical | xxiv, 594 Seiten Illustrationen |
publishDate | 2021 |
publishDateSearch | 2021 |
publishDateSort | 2021 |
publisher | O'Reilly |
record_format | marc |
spelling | Howard, Jeremy Verfasser (DE-588)1220638293 aut Deep learning for coders with fastai and PyTorch AI applications without a PhD Jeremy Howard and Sylvain Gugger First edition, fifth release Beijing ; Boston ; Farnham ; Sebastopol ; Tokyo O'Reilly 2021-11-05 © 2020 xxiv, 594 Seiten Illustrationen txt rdacontent n rdamedia nc rdacarrier "Revision" der "First Edition" von 2020; Änderungen/"Release details" nachgewiesen unter: https://www.oreilly.com/catalog/errata.csp?isbn=9781492045526 Deep learning (DE-588)1135597375 gnd rswk-swf Python Programmiersprache (DE-588)4434275-5 gnd rswk-swf Deep learning (DE-588)1135597375 s Python Programmiersprache (DE-588)4434275-5 s DE-604 Gugger, Sylvain Verfasser (DE-588)1220638439 aut Digitalisierung UB Bamberg - ADAM Catalogue Enrichment application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=033615929&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Howard, Jeremy Gugger, Sylvain Deep learning for coders with fastai and PyTorch AI applications without a PhD Deep learning (DE-588)1135597375 gnd Python Programmiersprache (DE-588)4434275-5 gnd |
subject_GND | (DE-588)1135597375 (DE-588)4434275-5 |
title | Deep learning for coders with fastai and PyTorch AI applications without a PhD |
title_auth | Deep learning for coders with fastai and PyTorch AI applications without a PhD |
title_exact_search | Deep learning for coders with fastai and PyTorch AI applications without a PhD |
title_exact_search_txtP | Deep learning for coders with fastai and PyTorch AI applications without a PhD |
title_full | Deep learning for coders with fastai and PyTorch AI applications without a PhD Jeremy Howard and Sylvain Gugger |
title_fullStr | Deep learning for coders with fastai and PyTorch AI applications without a PhD Jeremy Howard and Sylvain Gugger |
title_full_unstemmed | Deep learning for coders with fastai and PyTorch AI applications without a PhD Jeremy Howard and Sylvain Gugger |
title_short | Deep learning for coders with fastai and PyTorch |
title_sort | deep learning for coders with fastai and pytorch ai applications without a phd |
title_sub | AI applications without a PhD |
topic | Deep learning (DE-588)1135597375 gnd Python Programmiersprache (DE-588)4434275-5 gnd |
topic_facet | Deep learning Python Programmiersprache |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=033615929&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT howardjeremy deeplearningforcoderswithfastaiandpytorchaiapplicationswithoutaphd AT guggersylvain deeplearningforcoderswithfastaiandpytorchaiapplicationswithoutaphd |