Deep learning with Python
Saved in:
Author: | Chollet, François |
---|---|
Format: | Book |
Language: | English |
Published: | Shelter Island, NY : Manning, [2021] |
Edition: | Second edition |
Subjects: | Keras (Framework, Informatik); Python (Programmiersprache); Deep learning; Maschinelles Lernen |
Online access: | Table of contents |
Description: | xxiv, 478 pages, illustrations, diagrams |
ISBN: | 9781617296864 |
Internal format
MARC
Tag | Ind1 | Ind2 | Content
---|---|---|---
LEADER | | | 00000nam a2200000zc 4500
001 | | | BV047817293
003 | | | DE-604
005 | | | 20241009
007 | | | t
008 | | | 220207s2021 a||| |||| 00||| eng d
020 | | | |a 9781617296864 |c pbk. |9 978-1-61729-686-4
035 | | | |a (OCoLC)1309935476
035 | | | |a (DE-599)BVBBV047817293
040 | | | |a DE-604 |b ger |e rda
041 | 0 | | |a eng
049 | | | |a DE-11 |a DE-706 |a DE-859 |a DE-M347 |a DE-473 |a DE-1050 |a DE-83 |a DE-355 |a DE-Aug4 |a DE-573 |a DE-1102 |a DE-1051 |a DE-739 |a DE-898 |a DE-703
084 | | | |a ST 250 |0 (DE-625)143626: |2 rvk
084 | | | |a ST 300 |0 (DE-625)143650: |2 rvk
084 | | | |a ST 302 |0 (DE-625)143652: |2 rvk
084 | | | |a ST 301 |0 (DE-625)143651: |2 rvk
084 | | | |a DAT 708f |2 stub
084 | | | |a 68T05 |2 msc
100 | 1 | | |a Chollet, François |e Verfasser |0 (DE-588)1151332550 |4 aut
245 | 1 | 0 | |a Deep learning with Python |c François Chollet
250 | | | |a Second edition
264 | | 1 | |a Shelter Island, NY |b Manning |c [2021]
264 | | 4 | |c © 2021
300 | | | |a xxiv, 478 Seiten |b Illustrationen, Diagramme
336 | | | |b txt |2 rdacontent
337 | | | |b n |2 rdamedia
338 | | | |b nc |2 rdacarrier
650 | 0 | 7 | |a Keras |g Framework, Informatik |0 (DE-588)1160521077 |2 gnd |9 rswk-swf
650 | 0 | 7 | |a Python |g Programmiersprache |0 (DE-588)4434275-5 |2 gnd |9 rswk-swf
650 | 0 | 7 | |a Deep learning |0 (DE-588)1135597375 |2 gnd |9 rswk-swf
650 | 0 | 7 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |2 gnd |9 rswk-swf
689 | 0 | 0 | |a Keras |g Framework, Informatik |0 (DE-588)1160521077 |D s
689 | 0 | 1 | |a Deep learning |0 (DE-588)1135597375 |D s
689 | 0 | 2 | |a Python |g Programmiersprache |0 (DE-588)4434275-5 |D s
689 | 0 | | |5 DE-604
689 | 1 | 0 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |D s
689 | 1 | 1 | |a Python |g Programmiersprache |0 (DE-588)4434275-5 |D s
689 | 1 | | |5 DE-604
776 | 0 | 8 | |i Erscheint auch als |n Online-Ausgabe |z 978-1-63835-009-5
856 | 4 | 2 | |m Digitalisierung UB Bamberg - ADAM Catalogue Enrichment |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=033200672&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis
943 | 1 | | |a oai:aleph.bib-bvb.de:BVB01-033200672
Record in the search index
_version_ | 1814155231819202560 |
---|---|
adam_text |
contents

preface xvii
acknowledgments xix
about this book xx
about the author xxiii
about the cover illustration xxiv

1 What is deep learning? 1
1.1 Artificial intelligence, machine learning, and deep learning 2
    Artificial intelligence 2 · Machine learning 3 · Learning rules and representations from data 4 · The “deep” in “deep learning” 7 · Understanding how deep learning works, in three figures 8 · What deep learning has achieved so far 10 · Don’t believe the short-term hype 11 · The promise of AI 12
1.2 Before deep learning: A brief history of machine learning 13
    Probabilistic modeling 13 · Early neural networks 14 · Kernel methods 14 · Decision trees, random forests, and gradient boosting machines 15 · Back to neural networks 16 · What makes deep learning different 17 · The modern machine learning landscape 18
1.3 Why deep learning? Why now? 20
    Hardware 20 · Data 21 · Algorithms 22 · A new wave of investment 23 · The democratization of deep learning 24 · Will it last? 24

2 The mathematical building blocks of neural networks 26
2.1 A first look at a neural network 27
2.2 Data representations for neural networks 31
    Scalars (rank-0 tensors) 31 · Vectors (rank-1 tensors) 31 · Matrices (rank-2 tensors) 32 · Rank-3 and higher-rank tensors 32 · Key attributes 32 · Manipulating tensors in NumPy 34 · The notion of data batches 35 · Real-world examples of data tensors 35 · Vector data 35 · Timeseries data or sequence data 36 · Image data 37 · Video data 37
2.3 The gears of neural networks: Tensor operations 38
    Element-wise operations 38 · Broadcasting 40 · Tensor product 41 · Tensor reshaping 43 · Geometric interpretation of tensor operations 44 · A geometric interpretation of deep learning 47
2.4 The engine of neural networks: Gradient-based optimization 48
    What’s a derivative? 49 · Derivative of a tensor operation: The gradient 51 · Stochastic gradient descent 52 · Chaining derivatives: The Backpropagation algorithm 55
2.5 Looking back at our first example 61
    Reimplementing our first example from scratch in TensorFlow 63 · Running one training step 64 · The full training loop 65 · Evaluating the model 66

3 Introduction to Keras and TensorFlow 68
3.1 What’s TensorFlow? 69
3.2 What’s Keras? 69
3.3 Keras and TensorFlow: A brief history 71
3.4 Setting up a deep learning workspace 71
    Jupyter notebooks: The preferred way to run deep learning experiments 72 · Using Colaboratory 73
3.5 First steps with TensorFlow 75
    Constant tensors and variables 76 · Tensor operations: Doing math in TensorFlow 78 · A second look at the GradientTape API 78 · An end-to-end example: A linear classifier in pure TensorFlow 79
3.6 Anatomy of a neural network: Understanding core Keras APIs 84
    Layers: The building blocks of deep learning 84 · From layers to models 87 · The “compile” step: Configuring the learning process 88 · Picking a loss function 90 · Understanding the fit() method 91 · Monitoring loss and metrics on validation data 91 · Inference: Using a model after training 93

4 Getting started with neural networks: Classification and regression 95
4.1 Classifying movie reviews: A binary classification example 97
    The IMDB dataset 97 · Preparing the data 98 · Building your model 99 · Validating your approach 102 · Using a trained model to generate predictions on new data 105 · Further experiments 105 · Wrapping up 106
4.2 Classifying newswires: A multiclass classification example 106
    The Reuters dataset 106 · Preparing the data 107 · Building your model 108 · Validating your approach 109 · Generating predictions on new data 111 · A different way to handle the labels and the loss 112 · The importance of having sufficiently large intermediate layers 112 · Further experiments 113 · Wrapping up 113
4.3 Predicting house prices: A regression example 113
    The Boston housing price dataset 114 · Preparing the data 114 · Building your model 115 · Validating your approach using K-fold validation 115 · Generating predictions on new data 119 · Wrapping up 119

5 Fundamentals of machine learning 121
5.1 Generalization: The goal of machine learning 121
    Underfitting and overfitting 122 · The nature of generalization in deep learning 127
5.2 Evaluating machine learning models 133
    Training, validation, and test sets 133 · Beating a common-sense baseline 136 · Things to keep in mind about model evaluation 137
5.3 Improving model fit 138
    Tuning key gradient descent parameters 138 · Leveraging better architecture priors 139 · Increasing model capacity 140
5.4 Improving generalization 142
    Dataset curation 142 · Feature engineering 143 · Using early stopping 144 · Regularizing your model 145

6 The universal workflow of machine learning 153
6.1 Define the task 155
    Frame the problem 155 · Collect a dataset 156 · Understand your data 160 · Choose a measure of success 160
6.2 Develop a model 161
    Prepare the data 161 · Choose an evaluation protocol 162 · Beat a baseline 163 · Scale up: Develop a model that overfits 164 · Regularize and tune your model 165
6.3 Deploy the model 165
    Explain your work to stakeholders and set expectations 165 · Ship an inference model 166 · Monitor your model in the wild 169 · Maintain your model 170

7 Working with Keras: A deep dive 172
7.1 A spectrum of workflows 173
7.2 Different ways to build Keras models 173
    The Sequential model 174 · The Functional API 176 · Subclassing the Model class 182 · Mixing and matching different components 184 · Remember: Use the right tool for the job 185
7.3 Using built-in training and evaluation loops 185
    Writing your own metrics 186 · Using callbacks 187 · Writing your own callbacks 189 · Monitoring and visualization with TensorBoard 190
7.4 Writing your own training and evaluation loops 192
    Training versus inference 194 · Low-level usage of metrics 195 · A complete training and evaluation loop 195 · Make it fast with tf.function 197 · Leveraging fit() with a custom training loop 198

8 Introduction to deep learning for computer vision 201
8.1 Introduction to convnets 202
    The convolution operation 204 · The max-pooling operation 209
8.2 Training a convnet from scratch on a small dataset 211
    The relevance of deep learning for small-data problems 212 · Downloading the data 212 · Building the model 215 · Data preprocessing 217 · Using data augmentation 221
8.3 Leveraging a pretrained model 224
    Feature extraction with a pretrained model 225 · Fine-tuning a pretrained model 234

9 Advanced deep learning for computer vision 238
9.1 Three essential computer vision tasks 238
9.2 An image segmentation example 240
9.3 Modern convnet architecture patterns 248
    Modularity, hierarchy, and reuse 249 · Residual connections 251 · Batch normalization 255 · Depthwise separable convolutions 257 · Putting it together: A mini Xception-like model 259
9.4 Interpreting what convnets learn 261
    Visualizing intermediate activations 262 · Visualizing convnet filters 268 · Visualizing heatmaps of class activation 273

10 Deep learning for timeseries 280
10.1 Different kinds of timeseries tasks 280
10.2 A temperature-forecasting example 281
    Preparing the data 285 · A common-sense, non-machine-learning baseline 288 · Let’s try a basic machine learning model 289 · Let’s try a 1D convolutional model 290 · A first recurrent baseline 292
10.3 Understanding recurrent neural networks 293
    A recurrent layer in Keras 296
10.4 Advanced use of recurrent neural networks 300
    Using recurrent dropout to fight overfitting 300 · Stacking recurrent layers 303 · Using bidirectional RNNs 304 · Going even further 307

11 Deep learning for text 309
11.1 Natural language processing: The bird’s eye view 309
11.2 Preparing text data 311
    Text standardization 312 · Text splitting (tokenization) 313 · Vocabulary indexing 314 · Using the TextVectorization layer 316
11.3 Two approaches for representing groups of words: Sets and sequences 319
    Preparing the IMDB movie reviews data 320 · Processing words as a set: The bag-of-words approach 322 · Processing words as a sequence: The sequence model approach 327
11.4 The Transformer architecture 336
    Understanding self-attention 337 · Multi-head attention 341 · The Transformer encoder 342 · When to use sequence models over bag-of-words models 349
11.5 Beyond text classification: Sequence-to-sequence learning 350
    A machine translation example 351 · Sequence-to-sequence learning with RNNs 354 · Sequence-to-sequence learning with Transformer 358

12 Generative deep learning 364
12.1 Text generation 366
    A brief history of generative deep learning for sequence generation 366 · How do you generate sequence data? 367 · The importance of the sampling strategy 368 · Implementing text generation with Keras 369 · A text-generation callback with variable-temperature sampling 372 · Wrapping up 376
12.2 DeepDream 376
    Implementing DeepDream in Keras 377 · Wrapping up 383
12.3 Neural style transfer 383
    The content loss 384 · The style loss 384 · Neural style transfer in Keras 385 · Wrapping up 391
12.4 Generating images with variational autoencoders 391
    Sampling from latent spaces of images 391 · Concept vectors for image editing 393 · Variational autoencoders 393 · Implementing a VAE with Keras 396 · Wrapping up 401
12.5 Introduction to generative adversarial networks 401
    A schematic GAN implementation 402 · A bag of tricks 403 · Getting our hands on the CelebA dataset 404 · The discriminator 405 · The generator 407 · The adversarial network 408 · Wrapping up 410

13 Best practices for the real world 412
13.1 Getting the most out of your models 413
    Hyperparameter optimization 413 · Model ensembling 420
13.2 Scaling-up model training 421
    Speeding up training on GPU with mixed precision 422 · Multi-GPU training 425 · TPU training 428

14 Conclusions 431
14.1 Key concepts in review 432
    Various approaches to AI 432 · What makes deep learning special within the field of machine learning 432 · How to think about deep learning 433 · Key enabling technologies 434 · The universal machine learning workflow 435 · Key network architectures 436 · The space of possibilities 440
14.2 The limitations of deep learning 442
    The risk of anthropomorphizing machine learning models 443 · Automatons vs. intelligent agents 445 · Local generalization vs. extreme generalization 446 · The purpose of intelligence 448 · Climbing the spectrum of generalization 449
14.3 Setting the course toward greater generality in AI 450
    On the importance of setting the right objective: The shortcut rule 450 · A new target 452
14.4 Implementing intelligence: The missing ingredients 454
    Intelligence as sensitivity to abstract analogies 454 · The two poles of abstraction 455 · The missing half of the picture 458
14.5 The future of deep learning 459
    Models as programs 460 · Blending together deep learning and program synthesis 461 · Lifelong learning and modular subroutine reuse 463 · The long-term vision 465
14.6 Staying up to date in a fast-moving field 466
    Practice on real-world problems using Kaggle 466 · Read about the latest developments on arXiv 466 · Explore the Keras ecosystem 467
14.7 Final words 467

index 469 |
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | Chollet, François |
author_GND | (DE-588)1151332550 |
author_facet | Chollet, François |
author_role | aut |
author_sort | Chollet, François |
author_variant | f c fc |
building | Verbundindex |
bvnumber | BV047817293 |
classification_rvk | ST 250 ST 300 ST 302 ST 301 |
classification_tum | DAT 708f |
ctrlnum | (OCoLC)1309935476 (DE-599)BVBBV047817293 |
discipline | Informatik |
discipline_str_mv | Informatik |
edition | Second edition |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>00000nam a2200000zc 4500</leader><controlfield tag="001">BV047817293</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20241009</controlfield><controlfield tag="007">t</controlfield><controlfield tag="008">220207s2021 a||| |||| 00||| eng d</controlfield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9781617296864</subfield><subfield code="c">pbk.</subfield><subfield code="9">978-1-61729-686-4</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)1309935476</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)BVBBV047817293</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">rda</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-11</subfield><subfield code="a">DE-706</subfield><subfield code="a">DE-859</subfield><subfield code="a">DE-M347</subfield><subfield code="a">DE-473</subfield><subfield code="a">DE-1050</subfield><subfield code="a">DE-83</subfield><subfield code="a">DE-355</subfield><subfield code="a">DE-Aug4</subfield><subfield code="a">DE-573</subfield><subfield code="a">DE-1102</subfield><subfield code="a">DE-1051</subfield><subfield code="a">DE-739</subfield><subfield code="a">DE-898</subfield><subfield code="a">DE-703</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 250</subfield><subfield code="0">(DE-625)143626:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 300</subfield><subfield code="0">(DE-625)143650:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 302</subfield><subfield code="0">(DE-625)143652:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 301</subfield><subfield code="0">(DE-625)143651:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">DAT 708f</subfield><subfield code="2">stub</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">68T05</subfield><subfield code="2">msc</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Chollet, François</subfield><subfield code="e">Verfasser</subfield><subfield code="0">(DE-588)1151332550</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Deep learning with Python</subfield><subfield code="c">François Chollet</subfield></datafield><datafield tag="250" ind1=" " ind2=" "><subfield code="a">Second edition</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Shelter Island, NY</subfield><subfield code="b">Manning</subfield><subfield code="c">[2021]</subfield></datafield><datafield tag="264" ind1=" " ind2="4"><subfield code="c">© 2021</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">xxiv, 478 Seiten</subfield><subfield code="b">Illustrationen, Diagramme</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Keras</subfield><subfield code="g">Framework, Informatik</subfield><subfield code="0">(DE-588)1160521077</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Python</subfield><subfield code="g">Programmiersprache</subfield><subfield code="0">(DE-588)4434275-5</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Deep learning</subfield><subfield code="0">(DE-588)1135597375</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Maschinelles Lernen</subfield><subfield code="0">(DE-588)4193754-5</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Keras</subfield><subfield code="g">Framework, Informatik</subfield><subfield code="0">(DE-588)1160521077</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="1"><subfield code="a">Deep learning</subfield><subfield code="0">(DE-588)1135597375</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="2"><subfield code="a">Python</subfield><subfield code="g">Programmiersprache</subfield><subfield code="0">(DE-588)4434275-5</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="689" ind1="1" ind2="0"><subfield code="a">Maschinelles Lernen</subfield><subfield code="0">(DE-588)4193754-5</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="1" ind2="1"><subfield code="a">Python</subfield><subfield code="g">Programmiersprache</subfield><subfield code="0">(DE-588)4434275-5</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="1" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="776" ind1="0" ind2="8"><subfield code="i">Erscheint auch als</subfield><subfield code="n">Online-Ausgabe</subfield><subfield code="z">978-1-63835-009-5</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">Digitalisierung UB Bamberg - ADAM Catalogue Enrichment</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=033200672&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="943" ind1="1" ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-033200672</subfield></datafield></record></collection> |
id | DE-604.BV047817293 |
illustrated | Illustrated |
index_date | 2024-07-03T19:07:14Z |
indexdate | 2024-10-28T11:00:50Z |
institution | BVB |
isbn | 9781617296864 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-033200672 |
oclc_num | 1309935476 |
open_access_boolean | |
owner | DE-11 DE-706 DE-859 DE-M347 DE-473 DE-BY-UBG DE-1050 DE-83 DE-355 DE-BY-UBR DE-Aug4 DE-573 DE-1102 DE-1051 DE-739 DE-898 DE-BY-UBR DE-703 |
owner_facet | DE-11 DE-706 DE-859 DE-M347 DE-473 DE-BY-UBG DE-1050 DE-83 DE-355 DE-BY-UBR DE-Aug4 DE-573 DE-1102 DE-1051 DE-739 DE-898 DE-BY-UBR DE-703 |
physical | xxiv, 478 Seiten Illustrationen, Diagramme |
publishDate | 2021 |
publishDateSearch | 2021 |
publishDateSort | 2021 |
publisher | Manning |
record_format | marc |
spelling | Chollet, François Verfasser (DE-588)1151332550 aut Deep learning with Python François Chollet Second edition Shelter Island, NY Manning [2021] © 2021 xxiv, 478 Seiten Illustrationen, Diagramme txt rdacontent n rdamedia nc rdacarrier Keras Framework, Informatik (DE-588)1160521077 gnd rswk-swf Python Programmiersprache (DE-588)4434275-5 gnd rswk-swf Deep learning (DE-588)1135597375 gnd rswk-swf Maschinelles Lernen (DE-588)4193754-5 gnd rswk-swf Keras Framework, Informatik (DE-588)1160521077 s Deep learning (DE-588)1135597375 s Python Programmiersprache (DE-588)4434275-5 s DE-604 Maschinelles Lernen (DE-588)4193754-5 s Erscheint auch als Online-Ausgabe 978-1-63835-009-5 Digitalisierung UB Bamberg - ADAM Catalogue Enrichment application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=033200672&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Chollet, François Deep learning with Python Keras Framework, Informatik (DE-588)1160521077 gnd Python Programmiersprache (DE-588)4434275-5 gnd Deep learning (DE-588)1135597375 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
subject_GND | (DE-588)1160521077 (DE-588)4434275-5 (DE-588)1135597375 (DE-588)4193754-5 |
title | Deep learning with Python |
title_auth | Deep learning with Python |
title_exact_search | Deep learning with Python |
title_exact_search_txtP | Deep learning with Python |
title_full | Deep learning with Python François Chollet |
title_fullStr | Deep learning with Python François Chollet |
title_full_unstemmed | Deep learning with Python François Chollet |
title_short | Deep learning with Python |
title_sort | deep learning with python |
topic | Keras Framework, Informatik (DE-588)1160521077 gnd Python Programmiersprache (DE-588)4434275-5 gnd Deep learning (DE-588)1135597375 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
topic_facet | Keras Framework, Informatik Python Programmiersprache Deep learning Maschinelles Lernen |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=033200672&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT cholletfrancois deeplearningwithpython |