Real-world natural language processing
Training computers to interpret and generate speech and text is a monumental challenge, and the payoff for reducing labor and improving human/computer interaction is huge! The field of natural language processing (NLP) is advancing rapidly, with countless new tools and practices. This unique book of...
Saved in:

Main author: Hagiwara, Masato
Format: Book
Language: English
Published: Shelter Island, NY : Manning Publications, [2021]
Subjects: Automatische Sprachanalyse / Deep learning
Online access: Table of contents
Summary: Training computers to interpret and generate speech and text is a monumental challenge, and the payoff for reducing labor and improving human/computer interaction is huge! The field of natural language processing (NLP) is advancing rapidly, with countless new tools and practices. This unique book offers an innovative collection of NLP techniques with applications in machine translation, voice assistants, text generation and more. "Real-world natural language processing" shows you how to build the practical NLP applications that are transforming the way humans and computers work together. Guided by clear explanations of each core NLP topic, you'll create many interesting applications including a sentiment analyzer and a chatbot. Along the way, you'll use Python and open source libraries like AllenNLP and HuggingFace Transformers to speed up your development process.
Description: Includes index
Description: xviii, 316 pages : illustrations, diagrams ; 23 cm
Internal format
MARC
LEADER 00000nam a2200000 c 4500
001    BV048630213
003    DE-604
005    20230213
007    t
008    230104s2021 a||| |||| 00||| eng d
015 __ |a GBC1I5692 |2 dnb
020 __ |z 9781617296420 |9 978-1-61729-642-0
020 __ |z 1617296422 |9 1-61729-642-2
035 __ |a (OCoLC)1291276351
035 __ |a (DE-599)BVBBV048630213
040 __ |a DE-604 |b ger |e rda
041 0_ |a eng
049 __ |a DE-739
084 __ |a ST 306 |0 (DE-625)143654: |2 rvk
100 1_ |a Hagiwara, Masato |e Verfasser |0 (DE-588)1252889704 |4 aut
245 10 |a Real-world natural language processing |c Masato Hagiwara
264 _1 |a Shelter Island, NY |b Manning Publications |c [2021]
300 __ |a xviii, 316 Seiten |b Illustrationen, Diagramme |c 23 cm
336 __ |b txt |2 rdacontent
337 __ |b n |2 rdamedia
338 __ |b nc |2 rdacarrier
500 __ |a Includes index
505 8_ |a 1. Introduction to natural language processing -- 2. Your first NLP application -- 3. Word and document embeddings -- 4. Sentence classification -- 5. Sequential labeling and language modeling -- 6. Sequence-to-sequence models -- 7. Convolutional neural networks -- 8. Attention and transformer -- 9. Transfer learning with pretrained language models -- 10. Best practices in developing NLP applications -- Deploying and serving NLP applications
520 __ |a Training computers to interpret and generate speech and text is a monumental challenge, and the payoff for reducing labor and improving human/computer interaction is huge! The field of natural language processing (NLP) is advancing rapidly, with countless new tools and practices. This unique book offers an innovative collection of NLP techniques with applications in machine translation, voice assistants, text generation and more. "Real-world natural language processing" shows you how to build the practical NLP applications that are transforming the way humans and computers work together. Guided by clear explanations of each core NLP topic, you'll create many interesting applications including a sentiment analyzer and a chatbot. Along the way, you'll use Python and open source libraries like AllenNLP and HuggingFace Transformers to speed up your development process
650 _4 |a Natural language processing (Computer science)
650 _4 |a Natural Language Processing
650 _4 |a Traitement automatique des langues naturelles
650 _7 |a Natural language processing (Computer science) |2 fast
650 07 |a Automatische Sprachanalyse |0 (DE-588)4129935-8 |2 gnd |9 rswk-swf
650 07 |a Deep learning |0 (DE-588)1135597375 |2 gnd |9 rswk-swf
689 00 |a Automatische Sprachanalyse |0 (DE-588)4129935-8 |D s
689 01 |a Deep learning |0 (DE-588)1135597375 |D s
689 0_ |5 DE-604
856 42 |m Digitalisierung UB Passau - ADAM Catalogue Enrichment |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=034005272&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis
999 __ |a oai:aleph.bib-bvb.de:BVB01-034005272
Record in the search index
_version_ | 1804184761942409216 |
---|---|
adam_text |
contents

preface xi
acknowledgments xiii
about this book xiv
about the author xvii
about the cover illustration xviii

Part 1 Basics

1 Introduction to natural language processing 3
1.1 What is natural language processing (NLP)? 4
    What is NLP? 4 · What is not NLP? 6 · AI, ML, DL, and NLP 8 · Why NLP? 10
1.2 How NLP is used 12
    NLP applications 13 · NLP tasks 15
1.3 Building NLP applications 21
    Development of NLP applications 21 · Structure of NLP applications 24

2 Your first NLP application 26
2.1 Introducing sentiment analysis 27
2.2 Working with NLP datasets 28
    What is a dataset? 28 · Stanford Sentiment Treebank 29 · Train, validation, and test sets 30 · Loading SST datasets using AllenNLP 33
2.3 Using word embeddings 34
    What are word embeddings? 34 · Using word embeddings for sentiment analysis 36
2.4 Neural networks 37
    What are neural networks? 37 · Recurrent neural networks (RNNs) and linear layers 38 · Architecture for sentiment analysis 39
2.5 Loss functions and optimization 41
2.6 Training your own classifier 43
    Batching 43 · Putting everything together 44
2.7 Evaluating your classifier 45
2.8 Deploying your application 46
    Making predictions 46 · Serving predictions 46

3 Word and document embeddings 49
3.1 Introducing embeddings 50
    What are embeddings? 50 · Why are embeddings important? 50
3.2 Building blocks of language: Characters, words, and phrases 52
    Characters 52 · Words, tokens, morphemes, and phrases 53 · N-grams 53
3.3 Tokenization, stemming, and lemmatization 54
    Tokenization 54 · Stemming 55 · Lemmatization 56
3.4 Skip-gram and continuous bag of words (CBOW) 57
    Where word embeddings come from 57 · Using word associations 58 · Linear layers 59 · Softmax 61 · Implementing Skip-gram on AllenNLP 62 · Continuous bag of words (CBOW) model 67
3.5 GloVe 68
    How GloVe learns word embeddings 68 · Using pretrained GloVe vectors 69
3.6 fastText 72
    Making use of subword information 72 · Using the fastText toolkit 73
3.7 Document-level embeddings 74
3.8 Visualizing embeddings 76

4 Sentence classification 80
4.1 Recurrent neural networks (RNNs) 81
    Handling variable-length input 81 · RNN abstraction 82 · Simple RNNs and nonlinearity 84
4.2 Long short-term memory units (LSTMs) and gated recurrent units (GRUs) 88
    Vanishing gradients problem 88 · Long short-term memory (LSTM) 90 · Gated recurrent units (GRUs) 92
4.3 Accuracy, precision, recall, and F-measure 93
    Accuracy 93 · Precision and recall 94 · F-measure 96
4.4 Building AllenNLP training pipelines 96
    Instances and fields 97 · Vocabulary and token indexers 98 · Token embedders and RNNs 99 · Building your own model 100 · Putting it all together 101
4.5 Configuring AllenNLP training pipelines 102
4.6 Case study: Language detection 105
    Using characters as input 106 · Creating a dataset reader 106 · Building the training pipeline 108 · Running the detector on unseen instances 110

5 Sequential labeling and language modeling 112
5.1 Introducing sequential labeling 113
    What is sequential labeling? 113 · Using RNNs to encode sequences 113 · Implementing a Seq2Seq encoder in AllenNLP 117
5.2 Building a part-of-speech tagger 118
    Reading a dataset 118 · Defining the model and the loss 119 · Building the training pipeline 121
5.3 Multilayer and bidirectional RNNs 122
    Multilayer RNNs 122 · Bidirectional RNNs 124
5.4 Named entity recognition 126
    What is named entity recognition? 127 · Tagging spans 128 · Implementing a named entity recognizer 128
5.5 Modeling a language 130
    What is a language model? 130 · Why are language models useful? 131 · Training an RNN language model 132
5.6 Text generation using RNNs 133
    Feeding characters to an RNN 134 · Evaluating text using a language model 134 · Generating text using a language model 136

Part 2 Advanced models 139

6 Sequence-to-sequence models 141
6.1 Introducing sequence-to-sequence models 142
6.2 Machine translation 101 144
6.3 Building your first translator 147
    Preparing the datasets 148 · Training the model 150 · Running the translator 153
6.4 How Seq2Seq models work 154
    Encoder 154 · Decoder 156 · Greedy decoding 158 · Beam search decoding 161
6.5 Evaluating translation systems 163
    Human evaluation 163 · Automatic evaluation 163
6.6 Case study: Building a chatbot 165
    Introducing dialogue systems 165 · Preparing a dataset 166 · Training and running a chatbot 167 · Next steps 169

7 Convolutional neural networks 171
7.1 Introducing convolutional neural networks (CNNs) 172
    RNNs and their shortcomings 172 · Pattern matching for sentence classification 173 · Convolutional neural networks (CNNs) 174
7.2 Convolutional layers 174
    Pattern matching using filters 175 · Rectified linear unit (ReLU) 176 · Combining scores 178
7.3 Pooling layers 179
7.4 Case study: Text classification 180
    Review: Text classification 180 · Using CnnEncoder 181 · Training and running the classifier 182

8 Attention and Transformer 184
8.1 What is attention? 185
    Limitation of vanilla Seq2Seq models 185 · Attention mechanism 186
8.2 Sequence-to-sequence with attention 187
    Encoder-decoder attention 188 · Building a Seq2Seq machine translation with attention 189
8.3 Transformer and self-attention 192
    Self-attention 192 · Transformer 195 · Experiments 197
8.4 Transformer-based language models 200
    Transformer as a language model 200 · Transformer-XL 203 · GPT-2 205 · XLM 207
8.5 Case study: Spell-checker 208
    Spell correction as machine translation 208 · Training a spell checker 210 · Improving a spell-checker 213

9 Transfer learning with pretrained language models 218
9.1 Transfer learning 219
    Traditional machine learning 219 · Word embeddings 220 · What is transfer learning? 220
9.2 BERT 222
    Limitations of word embeddings 222 · Self-supervised learning 224 · Pretraining BERT 225 · Adapting BERT 226
9.3 Case study 1: Sentiment analysis with BERT 229
    Tokenizing input 230 · Building the model 232 · Training the model 233
9.4 Other pretrained language models 236
    ELMo 236 · XLNet 237 · RoBERTa 239 · DistilBERT 240 · ALBERT 241
9.5 Case study 2: Natural language inference with BERT 243
    What is natural language inference? 243 · Using BERT for sentence-pair classification 244 · Using Transformers with AllenNLP 246

Part 3 Putting into production 253

10 Best practices in developing NLP applications 255
10.1 Batching instances 256
    Padding 256 · Sorting 257 · Masking 259
10.2 Tokenization for neural models 261
    Unknown words 261 · Character models 262 · Subword models 263
10.3 Avoiding overfitting 265
    Regularization 265 · Early stopping 268 · Cross-validation 269
10.4 Dealing with imbalanced datasets 270
    Using appropriate evaluation metrics 270 · Upsampling and downsampling 271 · Weighting losses 272
10.5 Hyperparameter tuning 273
    Examples of hyperparameters 274 · Grid search vs. random search 275 · Hyperparameter tuning with Optuna 276

11 Deploying and serving NLP applications 280
11.1 Architecting your NLP application 281
    Before machine learning 282 · Choosing the right architecture 282 · Project structure 283 · Version control 285
11.2 Deploying your NLP model 286
    Testing 286 · Train-serve skew 288 · Monitoring 289 · Using GPUs 289
11.3 Case study: Serving and deploying NLP applications 292
    Serving models with TorchServe 292 · Deploying models with SageMaker 296
11.4 Interpreting and visualizing model predictions 298
11.5 Where to go from here 302

index 305
|
any_adam_object | 1 |
any_adam_object_boolean | 1 |
author | Hagiwara, Masato |
author_GND | (DE-588)1252889704 |
author_facet | Hagiwara, Masato |
author_role | aut |
author_sort | Hagiwara, Masato |
author_variant | m h mh |
building | Verbundindex |
bvnumber | BV048630213 |
classification_rvk | ST 306 |
contents | 1. Introduction to natural language processing -- 2. Your first NLP application -- 3. Word and document embeddings -- 4. Sentence classification -- 5. Sequential labeling and language modeling -- 6. Sequence-to-sequence models -- 7. Convolutional neural networks -- 8. Attention and transformer -- 9. Transfer learning with pretrained language models -- 10. Best practices in developing NLP applications -- Deploying and serving NLP applications |
ctrlnum | (OCoLC)1291276351 (DE-599)BVBBV048630213 |
discipline | Informatik |
discipline_str_mv | Informatik |
format | Book |
id | DE-604.BV048630213 |
illustrated | Illustrated |
index_date | 2024-07-03T21:15:51Z |
indexdate | 2024-07-10T09:44:29Z |
institution | BVB |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-034005272 |
oclc_num | 1291276351 |
open_access_boolean | |
owner | DE-739 |
owner_facet | DE-739 |
physical | xviii, 316 Seiten Illustrationen, Diagramme 23 cm |
publishDate | 2021 |
publishDateSearch | 2021 |
publishDateSort | 2021 |
publisher | Manning Publications |
record_format | marc |
spelling | Hagiwara, Masato Verfasser (DE-588)1252889704 aut Real-world natural language processing Masato Hagiwara Shelter Island, NY Manning Publications [2021] xviii, 316 Seiten Illustrationen, Diagramme 23 cm txt rdacontent n rdamedia nc rdacarrier Includes index 1. Introduction to natural language processing -- 2. Your first NLP application -- 3. Word and document embeddings -- 4. Sentence classification -- 5. Sequential labeling and language modeling -- 6. Sequence-to-sequence models -- 7. Convolutional neural networks -- 8. Attention and transformer -- 9. Transfer learning with pretrained language models -- 10. Best practices in developing NLP applications -- Deploying and serving NLP applications Training computers to interpret and generate speech and text is a monumental challenge, and the payoff for reducing labor and improving human/computer interaction is huge! The field of natural language processing (NLP) is advancing rapidly, with countless new tools and practices. This unique book offers an innovative collection of NLP techniques with applications in machine translation, voice assistants, text generation and more. "Real-world natural language processing" shows you how to build the practical NLP applications that are transforming the way humans and computers work together. Guided by clear explanations of each core NLP topic, you'll create many interesting applications including a sentiment analyzer and a chatbot. Along the way, you'll use Python and open source libraries like AllenNLP and HuggingFace Transformers to speed up your development process Natural language processing (Computer science) Natural Language Processing Traitement automatique des langues naturelles Natural language processing (Computer science) fast Automatische Sprachanalyse (DE-588)4129935-8 gnd rswk-swf Deep learning (DE-588)1135597375 gnd rswk-swf Automatische Sprachanalyse (DE-588)4129935-8 s Deep learning (DE-588)1135597375 s DE-604 Digitalisierung UB Passau - ADAM Catalogue Enrichment application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=034005272&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Hagiwara, Masato Real-world natural language processing 1. Introduction to natural language processing -- 2. Your first NLP application -- 3. Word and document embeddings -- 4. Sentence classification -- 5. Sequential labeling and language modeling -- 6. Sequence-to-sequence models -- 7. Convolutional neural networks -- 8. Attention and transformer -- 9. Transfer learning with pretrained language models -- 10. Best practices in developing NLP applications -- Deploying and serving NLP applications Natural language processing (Computer science) Natural Language Processing Traitement automatique des langues naturelles Natural language processing (Computer science) fast Automatische Sprachanalyse (DE-588)4129935-8 gnd Deep learning (DE-588)1135597375 gnd |
subject_GND | (DE-588)4129935-8 (DE-588)1135597375 |
title | Real-world natural language processing |
title_auth | Real-world natural language processing |
title_exact_search | Real-world natural language processing |
title_exact_search_txtP | Real-world natural language processing |
title_full | Real-world natural language processing Masato Hagiwara |
title_fullStr | Real-world natural language processing Masato Hagiwara |
title_full_unstemmed | Real-world natural language processing Masato Hagiwara |
title_short | Real-world natural language processing |
title_sort | real world natural language processing |
topic | Natural language processing (Computer science) Natural Language Processing Traitement automatique des langues naturelles Natural language processing (Computer science) fast Automatische Sprachanalyse (DE-588)4129935-8 gnd Deep learning (DE-588)1135597375 gnd |
topic_facet | Natural language processing (Computer science) Natural Language Processing Traitement automatique des langues naturelles Automatische Sprachanalyse Deep learning |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=034005272&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT hagiwaramasato realworldnaturallanguageprocessing |