Bayesian analysis in natural language processing

Author:        Cohen, Shay
Format:        Book
Language:      English
Published:     [San Rafael, California] : Morgan & Claypool Publishers, [2019]
Edition:       Second edition
Series:        Synthesis lectures on human language technologies ; #41
Subjects:      Sprachverarbeitung; Bayes-Verfahren
Online access: Table of contents
Description:   xxxi, 311 pages, diagrams
ISBN:          9781681735283; 9781681735269
Internal format
MARC
LEADER 00000nam a2200000 cb4500
001    BV045873206
003    DE-604
005    20190603
007    t
008    190513s2019 |||| |||| 00||| eng d
020    |a 9781681735283 |c hardback |9 978-1-68173-528-3
020    |a 9781681735269 |c paperback : $79.95 |9 978-1-68173-526-9
035    |a (OCoLC)1104916210
035    |a (DE-599)HBZHT020045625
040    |a DE-604 |b ger |e rda
041 0_ |a eng
049    |a DE-29 |a DE-19 |a DE-739
084    |a ST 306 |0 (DE-625)143654: |2 rvk
100 1_ |a Cohen, Shay |e Verfasser |0 (DE-588)1108783767 |4 aut
245 10 |a Bayesian analysis in natural language processing |c Shay Cohen (University of Edinburgh)
250    |a Second edition
264 _1 |a [San Rafael, California] |b Morgan & Claypool Publishers |c [2019]
264 _4 |c © 2019
300    |a xxxi, 311 Seiten |b Diagramme
336    |b txt |2 rdacontent
337    |b n |2 rdamedia
338    |b nc |2 rdacarrier
490 1_ |a Synthesis lectures on human language technologies |v #41
650 07 |a Sprachverarbeitung |0 (DE-588)4116579-2 |2 gnd |9 rswk-swf
650 07 |a Bayes-Verfahren |0 (DE-588)4204326-8 |2 gnd |9 rswk-swf
689 00 |a Bayes-Verfahren |0 (DE-588)4204326-8 |D s
689 01 |a Sprachverarbeitung |0 (DE-588)4116579-2 |D s
689 0_ |5 DE-604
776 08 |i Erscheint auch als |n Online-Ausgabe |z 978-1-68173-527-6
776 08 |i Erscheint auch als |n Online-Ausgabe, epub |z 978-1-68173-529-0
830 _0 |a Synthesis lectures on human language technologies |v #41 |w (DE-604)BV035447238 |9 41
856 42 |m Digitalisierung UB Passau - ADAM Catalogue Enrichment |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=031256494&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis
999    |a oai:aleph.bib-bvb.de:BVB01-031256494
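As an illustration (not part of the catalog record itself): the `fullrecord` field below stores this same record as MARCXML, which can be read with Python's standard library alone. The snippet uses a trimmed, hypothetical excerpt mirroring two fields of the record above; `subfields` is a helper name introduced here, not a library API.

```python
import xml.etree.ElementTree as ET

# Trimmed MARCXML excerpt mirroring the record above (MARC 21 slim namespace).
MARCXML = """<record xmlns="http://www.loc.gov/MARC21/slim">
  <controlfield tag="001">BV045873206</controlfield>
  <datafield tag="245" ind1="1" ind2="0">
    <subfield code="a">Bayesian analysis in natural language processing</subfield>
    <subfield code="c">Shay Cohen (University of Edinburgh)</subfield>
  </datafield>
  <datafield tag="020" ind1=" " ind2=" ">
    <subfield code="a">9781681735283</subfield>
  </datafield>
</record>"""

NS = {"m": "http://www.loc.gov/MARC21/slim"}

def subfields(record, tag, code):
    """Collect all values of subfield `code` across datafields with MARC tag `tag`."""
    return [
        sf.text
        for df in record.findall(f"m:datafield[@tag='{tag}']", NS)
        for sf in df.findall(f"m:subfield[@code='{code}']", NS)
    ]

record = ET.fromstring(MARCXML)
print(subfields(record, "245", "a"))  # title proper (245 $a)
print(subfields(record, "020", "a"))  # ISBNs (020 $a)
```

For production use, a dedicated MARC library (e.g. pymarc) also handles the binary MARC transmission format, but for a single MARCXML field the namespace-aware `findall` above suffices.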
Record in the search index

_version_ | 1804180016317071360 |
adam_text |
List of Figures
List of Algorithms
List of Generative Stories
Preface (First Edition)
Acknowledgments (First Edition)
Preface (Second Edition)

1 Preliminaries
  1.1 Probability Measures
  1.2 Random Variables
    1.2.1 Continuous and Discrete Random Variables
    1.2.2 Joint Distribution over Multiple Random Variables
  1.3 Conditional Distributions
    1.3.1 Bayes' Rule
    1.3.2 Independent and Conditionally Independent Random Variables
    1.3.3 Exchangeable Random Variables
  1.4 Expectations of Random Variables
  1.5 Models
    1.5.1 Parametric vs. Nonparametric Models
    1.5.2 Inference with Models
    1.5.3 Generative Models
    1.5.4 Independence Assumptions in Models
    1.5.5 Directed Graphical Models
  1.6 Learning from Data Scenarios
  1.7 Bayesian and Frequentist Philosophy (Tip of the Iceberg)
  1.8 Summary
  1.9 Exercises

2 Introduction
  2.1 Overview: Where Bayesian Statistics and NLP Meet
  2.2 First Example: The Latent Dirichlet Allocation Model
    2.2.1 The Dirichlet Distribution
    2.2.2 Inference
    2.2.3 Summary
  2.3 Second Example: Bayesian Text Regression
  2.4 Conclusion and Summary
  2.5 Exercises

3 Priors
  3.1 Conjugate Priors
    3.1.1 Conjugate Priors and Normalization Constants
    3.1.2 The Use of Conjugate Priors with Latent Variable Models
    3.1.3 Mixture of Conjugate Priors
    3.1.4 Renormalized Conjugate Distributions
    3.1.5 Discussion: To Be or not to Be Conjugate?
    3.1.6 Summary
  3.2 Priors Over Multinomial and Categorical Distributions
    3.2.1 The Dirichlet Distribution Re-Visited
    3.2.2 The Logistic Normal Distribution
    3.2.3 Discussion
    3.2.4 Summary
  3.3 Non-Informative Priors
    3.3.1 Uniform and Improper Priors
    3.3.2 Jeffreys Prior
    3.3.3 Discussion
  3.4 Conjugacy and Exponential Models
  3.5 Multiple Parameter Draws in Models
  3.6 Structural Priors
  3.7 Conclusion and Summary
  3.8 Exercises

4 Bayesian Estimation
  4.1 Learning with Latent Variables: Two Views
  4.2 Bayesian Point Estimation
    4.2.1 Maximum a Posteriori Estimation
    4.2.2 Posterior Approximations Based on the MAP Solution
    4.2.3 Decision-Theoretic Point Estimation
    4.2.4 Discussion and Summary
  4.3 Empirical Bayes
  4.4 Asymptotic Behavior of the Posterior
  4.5 Summary
  4.6 Exercises

5 Sampling Methods
  5.1 MCMC Algorithms: Overview
  5.2 NLP Model Structure for MCMC Inference
    5.2.1 Partitioning the Latent Variables
  5.3 Gibbs Sampling
    5.3.1 Collapsed Gibbs Sampling
    5.3.2 Operator View
    5.3.3 Parallelizing the Gibbs Sampler
    5.3.4 Summary
  5.4 The Metropolis-Hastings Algorithm
    5.4.1 Variants of Metropolis-Hastings
  5.5 Slice Sampling
    5.5.1 Auxiliary Variable Sampling
    5.5.2 The Use of Slice Sampling and Auxiliary Variable Sampling in NLP
  5.6 Simulated Annealing
  5.7 Convergence of MCMC Algorithms
  5.8 Markov Chain: Basic Theory
  5.9 Sampling Algorithms Not in the MCMC Realm
  5.10 Monte Carlo Integration
  5.11 Discussion
    5.11.1 Computability of Distribution vs. Sampling
    5.11.2 Nested MCMC Sampling
    5.11.3 Runtime of MCMC Samplers
    5.11.4 Particle Filtering
  5.12 Conclusion and Summary
  5.13 Exercises

6 Variational Inference
  6.1 Variational Bound on Marginal Log-Likelihood
  6.2 Mean-Field Approximation
  6.3 Mean-Field Variational Inference Algorithm
    6.3.1 Dirichlet-Multinomial Variational Inference
    6.3.2 Connection to the Expectation-Maximization Algorithm
  6.4 Empirical Bayes with Variational Inference
  6.5 Discussion
    6.5.1 Initialization of the Inference Algorithms
    6.5.2 Convergence Diagnosis
    6.5.3 The Use of Variational Inference for Decoding
    6.5.4 Variational Inference as KL Divergence Minimization
    6.5.5 Online Variational Inference
  6.6 Summary
  6.7 Exercises

7 Nonparametric Priors
  7.1 The Dirichlet Process: Three Views
    7.1.1 The Stick-Breaking Process
    7.1.2 The Chinese Restaurant Process
  7.2 Dirichlet Process Mixtures
    7.2.1 Inference with Dirichlet Process Mixtures
    7.2.2 Dirichlet Process Mixture as a Limit of Mixture Models
  7.3 The Hierarchical Dirichlet Process
  7.4 The Pitman-Yor Process
    7.4.1 Pitman-Yor Process for Language Modeling
    7.4.2 Power-Law Behavior of the Pitman-Yor Process
  7.5 Discussion
    7.5.1 Gaussian Processes
    7.5.2 The Indian Buffet Process
    7.5.3 Nested Chinese Restaurant Process
    7.5.4 Distance-Dependent Chinese Restaurant Process
    7.5.5 Sequence Memoizers
  7.6 Summary
  7.7 Exercises

8 Bayesian Grammar Models
  8.1 Bayesian Hidden Markov Models
    8.1.1 Hidden Markov Models with an Infinite State Space
  8.2 Probabilistic Context-Free Grammars
    8.2.1 PCFGs as a Collection of Multinomials
    8.2.2 Basic Inference Algorithms for PCFGs
    8.2.3 Hidden Markov Models as PCFGs
  8.3 Bayesian Probabilistic Context-Free Grammars
    8.3.1 Priors on PCFGs
    8.3.2 Monte Carlo Inference with Bayesian PCFGs
    8.3.3 Variational Inference with Bayesian PCFGs
  8.4 Adaptor Grammars
    8.4.1 Pitman-Yor Adaptor Grammars
    8.4.2 Stick-Breaking View of PYAG
    8.4.3 Inference with PYAG
  8.5 Hierarchical Dirichlet Process PCFGs (HDP-PCFGs)
    8.5.1 Extensions to the HDP-PCFG Model
  8.6 Dependency Grammars
    8.6.1 State-Split Nonparametric Dependency Models
  8.7 Synchronous Grammars
  8.8 Multilingual Learning
    8.8.1 Part-of-Speech Tagging
    8.8.2 Grammar Induction
  8.9 Further Reading
  8.10 Summary
  8.11 Exercises

9 Representation Learning and Neural Networks
  9.1 Neural Networks and Representation Learning: Why Now?
  9.2 Word Embeddings
    9.2.1 Skip-Gram Models for Word Embeddings
    9.2.2 Bayesian Skip-Gram Word Embeddings
    9.2.3 Discussion
  9.3 Neural Networks
    9.3.1 Frequentist Estimation and the Backpropagation Algorithm
    9.3.2 Priors on Neural Network Weights
  9.4 Modern Use of Neural Networks in NLP
    9.4.1 Recurrent and Recursive Neural Networks
    9.4.2 Vanishing and Exploding Gradient Problem
    9.4.3 Neural Encoder-Decoder Models
    9.4.4 Convolutional Neural Networks
  9.5 Tuning Neural Networks
    9.5.1 Regularization
    9.5.2 Hyperparameter Tuning
  9.6 Generative Modeling with Neural Networks
    9.6.1 Variational Autoencoders
    9.6.2 Generative Adversarial Networks
  9.7 Conclusion
  9.8 Exercises

10 Closing Remarks

A Basic Concepts
  A.1 Basic Concepts in Information Theory
    A.1.1 Entropy and Cross Entropy
    A.1.2 Kullback-Leibler Divergence
  A.2 Other Basic Concepts
    A.2.1 Jensen's Inequality
    A.2.2 The Chain Rule for Differentiation
    A.2.3 Transformation of Continuous Random Variables
    A.2.4 The Expectation-Maximization Algorithm
  A.3 Basic Concepts in Optimization
    A.3.1 Stochastic Gradient Descent
    A.3.2 Constrained Optimization

B Distribution Catalog
  B.1 The Multinomial Distribution
  B.2 The Dirichlet Distribution
  B.3 The Poisson Distribution
  B.4 The Gamma Distribution
  B.5 The Multivariate Normal Distribution
  B.6 The Laplace Distribution
  B.7 The Logistic Normal Distribution
  B.8 The Inverse Wishart Distribution
  B.9 The Gumbel Distribution

Bibliography
Author's Biography
Index
|
any_adam_object | 1 |
author | Cohen, Shay |
author_GND | (DE-588)1108783767 |
author_facet | Cohen, Shay |
author_role | aut |
author_sort | Cohen, Shay |
author_variant | s c sc |
building | Verbundindex |
bvnumber | BV045873206 |
classification_rvk | ST 306 |
ctrlnum | (OCoLC)1104916210 (DE-599)HBZHT020045625 |
discipline | Informatik |
edition | Second edition |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01853nam a2200421 cb4500</leader><controlfield tag="001">BV045873206</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20190603 </controlfield><controlfield tag="007">t</controlfield><controlfield tag="008">190513s2019 |||| |||| 00||| eng d</controlfield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9781681735283</subfield><subfield code="c">hardback</subfield><subfield code="9">978-1-68173-528-3</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9781681735269</subfield><subfield code="c">paperback : $79.95</subfield><subfield code="9">978-1-68173-526-9</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)1104916210</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)HBZHT020045625</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">rda</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-29</subfield><subfield code="a">DE-19</subfield><subfield code="a">DE-739</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 306</subfield><subfield code="0">(DE-625)143654:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Cohen, Shay</subfield><subfield code="e">Verfasser</subfield><subfield code="0">(DE-588)1108783767</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Bayesian analysis in natural language processing</subfield><subfield code="c">Shay Cohen (University of Edinburgh)</subfield></datafield><datafield tag="250" ind1=" " ind2=" "><subfield 
code="a">Second edition</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">[San Rafael, California]</subfield><subfield code="b">Morgan & Claypool Publishers</subfield><subfield code="c">[2019]</subfield></datafield><datafield tag="264" ind1=" " ind2="4"><subfield code="c">© 2019</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">xxxi, 311 Seiten</subfield><subfield code="b">Diagramme</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="490" ind1="1" ind2=" "><subfield code="a">Synthesis lectures on human language technologies</subfield><subfield code="v">#41</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Sprachverarbeitung</subfield><subfield code="0">(DE-588)4116579-2</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Bayes-Verfahren</subfield><subfield code="0">(DE-588)4204326-8</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Bayes-Verfahren</subfield><subfield code="0">(DE-588)4204326-8</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="1"><subfield code="a">Sprachverarbeitung</subfield><subfield code="0">(DE-588)4116579-2</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="776" ind1="0" ind2="8"><subfield code="i">Erscheint auch als</subfield><subfield 
code="n">Online-Ausgabe</subfield><subfield code="z">978-1-68173-527-6</subfield></datafield><datafield tag="776" ind1="0" ind2="8"><subfield code="i">Erscheint auch als</subfield><subfield code="n">Online-Ausgabe, epub</subfield><subfield code="z">978-1-68173-529-0</subfield></datafield><datafield tag="830" ind1=" " ind2="0"><subfield code="a">Synthesis lectures on human language technologies</subfield><subfield code="v">#41</subfield><subfield code="w">(DE-604)BV035447238</subfield><subfield code="9">41</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">Digitalisierung UB Passau - ADAM Catalogue Enrichment</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=031256494&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="999" ind1=" " ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-031256494</subfield></datafield></record></collection> |
id | DE-604.BV045873206 |
illustrated | Not Illustrated |
indexdate | 2024-07-10T08:29:03Z |
institution | BVB |
isbn | 9781681735283 9781681735269 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-031256494 |
oclc_num | 1104916210 |
open_access_boolean | |
owner | DE-29 DE-19 DE-BY-UBM DE-739 |
owner_facet | DE-29 DE-19 DE-BY-UBM DE-739 |
physical | xxxi, 311 Seiten Diagramme |
publishDate | 2019 |
publishDateSearch | 2019 |
publishDateSort | 2019 |
publisher | Morgan & Claypool Publishers |
record_format | marc |
series | Synthesis lectures on human language technologies |
series2 | Synthesis lectures on human language technologies |
spelling | Cohen, Shay Verfasser (DE-588)1108783767 aut Bayesian analysis in natural language processing Shay Cohen (University of Edinburgh) Second edition [San Rafael, California] Morgan & Claypool Publishers [2019] © 2019 xxxi, 311 Seiten Diagramme txt rdacontent n rdamedia nc rdacarrier Synthesis lectures on human language technologies #41 Sprachverarbeitung (DE-588)4116579-2 gnd rswk-swf Bayes-Verfahren (DE-588)4204326-8 gnd rswk-swf Bayes-Verfahren (DE-588)4204326-8 s Sprachverarbeitung (DE-588)4116579-2 s DE-604 Erscheint auch als Online-Ausgabe 978-1-68173-527-6 Erscheint auch als Online-Ausgabe, epub 978-1-68173-529-0 Synthesis lectures on human language technologies #41 (DE-604)BV035447238 41 Digitalisierung UB Passau - ADAM Catalogue Enrichment application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=031256494&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Cohen, Shay Bayesian analysis in natural language processing Synthesis lectures on human language technologies Sprachverarbeitung (DE-588)4116579-2 gnd Bayes-Verfahren (DE-588)4204326-8 gnd |
subject_GND | (DE-588)4116579-2 (DE-588)4204326-8 |
title | Bayesian analysis in natural language processing |
title_auth | Bayesian analysis in natural language processing |
title_exact_search | Bayesian analysis in natural language processing |
title_full | Bayesian analysis in natural language processing Shay Cohen (University of Edinburgh) |
title_fullStr | Bayesian analysis in natural language processing Shay Cohen (University of Edinburgh) |
title_full_unstemmed | Bayesian analysis in natural language processing Shay Cohen (University of Edinburgh) |
title_short | Bayesian analysis in natural language processing |
title_sort | bayesian analysis in natural language processing |
topic | Sprachverarbeitung (DE-588)4116579-2 gnd Bayes-Verfahren (DE-588)4204326-8 gnd |
topic_facet | Sprachverarbeitung Bayes-Verfahren |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=031256494&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
volume_link | (DE-604)BV035447238 |
work_keys_str_mv | AT cohenshay bayesiananalysisinnaturallanguageprocessing |