The informational complexity of learning: perspectives on neural networks and generative grammar
Saved in:
Main Author: | Niyogi, Partha |
---|---|
Format: | Book |
Language: | English |
Published: |
Boston, Mass. [u.a.]
Kluwer Acad. Publ.
1998
|
Subjects: | |
Online Access: | Table of Contents |
Summary: | Among other topics, The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar brings together two important but very different learning problems within the same analytical framework. The first concerns the problem of learning functional mappings using neural networks; the second concerns learning natural language grammars in the principles and parameters tradition of Chomsky. These two learning problems are seemingly very different. Neural networks are real-valued, infinite-dimensional, continuous mappings. Grammars, on the other hand, are Boolean-valued, finite-dimensional, discrete (symbolic) mappings. Furthermore, the research communities that work in the two areas almost never overlap. The book's objective is to bridge this gap. It uses the formal techniques developed in statistical learning theory and theoretical computer science over the last decade to analyze both kinds of learning problems. By asking the same question - how much information does it take to learn? - of both problems, it highlights their similarities and differences. Specific results include model selection in neural networks, active learning, language learning, and evolutionary models of language change. |
Physical Description: | XXI, 224 pp., with diagrams |
ISBN: | 0792380819 |
Internal format
MARC
LEADER | 00000nam a2200000 c 4500 | ||
---|---|---|---|
001 | BV011833233 | ||
003 | DE-604 | ||
005 | 19980709 | ||
007 | t | ||
008 | 980317s1998 d||| m||| 00||| eng d | ||
020 | |a 0792380819 |9 0-7923-8081-9 | ||
035 | |a (OCoLC)37755233 | ||
035 | |a (DE-599)BVBBV011833233 | ||
040 | |a DE-604 |b ger |e rakddb | ||
041 | 0 | |a eng | |
049 | |a DE-739 |a DE-12 |a DE-19 | ||
050 | 0 | |a P98 | |
082 | 0 | |a 410/.285 |2 21 | |
084 | |a ER 900 |0 (DE-625)27772: |2 rvk | ||
100 | 1 | |a Niyogi, Partha |e Verfasser |4 aut | |
245 | 1 | 0 | |a The informational complexity of learning |b perspectives on neural networks and generative grammar |c Partha Niyogi |
264 | 1 | |a Boston, Mass. [u.a.] |b Kluwer Acad. Publ. |c 1998 | |
300 | |a XXI, 224 S. |b graph. Darst. | ||
336 | |b txt |2 rdacontent | ||
337 | |b n |2 rdamedia | ||
338 | |b nc |2 rdacarrier | ||
520 | 3 | |a Among other topics, The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar brings together two important but very different learning problems within the same analytical framework. The first concerns the problem of learning functional mappings using neural networks, followed by learning natural language grammars in the principles and parameters tradition of Chomsky. These two learning problems are seemingly very different. Neural networks are real-valued, infinite-dimensional, continuous mappings. On the other hand, grammars are boolean-valued, finite-dimensional, discrete (symbolic) mappings. Furthermore the research communities that work in the two areas almost never overlap. The book's objective is to bridge this gap. It uses the formal techniques developed in statistical learning theory and theoretical computer science over the last decade to analyze both kinds of learning problems. By asking the same question - how much information does it take to learn - of both problems, it highlights their similarities and differences. Specific results include model selection in neural networks, active learning, language learning and evolutionary models of language change. | |
650 | 7 | |a Acquisition des connaissances (systèmes experts) |2 ram | |
650 | 7 | |a Analyse automatique (linguistique) |2 ram | |
650 | 7 | |a Apprentissage automatique |2 ram | |
650 | 7 | |a Computerlinguïstiek |2 gtt | |
650 | 7 | |a Grammaire générative |2 ram | |
650 | 7 | |a Intelligence artificielle |2 ram | |
650 | 7 | |a Kunstmatige intelligentie |2 gtt | |
650 | 7 | |a Neurale netwerken |2 gtt | |
650 | 7 | |a Réseaux neuronaux (informatique) |2 ram | |
650 | 7 | |a Taalverwerving |2 gtt | |
650 | 4 | |a Künstliche Intelligenz | |
650 | 4 | |a Computational linguistics | |
650 | 4 | |a Language acquisition | |
650 | 4 | |a Linguistic change | |
650 | 4 | |a Machine learning | |
650 | 4 | |a Neural networks (Computer science) | |
650 | 0 | 7 | |a Spracherwerb |0 (DE-588)4056458-7 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Generative Grammatik |0 (DE-588)4113707-3 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Neuronales Netz |0 (DE-588)4226127-2 |2 gnd |9 rswk-swf |
655 | 7 | |0 (DE-588)4113937-9 |a Hochschulschrift |2 gnd-content | |
689 | 0 | 0 | |a Generative Grammatik |0 (DE-588)4113707-3 |D s |
689 | 0 | 1 | |a Neuronales Netz |0 (DE-588)4226127-2 |D s |
689 | 0 | 2 | |a Spracherwerb |0 (DE-588)4056458-7 |D s |
689 | 0 | |5 DE-604 | |
856 | 4 | 2 | |m HEBIS Datenaustausch |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=007991245&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis |
999 | |a oai:aleph.bib-bvb.de:BVB01-007991245 |
THE INFORMATIONAL COMPLEXITY OF LEARNING
Perspectives on Neural Networks and Generative Grammar

PARTHA NIYOGI
Massachusetts Institute of Technology
Cambridge, MA
Kluwer Academic Publishers
Boston/Dordrecht/London
Contents
List of Figures ix
Foreword xv
Preface xix
Acknowledgments xxi
1 INTRODUCTION 1
1.1 The Components of a Learning Paradigm 3
1.1.1 Concepts, Hypotheses, and Learners 3
1.1.2 Generalization, Learnability, Successful Learning 6
1.1.3 Informational Complexity 7
1.2 Parametric Hypothesis Spaces 11
1.3 Technical Contents and Major Contributions 13
1.3.1 A Final Word 19
2 GENERALIZATION ERROR FOR NEURAL NETS 21
2.1 Introduction 21
2.2 Definitions and Statement of the Problem 24
2.2.1 Random Variables and Probability Distributions 24
2.2.2 Learning from Examples and Estimators 25
2.2.3 The Expected Risk and the Regression Function 26
2.2.4 The Empirical Risk 28
2.2.5 The Problem 28
2.2.6 Bounding the Generalization Error 30
2.2.7 A Note on Models and Model Complexity 33
2.3 Stating the Problem for Radial Basis Functions 34
2.4 Main Result 36
2.5 Remarks 36
2.5.1 Observations on the Main Result 36
2.5.2 Extensions 37
2.5.3 Connections with Other Results 39
2.6 Implications of the Theorem in Practice: Putting In the Numbers 40
2.6.1 Rate of Growth of n for Guaranteed Convergence 40
2.6.2 Optimal Choice of n 41
2.6.3 Experiments 45
2.7 Conclusion 50
2-A Notations 50
2-B A Useful Decomposition of the Expected Risk 56
2-C A Useful Inequality 56
2-D Proof of the Main Theorem 57
2-D.1 Bounding the Approximation Error 58
2-D.2 Bounding the Estimation Error 60
2-D.3 Bounding the Generalization Error 72
3 ACTIVE LEARNING 75
3.1 A General Framework For Active Approximation 77
3.1.1 Preliminaries 77
3.1.2 The Problem of Collecting Examples 80
3.1.3 In Context 83
3.2 Example 1: A Class of Monotonically Increasing Bounded Functions 86
3.2.1 Lower Bound for Passive Learning 87
3.2.2 Active Learning Algorithms 88
3.2.2.1 Derivation of an Optimal Sampling Strategy 88
3.2.3 Empirical Simulations, and Other Investigations 94
3.2.3.1 Distribution of Points Selected 94
3.2.3.2 Classical Optimal Recovery 95
3.2.3.3 Error Rates and Sample Complexities for Some Arbitrary Functions: Some Simulations 97
3.3 Example 2: A Class of Functions with Bounded First Derivative 100
3.3.1 Lower Bounds 102
3.3.2 Active Learning Algorithms 105
3.3.2.1 Derivation of an Optimal Sampling Strategy 105
3.3.3 Some Simulations 110
3.3.3.1 Distribution of Points Selected 110
3.3.3.2 Error Rates 113
3.4 Conclusions, Extensions, and Open Problems 115
3.5 A Simple Example 117
3.6 Generalizations 119
3.6.1 Localized Function Classes 119
3.6.2 The General ε-focusing Strategy 120
3.6.3 Generalizations and Open Problems 122
4 LANGUAGE LEARNING 125
4.1 Language Learning and The Poverty of Stimulus 126
4.2 Constrained Grammars: Principles and Parameters 128
4.2.1 Example: A 3-parameter System from Syntax 129
4.2.2 Example: Parameterized Metrical Stress in Phonology 132
4.3 Learning in the Principles and Parameters Framework 134
4.4 Formal Analysis of the Triggering Learning Algorithm 137
4.4.1 Background 138
4.4.2 The Markov Formulation 139
4.4.2.1 Parameterized Grammars and their Corresponding Markov Chains 139
4.4.2.2 Markov Chain Criteria for Learnability 140
4.4.2.3 The Markov Chain for the 3-parameter Example 143
4.4.3 Derivation of the Transition Probabilities for the Markov TLA Structure
4.4.3.1 Formalization 145
4.4.3.2 Additional Properties of the Learning System 147
4.5 Characterizing Convergence Times for the Markov Chain Model 148
4.5.1 Some Transition Matrices and Their Convergence Curves 148
4.5.2 Absorption Times 152
4.5.3 Eigenvalue Rates of Convergence 153
4.5.3.1 Eigenvalues and Eigenvectors 153
4.5.3.2 Representation of Tk 154
4.5.3.3 Initial Conditions and Limiting Distributions 155
4.5.3.4 Rate of Convergence 156
4.5.3.5 Transition Matrix Recipes 156
4.6 Exploring Other Points 157
4.6.1 Changing the Algorithm 157
4.6.2 Distributional Assumptions 159
4.6.3 Natural Distributions: CHILDES CORPUS 160
4.7 Batch Learning Upper and Lower Bounds: An Aside 162
4.8 Conclusions, Open Questions, and Future Directions 164
4-A Unembedded Sentences For Parametric Grammars 167
4-B Memoryless Algorithms and Markov Chains 167
4-C Proof of Learnability Theorem 168
4-C.1 Markov State Terminology 168
4-C.2 Canonical Decomposition 169
4-D Formal Proof 170
5 LANGUAGE CHANGE 173
5.1 Introduction 173
5.2 Language Change in Parametric Systems 181
5.3 Example 1: A Three Parameter System 182
5.3.1 Starting with Homogeneous Populations 183
5.3.1.1 A = TLA; Pi = Uniform; Finite Sample = 128 183
5.3.1.2 A = Greedy, No S.V.; Pi = Uniform; Finite Sample = 128 186
5.3.1.3 A = a) R.W., b) S.V. only; Pi = Uniform; Finite Sample = 128 187
5.3.1.4 Rates of Change 188
5.3.2 Non-homogeneous Populations: Phase-Space Plots 192
5.3.2.1 Phase-Space Plots: Grammatical Trajectories 193
5.3.2.2 Issues of Stability 194
5.4 Example 2: The Case of Modern French 196
5.4.1 The Parametric Subspace and Data 197
5.4.2 The Case of Diachronic Syntax Change in French 198
5.4.3 Some Dynamical System Simulations 199
5.4.3.1 Homogeneous Populations [Initial-Old French] 199
5.4.3.2 Heterogeneous Populations (Mixtures) 201
5.5 Conclusions 203
6 CONCLUSIONS 207
6.1 Emergent Themes 208
6.2 Extensions 210
6.3 A Concluding Note 212
References 213