Neural networks theory
Main author: | Galuškin, Aleksandr I.
---|---|
Format: | Book
Language: | English
Published: | Berlin [u.a.]: Springer, 2007
Subjects: | Neuronales Netz
Online access: | Description for readers · Contents text · Cover · Table of contents
Physical description: | XX, 396 S. graph. Darst. 24 cm
ISBN: | 3540481249
MARC
Tag | Ind. 1 | Ind. 2 | Content
---|---|---|---
LEADER | | | 00000nam a2200000zc 4500
001 | | | BV023803045
003 | | | DE-604
005 | | | 20090426000000.0
007 | | | t
008 | | | 080317s2007 d||| |||| 00||| eng d
020 | | | |a 3540481249 |9 3-540-48124-9
035 | | | |a (OCoLC)255150684
035 | | | |a (DE-599)BVBBV023803045
040 | | | |a DE-604 |b ger
041 | 0 | | |a eng
049 | | | |a DE-634 |a DE-83
084 | | | |a ST 301 |0 (DE-625)143651: |2 rvk
100 | 1 | | |a Galuškin, Aleksandr I. |e Verfasser |4 aut
245 | 1 | 0 | |a Neural networks theory |c Alexander I. Galushkin
264 | | 1 | |a Berlin [u.a.] |b Springer |c 2007
300 | | | |a XX, 396 S. |b graph. Darst. |c 24 cm
336 | | | |b txt |2 rdacontent
337 | | | |b n |2 rdamedia
338 | | | |b nc |2 rdacarrier
650 | 0 | 7 | |a Neuronales Netz |0 (DE-588)4226127-2 |2 gnd |9 rswk-swf
689 | 0 | 0 | |a Neuronales Netz |0 (DE-588)4226127-2 |D s
689 | 0 | | |5 DE-604
856 | 4 | | |u http://deposit.dnb.de/cgi-bin/dokserv?id=2857422&prov=M&dok_var=1&dok_ext=htm |3 Beschreibung für Leser
856 | 4 | | |q text/html |u http://deposit.dnb.de/cgi-bin/dokserv?id=2857422&prov=M&dok%5Fvar=1&dok%5Fext=htm |3 Inhaltstext
856 | 4 | | |q text/html |u http://swbplus.bsz-bw.de/bsz258399767cov.htm |3 Cover
856 | 4 | 2 | |m HBZ Datenaustausch |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=017445244&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis
943 | 1 | | |a oai:aleph.bib-bvb.de:BVB01-017445244
Record in the search index
_version_ | 1809403642566934528 |
adam_text |
Contents
Introduction 1
1.1 Neural Computers 1
1.2 Position of Neural Computers in the Set of Large-Powered Computing Facilities . 5
1.3 The Concept of Computer Universalism 8
1.4 Neural Computer Modularity 9
1.5 The Class of Problems Adequate to Neural Computers 10
1.6 Methods of Coefficient Readjustment 12
1.7 Neural Computer Classification 12
1.8 Some Remarks Concerning the Neural Computer Elemental Base 14
1.9 Neural Mathematics - Methods and Algorithms of Problem Solving
Using Neurocomputers 17
1.10 About Neural Networks 21
1.10.1 Neural Network Structures 23
1.10.2 Investigation of Neural Network Input Signal Characteristics 24
1.10.3 About the Selection of Criteria for Primary Neural Network Optimization 24
1.10.4 Analysis of Open-Loop Neural Networks 25
1.10.5 Algorithms for a Multivariable Functional Extremum Search
and Design of Adaptation Algorithms in Neural Networks 25
1.10.6 Investigation of Neural Network Adaptation Algorithms 27
1.10.7 Multilayer Neural Networks with Flexible Structure 28
1.10.8 Informative Feature Selection in Multilayer Neural Networks 28
1.10.9 Investigation of Neural Network Reliability 29
1.10.10 Neural Network Diagnostics 29
1.11 Conclusions 30
Literature 31
Appendix 31
A.1 Theory of Multilayer Neural Networks 31
A.2 Neural Computer Implementation 32
A.3 Neural Computer Elemental Base 32
Part I • The Structure of Neural Networks 33
1 Transfer from the Logical Basis of Boolean Elements "And, Or, Not"
to the Threshold Logical Basis 35
1.1 Linear Threshold Element (Neuron) 35
1.2 Multi-Threshold Logics 37
1.3 Continuous Logic 38
1.4 Particular Forms of Activation Function 39
Literature 40
2 Qualitative Characteristics of Neural Network Architectures 43
2.1 Particular Types of Neural Network Architectures 43
2.2 Multilayer Neural Networks with Sequential Connections 45
2.3 Structural and Symbolic Description of Multilayer Neural Networks 47
Literature 52
3 Optimization of Cross Connection Multilayer Neural Network Structure 53
3.1 About the Problem Complexity Criterion 53
3.2 One-Dimensional Variant of the Neural Network with Cross Connections 54
3.3 Calculation of Upper and Lower Estimation of the Number of Regions 55
3.4 Particular Optimization Problem 57
3.5 Structural Optimization by Some Main Topological Characteristics 60
3.6 Optimization of a Multilayer Neural Network Structure with Kp Solutions 64
Literature 66
4 Continual Neural Networks 67
4.1 Neurons with Continuum Input Features 67
4.2 Continuum of Neurons in the Layer 68
4.3 Continuum Neurons in the Layer and Discrete Feature Set 68
4.4 Classification of Continuum Neuron Layer Models 69
4.4.1 Discrete Set of Neurons 69
4.4.2 One-Dimensional and Two-Dimensional m2 Feature Space 69
4.4.3 Continuum of Features - One-Dimensional m1 for Several Channels . 71
4.4.4 Feature Continuum - Two-Dimensional m1 72
4.4.5 Neuron Layer with a Continuum of Output Values 72
Literature 74
Part II • Optimal Models of Neural Networks 75
5 Investigation of Neural Network Input Signal Characteristics 77
5.1 Problem Statement 77
5.2 Joint Probability Distribution of the Input Signal for Two Pattern Classes 79
5.3 Joint Distribution Law for the Input Signal Probabilities
in the Case of K Classes of Patterns 84
Literature 87
6 Design of Neural Network Optimal Models 89
6.1 General Structure of the Optimal Model 89
6.2 Analytical Representation of Divisional Surfaces in Typical Neural Networks . 90
6.3 Optimal Neural Network Model for Multidimensional Signals e(n) and y(n) . 110
6.4 A Priori Information about the Input Signal in the Self-Learning Mode 113
6.5 About Neural Network Primary Optimization Criteria
in the Self-Learning Mode 114
6.6 Optimal Neural Network Models in the Self-Learning Mode
and Arbitrary Teacher Qualification 116
Literature 119
7 Analysis of the Open-Loop Neural Networks 121
7.1 Distribution Laws of Analogous and Discrete Neural Network Errors 121
7.1.1 Neuron with Two Solutions 121
7.1.2 Neuron with a Solution Continuum 124
7.1.3 Analysis of a Neuron with K Solutions 126
7.1.4 Analysis of a Pattern Recognition System
with a Nonlinear Divisional Surface 128
7.2 Selection of the Secondary Optimization Functional 129
7.3 About Selection of the Secondary Optimization Functional
in the "Adalin" System 131
7.4 Development of the Secondary Optimization Functionals
Corresponding to the Given Primary Optimization Criterion 132
7.4.1 The Average Risk Function Minimum Criterion 132
7.4.2 Minimum Criterion for R under the Condition p1r1 = p2r2 133
7.4.3 The Minimum Criterion for R under the Condition p1r1 = a = const. 134
7.5 Neural Network Continuum Models 135
7.5.1 Neural Network with a Solution Continuum -
Two Pattern Classes 135
7.5.2 Neural Network with a Solution Continuum -
Continuum of Pattern Classes 137
7.5.3 Neural Network with Kp Solutions -
K Pattern Classes 138
7.5.4 Neural Network with N* Output Channels -
Ko Gradations in Each Class 139
7.5.5 Neural Network with N* Output Channels -
Neural Network Solution Continuum 139
7.6 Neural Network in the Self-Learning Mode and Arbitrary Teacher Qualification . 140
Literature 141
8 Development of Multivariable Function Extremum Search Algorithms 143
8.1 Procedure of the Secondary Optimization Functional Extremum Search
in Multilayer Neural Networks 143
8.2 Analysis of the Iteration Method for the Multivariable Function
Extremum Search 143
8.3 About the Stochastic Approximation Method 146
8.4 Iteration Methods for Multivariable Function Extremum Search
in the Case of Equality-Type Constraints upon Variables 146
8.4.1 Search Algorithm 147
8.4.2 Analysis of the Matrix of the Second Derivatives
of the Lagrange Function 148
8.4.3 Operation Speed Optimization for the Extremum Search
Iteration Procedure in the Case of Equality-Type Constraints 148
8.4.4 Optimal Operation Speed under Constraints (8.6) 149
8.4.5 The Case of Constraints of Equality Type That Can Be Solved 149
8.4.6 Iteration Process Stability under Equality-Type Constraints 150
8.4.7 Convergence of the Iteration Search Method
under the Equality-Type Constraints 151
8.5 Iteration Extremum Search Methods for Multivariable Functions
under Inequality-Type Constraints 152
8.5.1 Conditions of Optimality 152
8.5.2 Algorithm of Extremum Search in the Case of Inequality-Type
Constraints 153
8.6 Algorithm of Random Search of Local and Global Extrema
for Multivariable Functions 154
8.7 Development of the Neural Network Adaptation Algorithms with the Use of
Estimations of the Second Order Derivatives of the Secondary Optimization
Functional 155
8.7.1 Development of Search Algorithms 155
8.7.2 One-Dimensional Case 158
Literature 158
Part III • Adaptive Neural Networks 161
9 Neural Network Adjustment Algorithms 163
9.1 Problem Statement 163
9.2 Neuron with Two-Solution Continuums 164
9.3 Two-Layer Neural Networks 167
9.4 Multilayer Neural Networks with Solution Continuum Neurons 169
9.5 Design of Neural Networks with Closed Cycle Adjustment
under Constraints upon Variables 170
9.6 Implementation of Primary Optimization Criteria for Neurons
with Two Solutions 173
9.7 Implementation of Minimum Average Risk Function Criterion
for Neurons with Continuum Solutions and Kp Solutions 175
9.8 Implementation of the Minimum Average Risk Function Criterion
for Neural Networks with N" Output Channels (Neuron Layer) 177
9.9 Implementation of the Minimum Average Risk Function Criterion
for Multilayer Neural Networks 178
9.10 Development of Closed-Loop Neural Networks of Non-Stationary Patterns . 180
9.11 Development of Closed-Cycle Adjustable Neural Networks
with Cross and Backward Connections 182
9.12 Development of Closed-Loop Neural Networks in the Learning Modes
with Arbitrary Teacher Qualification 183
9.13 Expressions for the Estimations of the Second Order Derivatives
of the Secondary Optimization Functional 185
Literature 187
10 Adjustment of Continuum Neural Networks 189
10.1 Adjustment of a Neuron with a Feature Continuum 190
10.2 Adjustment of the Continuum Neuron Layer 190
10.3 Selection of the Parameter Matrix for the Learning Procedure
of the Continuum Neuron Layer on the Basis of the Random Sample Data 190
10.4 Selection of the Parameter Matrix lt(i,j) for the Learning Procedure
of the Neuron with a Feature Continuum on the Basis of the Random
Sample Data 193
10.5 Characteristic Properties of the Two-Layer Continuum Neural Network
Adjustment Algorithm 195
10.6 Three Variants of Implementation of the Continuum Neuron Layer
Weighting Functions and Corresponding Learning Procedures 195
10.7 Learning Algorithm with a2g Secondary Optimization Functional
(the Five-Feature Space) for the Two-Layer Continuum Neural Network 198
10.7.1 Learning Algorithm for the Second Layer
(Feature Continuum Neuron) 198
10.7.2 Learning Algorithm for the First Layer
(Continuum Neuron Layer) 199
10.8 Continuum Neuron Layer with Piecewise Constant Weighting Functions 200
10.8.1 Open-Loop Layer Structure 200
10.8.2 Recurrent Adjustment Procedure for the Piecewise Constant
Weighting Functions 201
10.8.3 About Matrix K*(i) Estimation 202
10.9 Continuum Neuron Layer with Piecewise Linear Weighting Functions 202
10.9.1 Open-Loop Structure of the Neuron Layer 202
10.9.2 Recurrent Adjustment Procedure for the Piecewise Linear
Weighting Functions 203
10.10 Continuum Neural Network Layer with Piecewise Constant
Weighting Functions (the Case of Fixed "Footsteps") 205
10.10.1 Open-Loop Layer Structure 205
10.10.2 Recurrent Adjustment Procedure for Piecewise Constant
Weighting Functions with Variable Interval Lengths fs 205
Literature 206
11 Selection of Initial Conditions During Neural Network Adjustment -
Typical Neural Network Input Signals 207
11.1 About Selection Methods for Initial Conditions 207
11.2 Algorithm of Deterministic Selection of the Initial Conditions
in the Adjustment Algorithms for Multilayer Neural Networks 208
11.3 Selection of Initial Conditions in Multilayer Neural Networks 212
11.4 Initial Condition Formation for Neural Network Coefficient Setting
in Different Problems of Optimization 216
11.4.1 Linear Equality Systems 217
11.4.2 Linear Inequality Systems 217
11.4.3 Approximation and Extrapolation of Functions 218
11.4.4 Pattern Recognition 218
11.4.5 Clusterization 220
11.4.6 Traveling Salesman Problem 220
11.4.7 Dynamic System Modelling 220
11.4.8 Conclusion 221
11.5 Typical Input Signal of Multilayer Neural Networks 221
Literature 222
12 Analysis of Closed-Loop Multilayer Neural Networks 223
12.1 Problem Statement for the Synthesis of the Multilayer Neural Networks
Adjusted in the Closed Cycle 223
12.2 Investigation of the Neuron Under the Multi-Modal Distribution
of the Input Signal 224
12.2.1 One-Dimensional Case - Search Adjustment Algorithm 224
12.2.2 Multidimensional Case - Analytical Adjustment Algorithm 226
12.3 Investigation of Dynamics for the Neural Networks of Particular Form
for the Non-Stationary Pattern Recognition 231
12.4 Dynamics of the Three-Layer Neural Network in the Learning Mode 235
12.5 Investigation of the Particular Neural Network
with Backward Connections 239
12.6 Dynamics of One-Layer Neural Networks in the Learning Mode 242
12.6.1 Neural Network with the Search
of the Distribution Mode Centers f(x) 242
12.6.2 Neural Network with N* Output Channels 245
12.6.3 Neuron with Kp Solutions 248
12.7 Two-Layer Neural Network in the Self-Learning Mode 250
12.8 About Some Engineering Methods for the Selection of Matrix Parameters
in the Multilayer Neural Network Closed Cycle Adjustment Algorithms 257
12.9 Design of the Multilayer Neural Network for the Matrix Inversion Problem . 258
12.10 Design of the Multilayer Neural Network for the Number Transformation
from the Binary System into the Decimal One 261
12.11 Investigation of the Multilayer Neural Network
under the Arbitrary Teacher Qualification 262
12.12 Analytical Methods of Investigations of the Neural Network
Closed Cycle Adjustment 263
Literature 272
13 Synthesis of Multilayer Neural Networks with Flexible Structure 273
13.1 Sequential Learning Algorithm for the First Neuron Layer
of the Multilayer Neural Network 273
13.2 Learning Algorithm for the First Neuron Layer of the Multilayer
Neural Network Using the Method of Random Search of Local
and Global Function Extrema 277
13.3 Analysis of Algorithm Convergence under the Hyperplane
Number Increase 280
13.4 Learning Algorithms for the Second Layer Neurons
of the Two-Layer Neural Network 283
13.4.1 Condition of the Logical Function e(y) Realizability
Using One Neuron 283
13.4.2 Synthesis of a Neuron by the Functional Minimization Method 285
13.4.3 Neuron Synthesis by the Threshold Function Tables 290
13.5 Learning Algorithm for Neurons of the Second and Third Layers
in the Three-Layer Neural Network 290
13.6 General Methods of the Multilayer Neural Network Successive Synthesis 292
13.7 Learning Method for the First-Layer Neurons of a Multilayer
Neural Network with a Feature Continuum 292
13.8 Application of the Adjustment Algorithm of the Multilayer Neural Networks
with Flexible Structure for the Problem of Initial Condition Selection 293
13.9 About the Self-Learning Algorithm for Multilayer Neural Networks
with Flexible Structure 294
Literature 294
14 Informative Feature Selection in Multilayer Neural Networks 295
14.1 Statement of the Informative Feature Selection Problem
in the Learning Mode 295
14.2 About Structural Methods for the Informative Feature Selection
in the Multilayer Neural Networks with Fixed Structure 297
14.3 Selection of the Initial Space Informative Features Using Multilayer Neural
Networks with Sequential Algorithms of the First-Layer Neuron Adjustment. 299
14.4 Neuron Number Minimization 300
14.5 About the Informative Feature Selection for Multilayer Neural Networks
in the Self-Learning Mode 302
Literature 302
Part IV • Neural Network Reliability and Diagnostics 303
15 Neural Network Reliability 305
15.1 Methods for the Neural Network Functional Reliability Investigation 305
15.2 Investigation of Functional Reliability of Restoring Organs Implemented
in the Form of Multilayer Neural Networks 306
15.3 Investigation of Multilayer Neural Network's Functional Reliability 308
15.4 Investigation of the Neural Network's Parametrical Reliability 309
15.5 Investigation of the Multilayer Neural Network's Functional Reliability
in the Case of Catastrophic Failures 317
Literature 318
16 Neural Network Diagnostics 321
16.1 Neural Network State Graph - The Main Notions and Definitions 322
16.2 Algorithm of Failure Localization in the Neural Networks 323
16.3 Algorithm of the Minimum Test Design for the Failures
of the Logical Constant Type at the Neuron Outputs 331
16.4 Method of the Neural Network Adaptive Failure Diagnostics 332
Literature 338
Part V • Summary 339
17 Methods of Problem Solving in the Neural Network Logical Basis 341
17.1 Neuromathematics - A New Perspective Part of Computational Mathematics . 341
17.2 Neural Network Theory - A Logical Basis for the Development
of the Neural Network Problem Solution Algorithms 343
17.3 Selection of the Problems Adequate to the Neural Network Logical Basis 344
17.4 The General Structure of the Program Package for Problem Solution
in the Neural Network Logical Basis 349
17.5 Multilayer Neural Networks with Flexible Structure 350
17.6 Neural Network with Fixed Structure 352
17.6.1 Generation of the Input Signal of the Neural Network 352
17.6.2 The Multilayer Neural Network Output Signal Generation 355
17.6.3 Formation of the Primary Optimization Criteria 355
17.6.4 Selection of the Open Neural Network Structure 356
17.6.5 Remarks about the Selection of the Open Neural Network Structure
that is Adequate to the Class of Solution Tasks 356
17.6.6 Remarks about the Activation Function Selection 358
17.6.7 Selection of the Multilayer Neural Network Structure
According to its Hardware Implementation Technology 359
17.6.8 Generation of the Secondary Optimization Functional
in the Multilayer Neural Networks 360
17.6.9 Generation of the Algorithm of the Search Procedure
for the Secondary Optimization Functional Extremum 360
17.6.10 Formation of the Adaptation Algorithms
in the Multilayer Neural Networks 364
17.7 Verification of the Adjusted Multilayer Neural Network 364
17.8 Elaboration of the Plan of Experiments 365
17.9 About the Importance of the Unification of Designations in the Process
of Synthesis of the Neural Network Adjustment Algorithms 367
17.10 About Myths in Neural Network Theory 368
17.11 Conclusion 368
References 376
Conclusion 377
Literature 379
Author's Publications on Neural Network Theory 381
Index 391 |
any_adam_object | 1 |
author | Galuškin, Aleksandr I. |
author_facet | Galuškin, Aleksandr I. |
author_role | aut |
author_sort | Galuškin, Aleksandr I. |
author_variant | a i g ai aig |
building | Verbundindex |
bvnumber | BV023803045 |
classification_rvk | ST 301 |
ctrlnum | (OCoLC)255150684 (DE-599)BVBBV023803045 |
discipline | Informatik |
format | Book |
id | DE-604.BV023803045 |
illustrated | Illustrated |
indexdate | 2024-09-06T00:16:21Z |
institution | BVB |
isbn | 3540481249 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-017445244 |
oclc_num | 255150684 |
open_access_boolean | |
owner | DE-634 DE-83 |
owner_facet | DE-634 DE-83 |
physical | XX, 396 S. graph. Darst. 24 cm |
publishDate | 2007 |
publishDateSearch | 2007 |
publishDateSort | 2007 |
publisher | Springer |
record_format | marc |
spelling | Galuškin, Aleksandr I. Verfasser aut Neural networks theory Alexander I. Galushkin Berlin [u.a.] Springer 2007 XX, 396 S. graph. Darst. 24 cm txt rdacontent n rdamedia nc rdacarrier Neuronales Netz (DE-588)4226127-2 gnd rswk-swf Neuronales Netz (DE-588)4226127-2 s DE-604 http://deposit.dnb.de/cgi-bin/dokserv?id=2857422&prov=M&dok_var=1&dok_ext=htm Beschreibung für Leser text/html http://deposit.dnb.de/cgi-bin/dokserv?id=2857422&prov=M&dok%5Fvar=1&dok%5Fext=htm Inhaltstext text/html http://swbplus.bsz-bw.de/bsz258399767cov.htm Cover HBZ Datenaustausch application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=017445244&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Galuškin, Aleksandr I. Neural networks theory Neuronales Netz (DE-588)4226127-2 gnd |
subject_GND | (DE-588)4226127-2 |
title | Neural networks theory |
title_auth | Neural networks theory |
title_exact_search | Neural networks theory |
title_full | Neural networks theory Alexander I. Galushkin |
title_fullStr | Neural networks theory Alexander I. Galushkin |
title_full_unstemmed | Neural networks theory Alexander I. Galushkin |
title_short | Neural networks theory |
title_sort | neural networks theory |
topic | Neuronales Netz (DE-588)4226127-2 gnd |
topic_facet | Neuronales Netz |
url | http://deposit.dnb.de/cgi-bin/dokserv?id=2857422&prov=M&dok_var=1&dok_ext=htm http://deposit.dnb.de/cgi-bin/dokserv?id=2857422&prov=M&dok%5Fvar=1&dok%5Fext=htm http://swbplus.bsz-bw.de/bsz258399767cov.htm http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=017445244&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
work_keys_str_mv | AT galuskinaleksandri neuralnetworkstheory |