Functional Networks with Applications: A Neural-Based Paradigm
Saved in:
Main authors: Castillo, Enrique; Cobo, Angel; Gutiérrez, José Manuel; Pruneda, Rosa Eva
Format: Electronic e-book
Language: English
Published: Boston, MA : Springer US, 1999
Series: The Springer International Series in Engineering and Computer Science; 473
Subjects: Physics; Statistical Physics, Dynamical Systems and Complexity; Artificial Intelligence (incl. Robotics); Computer-Aided Engineering (CAD, CAE) and Design; Data Structures; Data structures (Computer science); Artificial intelligence; Computer-aided engineering; Statistical physics; Dynamical systems
Online access: BTU01 Full text
Summary: Artificial neural networks have been recognized as a powerful tool for learning and reproducing systems in various fields of application. Neural networks are inspired by the behavior of the brain and consist of one or several layers of neurons, or computing units, connected by links. Each artificial neuron receives an input value from the input layer or from the neurons in the previous layer. It then computes a scalar output from a linear combination of the received inputs using a given scalar function (the activation function), which is assumed to be the same for all neurons. One of the main properties of neural networks is their ability to learn from data. There are two types of learning: structural and parametric. Structural learning consists of learning the topology of the network, that is, the number of layers, the number of neurons in each layer, and which neurons are connected. This process is done by trial and error until a good fit to the data is obtained. Parametric learning consists of learning the weight values for a given topology of the network. Since the neural functions are given, this learning process is achieved by estimating the connection weights from the given information. To this end, an error function is minimized using one of several well-known learning methods, such as the backpropagation algorithm. Unfortunately, for these methods: (a) the function resulting from the learning process has no physical or engineering interpretation, so neural networks are seen as black boxes. An illustrative sketch of the parametric-learning step is given below, after the record details.
Description: 1 online resource (XI, 309 p)
ISBN: 9781461556015
DOI: 10.1007/978-1-4615-5601-5
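The summary describes parametric learning as estimating the connection weights of a fixed topology by minimizing an error function, for example with the backpropagation algorithm. The following minimal sketch illustrates only that idea; it is not code from the book, and the NumPy usage, the 2-3-1 sigmoid topology, the learning rate, and the synthetic XOR-style data are all illustrative assumptions.

```python
# Minimal sketch (assumption, not from the book): parametric learning for a
# fixed one-hidden-layer network, i.e. estimating connection weights by
# minimizing a squared-error function with gradient descent (backpropagation).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # The same activation function is assumed for all neurons, as in the summary.
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic XOR-style data (illustrative assumption).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Fixed topology (structural learning already done): 2 inputs, 3 hidden, 1 output.
W1 = rng.normal(size=(2, 3))   # input -> hidden weights
b1 = np.zeros((1, 3))
W2 = rng.normal(size=(3, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

lr = 0.5
for epoch in range(5000):
    # Forward pass: each neuron applies the activation to a linear
    # combination of its inputs.
    H = sigmoid(X @ W1 + b1)
    out = sigmoid(H @ W2 + b2)

    # Error to be minimized (derivative of the squared-error function).
    err = out - y

    # Backward pass: gradients of the error with respect to the weights.
    d_out = err * out * (1 - out)
    d_H = (d_out @ W2.T) * H * (1 - H)

    W2 -= lr * H.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_H
    b1 -= lr * d_H.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # typically approaches [[0], [1], [1], [0]]
```

The learned weights fit the data but, as the summary notes, carry no physical or engineering interpretation, which is the "black box" criticism the book addresses with functional networks.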
Internal format
MARC
LEADER | 00000nmm a2200000zcb4500 | ||
001 | BV045188032 | ||
003 | DE-604 | ||
005 | 00000000000000.0 | ||
007 | cr|uuu---uuuuu | ||
008 | 180912s1999 |||| o||u| ||||||eng d | ||
020 | |a 9781461556015 |9 978-1-4615-5601-5 | ||
024 | 7 | |a 10.1007/978-1-4615-5601-5 |2 doi | |
035 | |a (ZDB-2-ENG)978-1-4615-5601-5 | ||
035 | |a (OCoLC)1053796908 | ||
035 | |a (DE-599)BVBBV045188032 | ||
040 | |a DE-604 |b ger |e aacr | ||
041 | 0 | |a eng | |
049 | |a DE-634 | ||
082 | 0 | |a 621 |2 23 | |
100 | 1 | |a Castillo, Enrique |e Verfasser |4 aut | |
245 | 1 | 0 | |a Functional Networks with Applications |b A Neural-Based Paradigm |c by Enrique Castillo, Angel Cobo, José Manuel Gutiérrez, Rosa Eva Pruneda |
264 | 1 | |a Boston, MA |b Springer US |c 1999 | |
300 | |a 1 Online-Ressource (XI, 309 p) | ||
336 | |b txt |2 rdacontent | ||
337 | |b c |2 rdamedia | ||
338 | |b cr |2 rdacarrier | ||
490 | 0 | |a The Springer International Series in Engineering and Computer Science |v 473 | |
520 | |a Artificial neural networks have been recognized as a powerful tool for learning and reproducing systems in various fields of application. Neural networks are inspired by the behavior of the brain and consist of one or several layers of neurons, or computing units, connected by links. Each artificial neuron receives an input value from the input layer or from the neurons in the previous layer. It then computes a scalar output from a linear combination of the received inputs using a given scalar function (the activation function), which is assumed to be the same for all neurons. One of the main properties of neural networks is their ability to learn from data. There are two types of learning: structural and parametric. Structural learning consists of learning the topology of the network, that is, the number of layers, the number of neurons in each layer, and which neurons are connected. This process is done by trial and error until a good fit to the data is obtained. Parametric learning consists of learning the weight values for a given topology of the network. Since the neural functions are given, this learning process is achieved by estimating the connection weights from the given information. To this end, an error function is minimized using one of several well-known learning methods, such as the backpropagation algorithm. Unfortunately, for these methods: (a) the function resulting from the learning process has no physical or engineering interpretation, so neural networks are seen as black boxes. | ||
650 | 4 | |a Physics | |
650 | 4 | |a Statistical Physics, Dynamical Systems and Complexity | |
650 | 4 | |a Artificial Intelligence (incl. Robotics) | |
650 | 4 | |a Computer-Aided Engineering (CAD, CAE) and Design | |
650 | 4 | |a Data Structures | |
650 | 4 | |a Physics | |
650 | 4 | |a Data structures (Computer science) | |
650 | 4 | |a Artificial intelligence | |
650 | 4 | |a Computer-aided engineering | |
650 | 4 | |a Statistical physics | |
650 | 4 | |a Dynamical systems | |
700 | 1 | |a Cobo, Angel |4 aut | |
700 | 1 | |a Gutiérrez, José Manuel |4 aut | |
700 | 1 | |a Pruneda, Rosa Eva |4 aut | |
776 | 0 | 8 | |i Erscheint auch als |n Druck-Ausgabe |z 9781461375623 |
856 | 4 | 0 | |u https://doi.org/10.1007/978-1-4615-5601-5 |x Verlag |z URL des Erstveröffentlichers |3 Volltext |
912 | |a ZDB-2-ENG | ||
940 | 1 | |q ZDB-2-ENG_Archiv | |
999 | |a oai:aleph.bib-bvb.de:BVB01-030577209 | ||
966 | e | |u https://doi.org/10.1007/978-1-4615-5601-5 |l BTU01 |p ZDB-2-ENG |q ZDB-2-ENG_Archiv |x Verlag |3 Volltext |
Record in the search index
_version_ | 1804178880820412416 |
any_adam_object | |
author | Castillo, Enrique Cobo, Angel Gutiérrez, José Manuel Pruneda, Rosa Eva |
author_facet | Castillo, Enrique Cobo, Angel Gutiérrez, José Manuel Pruneda, Rosa Eva |
author_role | aut aut aut aut |
author_sort | Castillo, Enrique |
author_variant | e c ec a c ac j m g jm jmg r e p re rep |
building | Verbundindex |
bvnumber | BV045188032 |
collection | ZDB-2-ENG |
ctrlnum | (ZDB-2-ENG)978-1-4615-5601-5 (OCoLC)1053796908 (DE-599)BVBBV045188032 |
dewey-full | 621 |
dewey-hundreds | 600 - Technology (Applied sciences) |
dewey-ones | 621 - Applied physics |
dewey-raw | 621 |
dewey-search | 621 |
dewey-sort | 3621 |
dewey-tens | 620 - Engineering and allied operations |
doi_str_mv | 10.1007/978-1-4615-5601-5 |
format | Electronic eBook |
id | DE-604.BV045188032 |
illustrated | Not Illustrated |
indexdate | 2024-07-10T08:11:00Z |
institution | BVB |
isbn | 9781461556015 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-030577209 |
oclc_num | 1053796908 |
open_access_boolean | |
owner | DE-634 |
owner_facet | DE-634 |
physical | 1 Online-Ressource (XI, 309 p) |
psigel | ZDB-2-ENG ZDB-2-ENG_Archiv ZDB-2-ENG ZDB-2-ENG_Archiv |
publishDate | 1999 |
publishDateSearch | 1999 |
publishDateSort | 1999 |
publisher | Springer US |
record_format | marc |
series2 | The Springer International Series in Engineering and Computer Science |
spellingShingle | Castillo, Enrique Cobo, Angel Gutiérrez, José Manuel Pruneda, Rosa Eva Functional Networks with Applications A Neural-Based Paradigm Physics Statistical Physics, Dynamical Systems and Complexity Artificial Intelligence (incl. Robotics) Computer-Aided Engineering (CAD, CAE) and Design Data Structures Data structures (Computer science) Artificial intelligence Computer-aided engineering Statistical physics Dynamical systems |
title | Functional Networks with Applications A Neural-Based Paradigm |
title_auth | Functional Networks with Applications A Neural-Based Paradigm |
title_exact_search | Functional Networks with Applications A Neural-Based Paradigm |
title_full | Functional Networks with Applications A Neural-Based Paradigm by Enrique Castillo, Angel Cobo, José Manuel Gutiérrez, Rosa Eva Pruneda |
title_fullStr | Functional Networks with Applications A Neural-Based Paradigm by Enrique Castillo, Angel Cobo, José Manuel Gutiérrez, Rosa Eva Pruneda |
title_full_unstemmed | Functional Networks with Applications A Neural-Based Paradigm by Enrique Castillo, Angel Cobo, José Manuel Gutiérrez, Rosa Eva Pruneda |
title_short | Functional Networks with Applications |
title_sort | functional networks with applications a neural based paradigm |
title_sub | A Neural-Based Paradigm |
topic | Physics Statistical Physics, Dynamical Systems and Complexity Artificial Intelligence (incl. Robotics) Computer-Aided Engineering (CAD, CAE) and Design Data Structures Data structures (Computer science) Artificial intelligence Computer-aided engineering Statistical physics Dynamical systems |
topic_facet | Physics Statistical Physics, Dynamical Systems and Complexity Artificial Intelligence (incl. Robotics) Computer-Aided Engineering (CAD, CAE) and Design Data Structures Data structures (Computer science) Artificial intelligence Computer-aided engineering Statistical physics Dynamical systems |
url | https://doi.org/10.1007/978-1-4615-5601-5 |
work_keys_str_mv | AT castilloenrique functionalnetworkswithapplicationsaneuralbasedparadigm AT coboangel functionalnetworkswithapplicationsaneuralbasedparadigm AT gutierrezjosemanuel functionalnetworkswithapplicationsaneuralbasedparadigm AT prunedarosaeva functionalnetworkswithapplicationsaneuralbasedparadigm |