Neural networks and pattern recognition /
Saved in:
Other Authors: Omidvar, Omid; Dayhoff, Judith E.
Format: Electronic eBook
Language: English
Published: San Diego, Calif. : Academic Press, ©1998.
Subjects: Neural networks (Computer science); Pattern recognition systems
Online Access: Full text (EBSCOhost); Full text (ScienceDirect)
Summary: This book is one of the most up-to-date and cutting-edge texts available on the rapidly growing application area of neural networks. Neural Networks and Pattern Recognition focuses on the use of neural networks in pattern recognition, a very important application area for neural networks technology. The contributors are widely known and highly respected researchers and practitioners in the field. Key Features * Features neural network architectures on the cutting edge of neural network research * Brings together highly innovative ideas on dynamical neural networks * Includes articles written by authors prominent in the neural networks research community * Provides an authoritative, technically correct presentation of each specific technical area.
Description: 1 online resource (xvi, 351 pages) : illustrations
Bibliography: Includes bibliographical references and index.
ISBN: 9780125264204 0125264208 9780080512617 0080512615 128103343X 9781281033437 9786611033439 6611033432
Internal format
MARC
LEADER | 00000cam a2200000 a 4500 | ||
001 | ZDB-4-EBA-ocn162128691 | ||
003 | OCoLC | ||
005 | 20241004212047.0 | ||
006 | m o d | ||
007 | cr cn||||||||| | ||
008 | 070802s1998 caua ob 001 0 eng d | ||
040 | |a OPELS |b eng |e pn |c OPELS |d OCLCQ |d N$T |d YDXCP |d IDEBK |d E7B |d UMI |d DEBSZ |d OCLCQ |d OCLCF |d OCLCQ |d COO |d AGLDB |d OCLCQ |d STF |d D6H |d VTS |d CEF |d OCLCQ |d WYU |d LEAUB |d OCLCQ |d K6U |d VT2 |d VLY |d OCLCQ |d OCLCO |d M8D |d DST |d OCLCQ |d OCLCO |d OCLCL |d OCLCQ | ||
019 | |a 181032110 |a 647695983 |a 820029257 |a 823828686 |a 823898423 |a 824089712 |a 824136313 |a 1035706101 |a 1062924678 |a 1156366188 |a 1162012201 |a 1241775171 |a 1300637834 | ||
020 | |a 9780125264204 | ||
020 | |a 0125264208 | ||
020 | |a 9780080512617 |q (electronic bk.) | ||
020 | |a 0080512615 |q (electronic bk.) | ||
020 | |a 128103343X | ||
020 | |a 9781281033437 | ||
020 | |a 9786611033439 | ||
020 | |a 6611033432 | ||
035 | |a (OCoLC)162128691 |z (OCoLC)181032110 |z (OCoLC)647695983 |z (OCoLC)820029257 |z (OCoLC)823828686 |z (OCoLC)823898423 |z (OCoLC)824089712 |z (OCoLC)824136313 |z (OCoLC)1035706101 |z (OCoLC)1062924678 |z (OCoLC)1156366188 |z (OCoLC)1162012201 |z (OCoLC)1241775171 |z (OCoLC)1300637834 | ||
037 | |a 77584:77584 |b Elsevier Science & Technology |n http://www.sciencedirect.com | ||
050 | 4 | |a QA76.87 |b .O45 1998eb | |
072 | 7 | |a COM |x 044000 |2 bisacsh | |
072 | 7 | |a UXCR |2 bicssc | |
082 | 7 | |a 006.3/2 |2 22 | |
049 | |a MAIN | ||
245 | 0 | 0 | |a Neural networks and pattern recognition / |c edited by Omid Omidvar, Judith Dayhoff. |
260 | |a San Diego, Calif. : |b Academic Press, |c ©1998. | ||
300 | |a 1 online resource (xvi, 351 pages) : |b illustrations | ||
336 | |a text |b txt |2 rdacontent | ||
337 | |a computer |b c |2 rdamedia | ||
338 | |a online resource |b cr |2 rdacarrier | ||
520 | |a This book is one of the most up-to-date and cutting-edge texts available on the rapidly growing application area of neural networks. Neural Networks and Pattern Recognition focuses on the use of neural networks in pattern recognition, a very important application area for neural networks technology. The contributors are widely known and highly respected researchers and practitioners in the field. Key Features * Features neural network architectures on the cutting edge of neural network research * Brings together highly innovative ideas on dynamical neural networks * Includes articles written by authors prominent in the neural networks research community * Provides an authoritative, technically correct presentation of each specific technical area. | ||
505 | 0 | |a (Chapter Headings) Preface. Contributors. J.L. Johnson, H. Ranganath, G. Kuntimad, and H.J. Caulfield, Pulse-Coupled Neural Networks. H. Li and J. Wang, A Neural Network Model for Optical Flow Computation. F. Unal and N. Tepedelenlioglu, Temporal Pattern Matching Using an Artificial Neural Network. J. Dayhoff, P. Palmadesso, F. Richards, and D.-T. Lin, Patterns of Dynamic Activity and Timing in Neural Network Processing. J. Ghosh, H.-J. Chang, and K. Liano, A Macroscopic Model of Oscillation in Ensembles of Inhibitory and Excitatory Neurons. P. Tiňo, B. Horne, C.L. Giles, and P. Collingwood, Finite State Machines and Recurrent Neural Networks--Automata and Dynamical Systems Approaches. R. Anderson, Biased Random-Walk Learning: A Neurobiological Correlate to Trial-and-Error. A. Nigrin, Using SONNET 1 to Segment Continuous Sequences of Items. K. Venkatesh, A. Pandya, and S. Hsu, On the Use of High Level Petri Nets in the Modeling of Biological Neural Networks. J. Principe, S. Celebi, B. de Vries, and J. Harris, Locally Recurrent Networks: The Gamma Operator, Properties, and Extensions. Preface. Contributors. J.L. Johnson, H. Ranganath, G. Kuntimad, and H.J. Caulfield, Pulse-Coupled Neural Networks: Introduction. Basic Model. Multiple Pulses. Multiple Receptive Field Inputs. Time Evolution of Two Cells. Space to Time. Linking Waves and Time Scales. Groups. Invariances. Segmentation. Adaptation. Time to Space. Implementations. Integration into Systems. Concluding Remarks. References. H. Li and J. Wang, A Neural Network Model for Optical Flow Computation: Introduction. Theoretical Background. Discussion on the Reformulation. Choosing Regularization Parameters. A Recurrent Neural Network Model. Experiments. Comparison to Other Work. Summary and Discussion. References. F. Unal and N. Tepedelenlioglu, Temporal Pattern Matching Using an Artificial Neural Network: Introduction. Solving Optimization Problems Using the Hopfield Network. Dynamic Time Warping Using Hopfield Network. Computer Simulation Results. Conclusions. References. J. Dayhoff, P. Palmadesso, F. Richards, and D.-T. Lin, Patterns of Dynamic Activity and Timing in Neural Network Processing: Introduction. Dynamic Networks. Chaotic Attractors and Attractor Locking. Developing Multiple Attractors. Attractor Basins and Dynamic Binary Networks. Time Delay Mechanisms and Attractor Training. Timing of Action Potentials in Impulse Trains. Discussion. Acknowledgments. References. J. Ghosh, H.-J. Chang, and K. Liano, A Macroscopic Model of Oscillation in Ensembles of Inhibitory and Excitatory Neurons: Introduction. A Macroscopic Model for Cell Assemblies. Interactions Between Two Neural Groups. Stability of Equilibrium States. Oscillation Frequency Estimation. Experimental Validation. Conclusion. Appendix. References. P. Tiňo, B. Horne, C.L. Giles, and P. Collingwood, Finite State Machines and Recurrent Neural Networks--Automata and Dynamical Systems Approaches: Introduction. State Machines. Dynamical Systems. Recurrent Neural Network. RNN as a State Machine. RNN as a Collection of Dynamical Systems. RNN with Two State Neurons. Experiments--Learning Loops of FSM. Discussion. References. R. Anderson, Biased Random-Walk Learning: A Neurobiological Correlate to Trial-and-Error: Introduction. Hebb's Rule. Theoretical Learning Rules. Biological Evidence. Conclusions. Acknowledgments. References and Bibliography. A. Nigrin, Using SONNET 1 to Segment Continuous Sequences of Items: Introduction.
Learning Isolated and Embedded Spatial Patterns. Storing Items with Decreasing Activity. The LTM Invariance Principle. Using Rehearsal to Process Arbitrarily Long Lists. Implementing the LTM Invariance Principle with an On-Center Off-Surround Circuit. Resetting Items Once They can be Classified. Properties of a Classifying System. Simulations. Discussion. K. Venkatesh, A. Pandya, and S. Hsu, On the Use of High Level Petri Nets in the Modeling of Biological Neural Networks: Introduction. Fundamentals of PNs. Modeling of Biological Neural Systems with High Level PNs. New/Modified Elements Added to HPNs to Model BNNs. Example of a BNN: The Olfactory Bulb. Conclusions. References. J. Principe, S. Celebi, B. de Vries, and J. Harris, Locally Recurrent Networks: The Gamma Operator, Properties, and Extensions: Introduction. Linear Finite Dimensional Memory Structures. The Gamma Neural Network. Applications of the Gamma Memory. Interpretations of the Gamma Memory. Laguerre and Gamma II Memories. Analog VLSI Implementations of the Gamma Filter. Conclusions. References. | |
504 | |a Includes bibliographical references and index. | ||
505 | 0 | |a Pulse-coupled neural networks / J.L. Johnson [and others] -- A neural network model for optical flow computation / Hua Li ; Jun Wang -- Temporal pattern matching using an artificial neural network / Fatih A. Unal ; Nazif Tepedelenlioglu -- Patterns of dynamic activity and timing in neural network processing / Judith E. Dayhoff [and others] -- A macroscopic model of oscillation in ensembles of inhibitory and excitatory neurons / Joydeep Ghosh ; Hung-Jen Chang ; Kadir Liano -- Finite state machines and recurrent neural networks--automata and dynamical systems approaches / Peter Tiňo [and others] -- Biased random-walk learning: a neurobiological correlate to trial-and-error / Russell W. Anderson -- Using SONNET 1 to segment continuous sequences of items / Albert Nigrin -- On the use of high-level Petri nets in the modeling of biological neural networks / Kurapati Venkatesh ; Abhijit Pandya ; Sam Hsu -- Locally recurrent networks: the gamma operator, properties, and extensions / Jose C. Principe [and others]. | |
588 | 0 | |a Print version record. | |
546 | |a English. | ||
650 | 0 | |a Neural networks (Computer science) |0 http://id.loc.gov/authorities/subjects/sh90001937 | |
650 | 0 | |a Pattern recognition systems. |0 http://id.loc.gov/authorities/subjects/sh85098791 | |
650 | 6 | |a Réseaux neuronaux (Informatique) | |
650 | 6 | |a Reconnaissance des formes (Informatique) | |
650 | 7 | |a COMPUTERS |x Neural Networks. |2 bisacsh | |
650 | 7 | |a Neural networks (Computer science) |2 fast | |
650 | 7 | |a Pattern recognition systems |2 fast | |
700 | 1 | |a Omidvar, Omid. |0 http://id.loc.gov/authorities/names/n96099079 | |
700 | 1 | |a Dayhoff, Judith E. |0 http://id.loc.gov/authorities/names/n88283578 | |
776 | 0 | 8 | |i Print version: |t Neural networks and pattern recognition. |d San Diego, Calif. : Academic Press, ©1998 |z 0125264208 |z 9780125264204 |w (DLC) 97025466 |w (OCoLC)37155674 |
856 | 4 | 0 | |l FWS01 |p ZDB-4-EBA |q FWS_PDA_EBA |u https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=209290 |3 Volltext |
856 | 4 | 0 | |l FWS01 |p ZDB-4-EBA |q FWS_PDA_EBA |u https://www.sciencedirect.com/science/book/9780125264204 |3 Volltext |
938 | |a ebrary |b EBRY |n ebr10206480 | ||
938 | |a EBSCOhost |b EBSC |n 209290 | ||
938 | |a ProQuest MyiLibrary Digital eBook Collection |b IDEB |n 103343 | ||
938 | |a YBP Library Services |b YANK |n 2729644 | ||
994 | |a 92 |b GEBAY | ||
912 | |a ZDB-4-EBA | ||
049 | |a DE-863 |
Record in the search index
DE-BY-FWS_katkey | ZDB-4-EBA-ocn162128691 |
_version_ | 1816881650235080704 |
adam_text | |
any_adam_object | |
author2 | Omidvar, Omid Dayhoff, Judith E. |
author2_role | |
author2_variant | o o oo j e d je jed |
author_GND | http://id.loc.gov/authorities/names/n96099079 http://id.loc.gov/authorities/names/n88283578 |
author_facet | Omidvar, Omid Dayhoff, Judith E. |
author_sort | Omidvar, Omid |
building | Verbundindex |
bvnumber | localFWS |
callnumber-first | Q - Science |
callnumber-label | QA76 |
callnumber-raw | QA76.87 .O45 1998eb |
callnumber-search | QA76.87 .O45 1998eb |
callnumber-sort | QA 276.87 O45 41998EB |
callnumber-subject | QA - Mathematics |
collection | ZDB-4-EBA |
ctrlnum | (OCoLC)162128691 |
dewey-full | 006.3/2 |
dewey-hundreds | 000 - Computer science, information, general works |
dewey-ones | 006 - Special computer methods |
dewey-raw | 006.3/2 |
dewey-search | 006.3/2 |
dewey-sort | 16.3 12 |
dewey-tens | 000 - Computer science, information, general works |
discipline | Informatik |
format | Electronic eBook |
id | ZDB-4-EBA-ocn162128691 |
illustrated | Illustrated |
indexdate | 2024-11-27T13:16:05Z |
institution | BVB |
isbn | 9780125264204 0125264208 9780080512617 0080512615 128103343X 9781281033437 9786611033439 6611033432 |
language | English |
oclc_num | 162128691 |
open_access_boolean | |
owner | MAIN DE-863 DE-BY-FWS |
owner_facet | MAIN DE-863 DE-BY-FWS |
physical | 1 online resource (xvi, 351 pages) : illustrations |
psigel | ZDB-4-EBA |
publishDate | 1998 |
publishDateSearch | 1998 |
publishDateSort | 1998 |
publisher | Academic Press, |
record_format | marc |
subject_GND | http://id.loc.gov/authorities/subjects/sh90001937 http://id.loc.gov/authorities/subjects/sh85098791 |
title | Neural networks and pattern recognition / |
title_auth | Neural networks and pattern recognition / |
title_exact_search | Neural networks and pattern recognition / |
title_full | Neural networks and pattern recognition / edited by Omid Omidvar, Judith Dayhoff. |
title_fullStr | Neural networks and pattern recognition / edited by Omid Omidvar, Judith Dayhoff. |
title_full_unstemmed | Neural networks and pattern recognition / edited by Omid Omidvar, Judith Dayhoff. |
title_short | Neural networks and pattern recognition / |
title_sort | neural networks and pattern recognition |
topic | Neural networks (Computer science) http://id.loc.gov/authorities/subjects/sh90001937 Pattern recognition systems. http://id.loc.gov/authorities/subjects/sh85098791 Réseaux neuronaux (Informatique) Reconnaissance des formes (Informatique) COMPUTERS Neural Networks. bisacsh Neural networks (Computer science) fast Pattern recognition systems fast |
topic_facet | Neural networks (Computer science) Pattern recognition systems. Réseaux neuronaux (Informatique) Reconnaissance des formes (Informatique) COMPUTERS Neural Networks. Pattern recognition systems |
url | https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=209290 https://www.sciencedirect.com/science/book/9780125264204 |
work_keys_str_mv | AT omidvaromid neuralnetworksandpatternrecognition AT dayhoffjudithe neuralnetworksandpatternrecognition |