Neural network perception for mobile robot guidance
Author: | Pomerleau, Dean A. |
---|---|
Format: | Thesis / Book |
Language: | English |
Published: | Boston [et al.] : Kluwer, 1993 |
Series: | The Kluwer international series in engineering and computer science ; 239 |
Subjects: | Mobile robots; Neural networks (Computer science); Robots -- Control systems; Neuronales Netz; Mobiler Roboter; Roboter; Maschinelles Sehen; Steuerung |
Summary: | Vision-based mobile robot guidance has proven difficult for classical machine vision methods because of the diversity and real-time constraints inherent in the task. This book describes a connectionist system called ALVINN (Autonomous Land Vehicle In a Neural Network) that overcomes these difficulties. ALVINN learns to guide mobile robots using the back-propagation training algorithm. Because of its ability to learn from example, ALVINN can adapt to new situations and therefore cope with the diversity of the autonomous navigation task. But real-world problems like vision-based mobile robot guidance present a different set of challenges for the connectionist paradigm. Among them are: how to develop a general representation from a limited amount of real training data, how to understand the internal representations developed by artificial neural networks, how to estimate the reliability of individual networks, how to combine multiple networks trained for different situations into a single system, and how to combine connectionist perception with symbolic reasoning. Neural Network Perception for Mobile Robot Guidance presents novel solutions to each of these problems. Using these techniques, the ALVINN system can learn to control an autonomous van in under 5 minutes by watching a person drive. Once trained, individual ALVINN networks can drive in a variety of circumstances, including single-lane paved and unpaved roads and multi-lane lined and unlined roads, at speeds of up to 55 miles per hour. The techniques are also shown to generalize to the task of controlling the precise foot placement of a walking robot. (An illustrative code sketch of this training idea follows the table.) |
Description: | XIV, 191 p. : ill., diagrams |
ISBN: | 0792393732 |
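
The summary above says that ALVINN learns to steer by back-propagation from images recorded while a person drives. The Python sketch below is purely illustrative of that idea (behavioral cloning into a small feed-forward network); it is not the book's code. The layer sizes are loosely modeled on the commonly cited ALVINN configuration (a low-resolution input retina, a few hidden units, and an output layer of discrete steering directions), and the learning rate, Gaussian target width, and data handling are assumptions.

```python
# Minimal sketch of an ALVINN-style steering network trained by "watching a person
# drive" (behavioral cloning with back-propagation). NOT the book's code; sizes and
# hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

IN, HID, OUT = 30 * 32, 4, 30   # low-resolution camera retina -> hidden units -> steering bins

W1 = rng.normal(0.0, 0.1, (HID, IN))
b1 = np.zeros(HID)
W2 = rng.normal(0.0, 0.1, (OUT, HID))
b2 = np.zeros(OUT)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(image):
    """image: flattened 30x32 intensity array scaled to [0, 1]."""
    h = sigmoid(W1 @ image + b1)
    y = sigmoid(W2 @ h + b2)          # activation over the discrete steering directions
    return h, y

def gaussian_target(steering_bin, width=2.0):
    """Target is a 'hill' of activation centered on the human driver's steering bin."""
    bins = np.arange(OUT)
    return np.exp(-0.5 * ((bins - steering_bin) / width) ** 2)

def train_step(image, steering_bin, lr=0.05):
    """One back-propagation update on a single (camera image, human steering) pair."""
    global W1, b1, W2, b2
    h, y = forward(image)
    t = gaussian_target(steering_bin)
    dy = (y - t) * y * (1.0 - y)          # squared-error delta at the sigmoid outputs
    dh = (W2.T @ dy) * h * (1.0 - h)      # backpropagated delta at the hidden layer
    W2 -= lr * np.outer(dy, h); b2 -= lr * dy
    W1 -= lr * np.outer(dh, image); b1 -= lr * dh
    return float(np.sum((y - t) ** 2))

def steer(image):
    """After training, drive by picking the steering bin with the highest activation."""
    _, y = forward(image)
    return int(np.argmax(y))

# Toy usage: random arrays stand in for logged camera frames and the driver's steering.
for _ in range(1000):
    frame = rng.random(IN)
    human_bin = int(rng.integers(OUT))
    train_step(frame, human_bin)
print("steering bin for a new frame:", steer(rng.random(IN)))
```

In a real setup the training pairs would come from a logged camera stream paired with the driver's steering-wheel angle, and the output "hill" of activation would be decoded back into a continuous steering command; here random arrays merely stand in for both.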
Internal format (MARC)
Tag | Ind1 | Ind2 | Content
---|---|---|---
LEADER | | | 00000nam a2200000 cb4500
001 | | | BV009526689
003 | | | DE-604
005 | | | 20141215
007 | | | t
008 | | | 940412s1993 ad|| m||| 00||| eng d
020 | | | |a 0792393732 |9 0-7923-9373-2
035 | | | |a (OCoLC)28148857
035 | | | |a (DE-599)BVBBV009526689
040 | | | |a DE-604 |b ger |e rakddb
041 | 0 | | |a eng
049 | | | |a DE-91 |a DE-634 |a DE-83
050 | | 0 | |a TJ211.415
082 | 0 | | |a 629.8/95 |2 20
084 | | | |a ZQ 6250 |0 (DE-625)158184: |2 rvk
084 | | | |a DAT 815d |2 stub
084 | | | |a DAT 717d |2 stub
084 | | | |a FER 986d |2 stub
100 | 1 | | |a Pomerleau, Dean A. |e Verfasser |4 aut
245 | 1 | 0 | |a Neural network perception for mobile robot guidance |c by Dean A. Pomerleau
264 | | 1 | |a Boston u.a. |b Kluwer |c 1993
300 | | | |a XIV, 191 S. |b Ill., graph. Darst.
336 | | | |b txt |2 rdacontent
337 | | | |b n |2 rdamedia
338 | | | |b nc |2 rdacarrier
490 | 1 | | |a The Kluwer international series in engineering and computer science |v 239
502 | | | |a Zugl.: Pittsburgh, Pa., Univ., Diss., 1992
520 | 3 | | |a Vision-based mobile robot guidance has proven difficult for classical machine vision methods because of the diversity and real-time constraints inherent in the task. This book describes a connectionist system called ALVINN (Autonomous Land Vehicle In a Neural Network) that overcomes these difficulties. ALVINN learns to guide mobile robots using the back-propagation training algorithm. Because of its ability to learn from example, ALVINN can adapt to new situations and therefore cope with the diversity of the autonomous navigation task.
520 | | | |a But real-world problems like vision-based mobile robot guidance present a different set of challenges for the connectionist paradigm. Among them are: how to develop a general representation from a limited amount of real training data, how to understand the internal representations developed by artificial neural networks, how to estimate the reliability of individual networks, how to combine multiple networks trained for different situations into a single system, and how to combine connectionist perception with symbolic reasoning.
520 | | | |a Neural Network Perception for Mobile Robot Guidance presents novel solutions to each of these problems. Using these techniques, the ALVINN system can learn to control an autonomous van in under 5 minutes by watching a person drive. Once trained, individual ALVINN networks can drive in a variety of circumstances, including single-lane paved and unpaved roads and multi-lane lined and unlined roads, at speeds of up to 55 miles per hour. The techniques are also shown to generalize to the task of controlling the precise foot placement of a walking robot.
650 | | 4 | |a Mobile robots
650 | | 4 | |a Neural networks (Computer science)
650 | | 4 | |a Robots |x Control systems
650 | 0 | 7 | |a Neuronales Netz |0 (DE-588)4226127-2 |2 gnd |9 rswk-swf
650 | 0 | 7 | |a Mobiler Roboter |0 (DE-588)4191911-7 |2 gnd |9 rswk-swf
650 | 0 | 7 | |a Roboter |0 (DE-588)4050208-9 |2 gnd |9 rswk-swf
650 | 0 | 7 | |a Maschinelles Sehen |0 (DE-588)4129594-8 |2 gnd |9 rswk-swf
650 | 0 | 7 | |a Steuerung |0 (DE-588)4057472-6 |2 gnd |9 rswk-swf
655 | | 7 | |0 (DE-588)4113937-9 |a Hochschulschrift |2 gnd-content
689 | 0 | 0 | |a Neuronales Netz |0 (DE-588)4226127-2 |D s
689 | 0 | 1 | |a Mobiler Roboter |0 (DE-588)4191911-7 |D s
689 | 0 | 2 | |a Steuerung |0 (DE-588)4057472-6 |D s
689 | 0 | | |5 DE-604
689 | 1 | 0 | |a Neuronales Netz |0 (DE-588)4226127-2 |D s
689 | 1 | 1 | |a Maschinelles Sehen |0 (DE-588)4129594-8 |D s
689 | 1 | 2 | |a Roboter |0 (DE-588)4050208-9 |D s
689 | 1 | | |8 1\p |5 DE-604
830 | | 0 | |a The Kluwer international series in engineering and computer science |v 239 |w (DE-604)BV023545171 |9 239
999 | | | |a oai:aleph.bib-bvb.de:BVB01-006290272
883 | 1 | | |8 1\p |a cgwrk |d 20201028 |q DE-101 |u https://d-nb.info/provenance/plan#cgwrk
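
The abstract (MARC 520 fields above) also highlights two open problems: estimating the reliability of an individual network and combining several networks trained for different road types into a single system. The sketch below shows one generic way such an arbiter could be organized, weighting each specialist's steering output by how well it can reconstruct the current camera image. This is an illustrative assumption inspired by the summary, not a transcription of the book's reliability-estimation or arbitration method, and all names in it are hypothetical.

```python
# Illustrative arbiter over several specialist driving networks (not the book's
# algorithm). Reliability here is approximated by input-reconstruction error,
# which is an assumption; the Expert interface and scoring rule are hypothetical.
from dataclasses import dataclass
from typing import Callable, Sequence
import numpy as np

@dataclass
class Expert:
    """One specialist driving network, e.g. trained only on single-lane dirt roads."""
    name: str
    steer: Callable[[np.ndarray], np.ndarray]        # image -> activation over steering bins
    reconstruct: Callable[[np.ndarray], np.ndarray]  # image -> reconstructed image

def reliability(expert: Expert, image: np.ndarray) -> float:
    """Low reconstruction error means the input looks familiar, so trust this expert more."""
    err = float(np.mean((expert.reconstruct(image) - image) ** 2))
    return 1.0 / (1.0 + err)

def arbitrate(experts: Sequence[Expert], image: np.ndarray) -> int:
    """Blend the experts' steering activations, weighted by their estimated reliability."""
    weights = np.array([reliability(e, image) for e in experts])
    weights /= weights.sum()
    blended = sum(w * e.steer(image) for w, e in zip(weights, experts))
    return int(np.argmax(blended))   # index of the chosen steering direction
```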