Transformers for machine learning: a deep dive
Main authors: Kamath, Uday; Graham, Kenneth L.; Emara, Wael
Format: Electronic eBook
Language: English
Published: Boca Raton ; London ; New York : CRC Press, 2022
Edition: First edition
Series: Chapman & Hall/CRC machine learning & pattern recognition
Subjects: Soft Computing; Deep Learning; Neuronales Netz; Maschinelles Lernen
Online access: DE-863, DE-862, DE-91, DE-29 (full text)
Summary: Transformers are becoming a core part of many neural network architectures, employed in a wide range of applications such as NLP, speech recognition, time series, and computer vision. Transformers have gone through many adaptations and alterations, resulting in newer techniques and methods. Transformers for Machine Learning: A Deep Dive is the first comprehensive book on transformers. Key features: a comprehensive reference with detailed explanations of every algorithm and technique related to transformers; more than 60 transformer architectures covered in depth; guidance on how to apply transformer techniques to speech, text, time series, and computer vision; practical tips and tricks for each architecture and how to use it in the real world; and hands-on case studies and code snippets for theoretical and practical real-world analysis using the relevant tools and libraries, all ready to run in Google Colab. The theoretical explanations of state-of-the-art transformer architectures will appeal to postgraduate students and researchers (academic and industry), as the book provides a single entry point with deep discussions of a quickly moving field. The practical hands-on case studies and code will appeal to undergraduate students, practitioners, and professionals, as they allow for quick experimentation and lower the barrier to entry into the field.
Description: 1 online resource (xxv, 257 pages), illustrations, diagrams
ISBN: 9781003170082; 9781000587098
DOI: 10.1201/9781003170082
Internal format
MARC
LEADER | 00000nam a22000001c 4500 | ||
001 | BV048306680 | ||
003 | DE-604 | ||
005 | 20241023 | ||
007 | cr|uuu---uuuuu | ||
008 | 220630s2022 xx a||| o|||| 00||| eng d | ||
020 | |a 9781003170082 |c Online, ebook |9 978-1-003-17008-2 | ||
020 | |a 9781000587098 |9 978-1-00-058709-8 | ||
024 | 7 | |a 10.1201/9781003170082 |2 doi | |
035 | |a (ZDB-30-PQE)EBC6950084 | ||
035 | |a (ZDB-30-PAD)EBC6950084 | ||
035 | |a (ZDB-89-EBL)EBL6950084 | ||
035 | |a (OCoLC)1335408026 | ||
035 | |a (DE-599)BVBBV048306680 | ||
040 | |a DE-604 |b ger |e rda | ||
041 | 0 | |a eng | |
049 | |a DE-29 |a DE-91 |a DE-863 |a DE-862 | ||
100 | 1 | |a Kamath, Uday |e Verfasser |0 (DE-588)1262783860 |4 aut | |
245 | 1 | 0 | |a Transformers for machine learning |b a deep dive |c Uday Kamath, Kenneth L. Graham, Wael Emara |
250 | |a First edition | ||
264 | 1 | |a Boca Raton ; London ; New York |b CRC Press |c 2022 | |
300 | |a 1 Online-Ressource (xxv, 257 Seiten) |b Illustrationen, Diagramme | ||
336 | |b txt |2 rdacontent | ||
337 | |b c |2 rdamedia | ||
338 | |b cr |2 rdacarrier | ||
490 | 0 | |a Chapman & Hall/CRC machine learning & pattern recognition | |
520 | 3 | |a Transformers are becoming a core part of many neural network architectures, employed in a wide range of applications such as NLP, Speech Recognition, Time Series, and Computer Vision. Transformers have gone through many adaptations and alterations, resulting in newer techniques and methods. Transformers for Machine Learning: A Deep Dive is the first comprehensive book on transformers. Key Features: A comprehensive reference book for detailed explanations for every algorithm and techniques related to the transformers. 60+ transformer architectures covered in a comprehensive manner. A book for understanding how to apply the transformer techniques in speech, text, time series, and computer vision. Practical tips and tricks for each architecture and how to use it in the real world. Hands-on case studies and code snippets for theory and practical real-world analysis using the tools and libraries, all ready to run in Google Colab. The theoretical explanations of the state-of-the-art transformer architectures will appeal to postgraduate students and researchers (academic and industry) as it will provide a single entry point with deep discussions of a quickly moving field. The practical hands-on case studies and code will appeal to undergraduate students, practitioners, and professionals as it allows for quick experimentation and lowers the barrier to entry into the field | |
650 | 0 | 7 | |a Soft Computing |0 (DE-588)4455833-8 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Deep Learning |0 (DE-588)1135597375 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Neuronales Netz |0 (DE-588)4226127-2 |2 gnd |9 rswk-swf |
650 | 0 | 7 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |2 gnd |9 rswk-swf |
653 | 6 | |a Electronic books | |
689 | 0 | 0 | |a Maschinelles Lernen |0 (DE-588)4193754-5 |D s |
689 | 0 | 1 | |a Neuronales Netz |0 (DE-588)4226127-2 |D s |
689 | 0 | 2 | |a Deep Learning |0 (DE-588)1135597375 |D s |
689 | 0 | |5 DE-604 | |
689 | 1 | 0 | |a Soft Computing |0 (DE-588)4455833-8 |D s |
689 | 1 | |5 DE-604 | |
700 | 1 | |a Graham, Kenneth L |e Verfasser |4 aut | |
700 | 1 | |a Emara, Wael |e Verfasser |0 (DE-588)130900479X |4 aut | |
776 | 0 | 8 | |i Erscheint auch als |n Druck-Ausgabe, Hardcover |z 978-0-367-77165-2 |
776 | 0 | 8 | |i Erscheint auch als |n Druck-Ausgabe, Paperback |z 978-0-367-76734-1 |
856 | 4 | 0 | |u https://doi.org/10.1201/9781003170082 |x Verlag |z URL des Erstveröffentlichers |3 Volltext |
912 | |a ZDB-7-TFC | ||
912 | |a ZDB-30-PQE | ||
943 | 1 | |a oai:aleph.bib-bvb.de:BVB01-033686361 | |
966 | e | |u https://doi.org/10.1201/9781003170082 |l DE-863 |p ZDB-7-TFC |x Verlag |3 Volltext | |
966 | e | |u https://doi.org/10.1201/9781003170082 |l DE-862 |p ZDB-7-TFC |x Verlag |3 Volltext | |
966 | e | |u https://ebookcentral.proquest.com/lib/munchentech/detail.action?docID=6950084 |l DE-91 |p ZDB-30-PQE |q TUM_PDA_PQE_Kauf_2024 |x Aggregator |3 Volltext | |
966 | e | |u https://doi.org/10.1201/9781003170082 |l DE-29 |p ZDB-7-TFC |q UER_Einzelkauf_2022 |x Verlag |3 Volltext |
Record in the search index
_version_ | 1824556306258722816 |
author | Kamath, Uday Graham, Kenneth L Emara, Wael |
author_GND | (DE-588)1262783860 (DE-588)130900479X |
author_facet | Kamath, Uday Graham, Kenneth L Emara, Wael |
author_role | aut aut aut |
author_sort | Kamath, Uday |
author_variant | u k uk k l g kl klg w e we |
building | Verbundindex |
bvnumber | BV048306680 |
collection | ZDB-7-TFC ZDB-30-PQE |
ctrlnum | (ZDB-30-PQE)EBC6950084 (ZDB-30-PAD)EBC6950084 (ZDB-89-EBL)EBL6950084 (OCoLC)1335408026 (DE-599)BVBBV048306680 |
doi_str_mv | 10.1201/9781003170082 |
edition | First edition |
format | Electronic eBook |
id | DE-604.BV048306680 |
illustrated | Illustrated |
index_date | 2024-07-03T20:08:24Z |
indexdate | 2025-02-20T07:21:28Z |
institution | BVB |
isbn | 9781003170082 9781000587098 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-033686361 |
oclc_num | 1335408026 |
owner | DE-29 DE-91 DE-BY-TUM DE-863 DE-BY-FWS DE-862 DE-BY-FWS |
owner_facet | DE-29 DE-91 DE-BY-TUM DE-863 DE-BY-FWS DE-862 DE-BY-FWS |
physical | 1 Online-Ressource (xxv, 257 Seiten) Illustrationen, Diagramme |
psigel | ZDB-7-TFC ZDB-30-PQE ZDB-30-PQE TUM_PDA_PQE_Kauf_2024 ZDB-7-TFC UER_Einzelkauf_2022 |
publishDate | 2022 |
publishDateSearch | 2022 |
publishDateSort | 2022 |
publisher | CRC Press |
record_format | marc |
series2 | Chapman & Hall/CRC machine learning & pattern recognition |
spellingShingle | Kamath, Uday Graham, Kenneth L Emara, Wael Transformers for machine learning a deep dive Soft Computing (DE-588)4455833-8 gnd Deep Learning (DE-588)1135597375 gnd Neuronales Netz (DE-588)4226127-2 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
subject_GND | (DE-588)4455833-8 (DE-588)1135597375 (DE-588)4226127-2 (DE-588)4193754-5 |
title | Transformers for machine learning a deep dive |
title_auth | Transformers for machine learning a deep dive |
title_exact_search | Transformers for machine learning a deep dive |
title_exact_search_txtP | Transformers for machine learning a deep dive |
title_full | Transformers for machine learning a deep dive Uday Kamath, Kenneth L. Graham, Wael Emara |
title_fullStr | Transformers for machine learning a deep dive Uday Kamath, Kenneth L. Graham, Wael Emara |
title_full_unstemmed | Transformers for machine learning a deep dive Uday Kamath, Kenneth L. Graham, Wael Emara |
title_short | Transformers for machine learning |
title_sort | transformers for machine learning a deep dive |
title_sub | a deep dive |
topic | Soft Computing (DE-588)4455833-8 gnd Deep Learning (DE-588)1135597375 gnd Neuronales Netz (DE-588)4226127-2 gnd Maschinelles Lernen (DE-588)4193754-5 gnd |
topic_facet | Soft Computing Deep Learning Neuronales Netz Maschinelles Lernen |
url | https://doi.org/10.1201/9781003170082 |
work_keys_str_mv | AT kamathuday transformersformachinelearningadeepdive AT grahamkennethl transformersformachinelearningadeepdive AT emarawael transformersformachinelearningadeepdive |