Transformers for natural language processing: build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more
Saved in:

Main author: Rothman, Denis
Format: Electronic eBook
Language: English
Published: Birmingham ; Mumbai : Packt, 2021
Series: Expert insight
Subjects: Natürliche Sprache / Deep learning
Online access: FHN01, UBR01, UBY01, UER01
Summary: The first book on the market to dive deep into Transformers, this step-by-step guide helps data and AI practitioners improve language-understanding performance and gain hands-on experience implementing transformers with PyTorch, TensorFlow, Hugging Face, Trax, and AllenNLP.
Description: 1 online resource (xvi, 360 pages), illustrations, diagrams
ISBN: 9781800568631
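The summary points to hands-on transformer work with PyTorch, TensorFlow, and Hugging Face. Purely as a hedged illustration of what that looks like in practice (the `transformers` package, the `bert-base-uncased` checkpoint, and the example sentence are assumptions of this sketch, not material taken from the book):

```python
# Minimal sketch of a Hugging Face fill-mask pipeline, illustrating the kind of
# hands-on transformer usage the summary describes. Package, checkpoint name,
# and example text are assumptions, not excerpts from the book.
from transformers import pipeline

# Load a pretrained BERT checkpoint behind a fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Print the model's top candidates for the masked token, with their scores.
for candidate in fill_mask("Transformers changed natural language [MASK]."):
    print(f"{candidate['token_str']:>12}  score={candidate['score']:.3f}")
```

Each result is a dictionary holding a predicted token and its score; swapping in RoBERTa or another checkpoint only changes the `model` argument (note that RoBERTa models use `<mask>` rather than `[MASK]`).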
Internal format (MARC)
(blank indicators shown as "_")

LEADER 00000nmm a22000001c 4500
001    BV047160289
003    DE-604
005    20230118
007    cr|uuu---uuuuu
008    210224s2021 |||| o||u| ||||||eng d
020 __ |a 9781800568631 |c Online |9 978-1-80056-863-1
035 __ |a (ZDB-4-NLEBK)2739556
035 __ |a (ZDB-30-PQE)EBC6467893
035 __ |a (OCoLC)1240400328
035 __ |a (DE-599)KEP061509248
040 __ |a DE-604 |b ger |e rda
041 0_ |a eng
049 __ |a DE-355 |a DE-706 |a DE-92 |a DE-29
084 __ |a ST 306 |0 (DE-625)143654: |2 rvk
100 1_ |a Rothman, Denis |e Verfasser |0 (DE-588)1221752987 |4 aut
245 10 |a Transformers for natural language processing |b build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more |c Denis Rothman
264 _1 |a Birmingham ; Mumbai |b Packt |c 2021
300 __ |a 1 Online-Ressource (xvi, 360 Seiten) |b Illustrationen, Diagramme
336 __ |b txt |2 rdacontent
337 __ |b c |2 rdamedia
338 __ |b cr |2 rdacarrier
490 0_ |a Expert insight
520 3_ |a Being the first book in the market to dive deep into the Transformers, it is a step-by-step guide for data and AI practitioners to help enhance the performance of language understanding and gain expertise with hands-on implementation of transformers using PyTorch, TensorFlow, Hugging Face, Trax, and AllenNLP.
650 07 |a Deep learning |0 (DE-588)1135597375 |2 gnd |9 rswk-swf
650 07 |a Natürliche Sprache |0 (DE-588)4041354-8 |2 gnd |9 rswk-swf
653 _0 |a Electronic books
689 00 |a Natürliche Sprache |0 (DE-588)4041354-8 |D s
689 01 |a Deep learning |0 (DE-588)1135597375 |D s
689 0_ |5 DE-604
776 08 |i Erscheint auch als |n Druck-Ausgabe |z 978-1-80056-579-1
912 __ |a ZDB-30-PQE |a ZDB-4-NLEBK
999 __ |a oai:aleph.bib-bvb.de:BVB01-032565928
966 e_ |u http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&db=nlabk&AN=2739556 |l FHN01 |p ZDB-4-NLEBK |x Aggregator |3 Volltext
966 e_ |u https://ebookcentral.proquest.com/lib/uniregensburg-ebooks/detail.action?docID=6467893 |l UBR01 |p ZDB-30-PQE |q UBR_Einzelkauf 2021 |x Aggregator |3 Volltext
966 e_ |u http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&db=nlabk&AN=2739556 |l UBY01 |p ZDB-4-NLEBK |q UBY01_DDA21 |x Aggregator |3 Volltext
966 e_ |u https://ebookcentral.proquest.com/lib/erlangen/detail.action?docID=6467893 |l UER01 |p ZDB-30-PQE |q UER_PDA_PQE_Kauf_2023 |x Aggregator |3 Volltext
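Because the MARC record above is just structured data, it can be processed programmatically. The sketch below assumes a local MARCXML export of this record and the third-party pymarc library; neither the file name nor the tooling is part of the catalogue itself.

```python
# Sketch: reading a MARCXML export of the record above with the third-party
# pymarc library. The file name is a hypothetical placeholder.
from pymarc import parse_xml_to_array

# Parse the (assumed) local MARCXML file; it yields a list of Record objects.
record = parse_xml_to_array("BV047160289.xml")[0]

print(record["245"]["a"])           # title proper, field 245 subfield a
print(record["020"]["a"])           # ISBN, field 020 subfield a
for holding in record.get_fields("966"):
    # Owning library code (subfield l) and access URL (subfield u).
    print(holding["l"], holding["u"])
```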
Record in the search index
_version_ | 1804182234965475328
adam_txt | |
any_adam_object | |
any_adam_object_boolean | |
author | Rothman, Denis |
author_GND | (DE-588)1221752987 |
author_facet | Rothman, Denis |
author_role | aut |
author_sort | Rothman, Denis |
author_variant | d r dr |
building | Verbundindex |
bvnumber | BV047160289 |
classification_rvk | ST 306 |
collection | ZDB-30-PQE ZDB-4-NLEBK |
ctrlnum | (ZDB-4-NLEBK)2739556 (ZDB-30-PQE)EBC6467893 (OCoLC)1240400328 (DE-599)KEP061509248 |
discipline | Informatik |
discipline_str_mv | Informatik |
format | Electronic eBook |
id | DE-604.BV047160289 |
illustrated | Not Illustrated |
index_date | 2024-07-03T16:40:31Z |
indexdate | 2024-07-10T09:04:19Z |
institution | BVB |
isbn | 9781800568631 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-032565928 |
oclc_num | 1240400328 |
open_access_boolean | |
owner | DE-355 DE-BY-UBR DE-706 DE-92 DE-29 |
owner_facet | DE-355 DE-BY-UBR DE-706 DE-92 DE-29 |
physical | 1 Online-Ressource (xvi, 360 Seiten) Illustrationen, Diagramme |
psigel | ZDB-30-PQE ZDB-4-NLEBK ZDB-30-PQE UBR_Einzelkauf 2021 ZDB-4-NLEBK UBY01_DDA21 ZDB-30-PQE UER_PDA_PQE_Kauf_2023 |
publishDate | 2021 |
publishDateSearch | 2021 |
publishDateSort | 2021 |
publisher | Packt |
record_format | marc |
series2 | Expert insight |
spelling | Rothman, Denis Verfasser (DE-588)1221752987 aut Transformers for natural language processing build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more Denis Rothman Birmingham ; Mumbai Packt 2021 1 Online-Ressource (xvi, 360 Seiten) Illustrationen, Diagramme txt rdacontent c rdamedia cr rdacarrier Expert insight Being the first book in the market to dive deep into the Transformers, it is a step-by-step guide for data and AI practitioners to help enhance the performance of language understanding and gain expertise with hands-on implementation of transformers using PyTorch, TensorFlow, Hugging Face, Trax, and AllenNLP. Deep learning (DE-588)1135597375 gnd rswk-swf Natürliche Sprache (DE-588)4041354-8 gnd rswk-swf Electronic books Natürliche Sprache (DE-588)4041354-8 s Deep learning (DE-588)1135597375 s DE-604 Erscheint auch als Druck-Ausgabe 978-1-80056-579-1 |
spellingShingle | Rothman, Denis Transformers for natural language processing build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more Deep learning (DE-588)1135597375 gnd Natürliche Sprache (DE-588)4041354-8 gnd |
subject_GND | (DE-588)1135597375 (DE-588)4041354-8 |
title | Transformers for natural language processing build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more |
title_auth | Transformers for natural language processing build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more |
title_exact_search | Transformers for natural language processing build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more |
title_exact_search_txtP | Transformers for natural language processing build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more |
title_full | Transformers for natural language processing build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more Denis Rothman |
title_fullStr | Transformers for natural language processing build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more Denis Rothman |
title_full_unstemmed | Transformers for natural language processing build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more Denis Rothman |
title_short | Transformers for natural language processing |
title_sort | transformers for natural language processing build innovative deep neural network architectures for nlp with python pytorch tensorflow bert roberta and more |
title_sub | build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more |
topic | Deep learning (DE-588)1135597375 gnd Natürliche Sprache (DE-588)4041354-8 gnd |
topic_facet | Deep learning Natürliche Sprache |
work_keys_str_mv | AT rothmandenis transformersfornaturallanguageprocessingbuildinnovativedeepneuralnetworkarchitecturesfornlpwithpythonpytorchtensorflowbertrobertaandmore |
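The keys listed above are fields of the underlying Solr search index. Purely as a sketch, assuming a hypothetical Solr endpoint and the third-party pysolr package (neither is given by this record), a lookup by one of those fields might look like:

```python
# Sketch: querying a Solr index that exposes the fields listed above, using the
# third-party pysolr package. The endpoint URL and core name are hypothetical.
import pysolr

solr = pysolr.Solr("http://localhost:8983/solr/biblio", timeout=10)

# Search on one of the indexed fields shown above and print a few hits.
for doc in solr.search('isbn:"9781800568631"', rows=5):
    print(doc.get("id"), doc.get("title_short"))
```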