Getting started with Google BERT :: build and train state-of-the-art natural language processing models using BERT /
Getting Started with Google BERT will help you become well-versed with the BERT model from scratch and learn how to create interesting NLP applications. You'll understand several variants of BERT such as ALBERT, RoBERTa, DistilBERT, ELECTRA, VideoBERT, and many others in detail.
Saved in:
Main Author: | Ravichandiran, Sudharsan |
---|---|
Format: | Electronic eBook |
Language: | English |
Published: | Birmingham, UK : Packt Publishing Ltd., 2021. |
Subjects: | Natural language processing (Computer science) |
Online Access: | Full text |
Summary: | Getting Started with Google BERT will help you become well-versed with the BERT model from scratch and learn how to create interesting NLP applications. You'll understand several variants of BERT such as ALBERT, RoBERTa, DistilBERT, ELECTRA, VideoBERT, and many others in detail. |
Description: | 1 online resource (vii, 324 pages) : illustrations |
Bibliography: | Includes bibliographical references and index. |
ISBN: | 1838826238 9781838826239 |
Internal format
MARC
LEADER | 00000cam a2200000 i 4500 | ||
---|---|---|---|
001 | ZDB-4-EBA-on1235595724 | ||
003 | OCoLC | ||
005 | 20241004212047.0 | ||
006 | m o d | ||
007 | cr ||||||||||| | ||
008 | 210130t20212021enka ob 001 0 eng d | ||
040 | |a EBLCP |b eng |e pn |e rda |c EBLCP |d UKAHL |d WAU |d OCLCO |d N$T |d MERUC |d OCLCF |d UKMGB |d OCLCO |d OCLCQ |d IEEEE |d OCLCO |d OCLCL | ||
015 | |a GBC117232 |2 bnb | ||
016 | 7 | |a 020098826 |2 Uk | |
020 | |a 1838826238 |q electronic book | ||
020 | |a 9781838826239 |q (electronic bk.) | ||
020 | |z 9781838821593 |q paperback | ||
035 | |a (OCoLC)1235595724 | ||
037 | |a 9781838826239 |b Packt Publishing | ||
037 | |a 10162609 |b IEEE | ||
050 | 4 | |a QA76.9.N38 | |
082 | 7 | |a 006.35 |2 23 | |
049 | |a MAIN | ||
100 | 1 | |a Ravichandiran, Sudharsan, |e author. |0 http://id.loc.gov/authorities/names/no2020004233 | |
245 | 1 | 0 | |a Getting started with Google BERT : |b build and train state-of-the-art natural language processing models using BERT / |c Sudharsan Ravichandiran. |
264 | 1 | |a Birmingham, UK : |b Packt Publishing Ltd., |c 2021. | |
264 | 4 | |c ©2021 | |
300 | |a 1 online resource (vii, 324 pages) : |b illustrations | ||
336 | |a text |b txt |2 rdacontent | ||
336 | |a still image |b sti |2 rdacontent | ||
337 | |a computer |b c |2 rdamedia | ||
338 | |a online resource |b cr |2 rdacarrier | ||
504 | |a Includes bibliographical references and index. | ||
505 | 0 | |a 1. A primer on transformers -- 2. Understanding the BERT model -- 3. Getting hands-on with BERT -- 4. BERT variants I -- 5. BERT variants II -- 6. Exploring BERTSUM for text summarization -- 7. Applying BERT to other languages -- 8. Exploring sentence and domain-specific BERT -- 9. Working with VideoBERT, BART, and more. | |
520 | |a Getting Started with Google BERT will help you become well-versed with the BERT model from scratch and learn how to create interesting NLP applications. You'll understand several variants of BERT such as ALBERT, RoBERTa, DistilBERT, ELECTRA, VideoBERT, and many others in detail. | ||
545 | 0 | |a Sudharsan Ravichandiran is a data scientist and artificial intelligence enthusiast. He holds a Bachelors in Information Technology from Anna University. His area of research focuses on practical implementations of deep learning and reinforcement learning including natural language processing and computer vision. He is an open-source contributor and loves answering questions on Stack Overflow. | |
588 | 0 | |a Print version record. | |
650 | 0 | |a Natural language processing (Computer science) |0 http://id.loc.gov/authorities/subjects/sh88002425 | |
650 | 6 | |a Traitement automatique des langues naturelles. | |
650 | 7 | |a Natural language processing (Computer science) |2 fast | |
758 | |i has work: |a Getting started with Google BERT (Text) |1 https://id.oclc.org/worldcat/entity/E39PCH4gJ9QT7J8Tc6qWp4yt83 |4 https://id.oclc.org/worldcat/ontology/hasWork | ||
776 | 0 | 8 | |i Print version: |t Getting started with Google BERT. |d Birmingham, UK : PACKT, 2021 |z 1838821597 |w (OCoLC)1201656421 |
856 | 4 | 0 | |l FWS01 |p ZDB-4-EBA |q FWS_PDA_EBA |u https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=2736061 |3 Volltext |
938 | |a Askews and Holts Library Services |b ASKH |n AH38265577 | ||
938 | |a ProQuest Ebook Central |b EBLB |n EBL6463518 | ||
938 | |a EBSCOhost |b EBSC |n 2736061 | ||
994 | |a 92 |b GEBAY | ||
912 | |a ZDB-4-EBA | ||
049 | |a DE-863 |
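The record above is also serialized as MARCXML (see the fullrecord field in the search-index section below, which uses the http://www.loc.gov/MARC21/slim schema). As a rough, non-authoritative sketch of how such an export could be read with Python's standard library — the local file name is a hypothetical placeholder — the title (field 245) and ISBNs (field 020) can be pulled out like this:

```python
import xml.etree.ElementTree as ET

# Namespace used by the MARCXML export shown in the fullrecord field below.
NS = {"marc": "http://www.loc.gov/MARC21/slim"}

def summarize(path: str) -> None:
    # The export wraps a single <record> in a <collection> root element.
    record = ET.parse(path).getroot().find(".//marc:record", NS)
    # Title: MARC field 245, subfields $a (title) and $b (subtitle).
    title = " ".join(
        sf.text.strip()
        for sf in record.findall(".//marc:datafield[@tag='245']/marc:subfield", NS)
        if sf.get("code") in ("a", "b") and sf.text
    )
    # ISBNs: MARC field 020, subfield $a.
    isbns = [
        sf.text
        for sf in record.findall(".//marc:datafield[@tag='020']/marc:subfield[@code='a']", NS)
    ]
    print(title)
    print("ISBNs:", ", ".join(isbns))

if __name__ == "__main__":
    summarize("on1235595724.xml")  # hypothetical local copy of the MARCXML record
```

Run against a saved copy of this record, the sketch should print the 245 title string and the two ISBNs listed in the 020 $a subfields above.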
Record in the search index
DE-BY-FWS_katkey | ZDB-4-EBA-on1235595724 |
---|---|
_version_ | 1816882537752952832 |
adam_text | |
any_adam_object | |
author | Ravichandiran, Sudharsan |
author_GND | http://id.loc.gov/authorities/names/no2020004233 |
author_facet | Ravichandiran, Sudharsan |
author_role | aut |
author_sort | Ravichandiran, Sudharsan |
author_variant | s r sr |
building | Verbundindex |
bvnumber | localFWS |
callnumber-first | Q - Science |
callnumber-label | QA76 |
callnumber-raw | QA76.9.N38 |
callnumber-search | QA76.9.N38 |
callnumber-sort | QA 276.9 N38 |
callnumber-subject | QA - Mathematics |
collection | ZDB-4-EBA |
contents | 1. A primer on transformers -- 2. Understanding the BERT model -- 3. Getting hands-on with BERT -- 4. BERT variants I -- 5. BERT variants II -- 6. Exploring BERTSUM for text summarization -- 7. Applying BERT to other languages -- 8. Exploring sentence and domain-specific BERT -- 9. Working with VideoBERT, BART, and more. |
ctrlnum | (OCoLC)1235595724 |
dewey-full | 006.35 |
dewey-hundreds | 000 - Computer science, information, general works |
dewey-ones | 006 - Special computer methods |
dewey-raw | 006.35 |
dewey-search | 006.35 |
dewey-sort | 16.35 |
dewey-tens | 000 - Computer science, information, general works |
discipline | Informatik |
format | Electronic eBook |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>03288cam a2200529 i 4500</leader><controlfield tag="001">ZDB-4-EBA-on1235595724</controlfield><controlfield tag="003">OCoLC</controlfield><controlfield tag="005">20241004212047.0</controlfield><controlfield tag="006">m o d </controlfield><controlfield tag="007">cr |||||||||||</controlfield><controlfield tag="008">210130t20212021enka ob 001 0 eng d</controlfield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">EBLCP</subfield><subfield code="b">eng</subfield><subfield code="e">pn</subfield><subfield code="e">rda</subfield><subfield code="c">EBLCP</subfield><subfield code="d">UKAHL</subfield><subfield code="d">WAU</subfield><subfield code="d">OCLCO</subfield><subfield code="d">N$T</subfield><subfield code="d">MERUC</subfield><subfield code="d">OCLCF</subfield><subfield code="d">UKMGB</subfield><subfield code="d">OCLCO</subfield><subfield code="d">OCLCQ</subfield><subfield code="d">IEEEE</subfield><subfield code="d">OCLCO</subfield><subfield code="d">OCLCL</subfield></datafield><datafield tag="015" ind1=" " ind2=" "><subfield code="a">GBC117232</subfield><subfield code="2">bnb</subfield></datafield><datafield tag="016" ind1="7" ind2=" "><subfield code="a">020098826</subfield><subfield code="2">Uk</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">1838826238</subfield><subfield code="q">electronic book</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9781838826239</subfield><subfield code="q">(electronic bk.)</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="z">9781838821593</subfield><subfield code="q">paperback</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)1235595724</subfield></datafield><datafield tag="037" ind1=" " ind2=" "><subfield code="a">9781838826239</subfield><subfield code="b">Packt Publishing</subfield></datafield><datafield tag="037" ind1=" " ind2=" "><subfield code="a">10162609</subfield><subfield code="b">IEEE</subfield></datafield><datafield tag="050" ind1=" " ind2="4"><subfield code="a">QA76.9.N38</subfield></datafield><datafield tag="082" ind1="7" ind2=" "><subfield code="a">006.35</subfield><subfield code="2">23</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">MAIN</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Ravichandiran, Sudharsan,</subfield><subfield code="e">author.</subfield><subfield code="0">http://id.loc.gov/authorities/names/no2020004233</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Getting started with Google BERT :</subfield><subfield code="b">build and train state-of-the-art natural language processing models using BERT /</subfield><subfield code="c">Sudharsan Ravichandiran.</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Birmingham, UK :</subfield><subfield code="b">Packt Publishing Ltd.,</subfield><subfield code="c">2021.</subfield></datafield><datafield tag="264" ind1=" " ind2="4"><subfield code="c">©2021</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">1 online resource (vii, 324 pages) :</subfield><subfield code="b">illustrations</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="336" ind1=" " ind2=" 
"><subfield code="a">still image</subfield><subfield code="b">sti</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">computer</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">online resource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="504" ind1=" " ind2=" "><subfield code="a">Includes bibliographical references and index.</subfield></datafield><datafield tag="505" ind1="0" ind2=" "><subfield code="a">1. A primer on transformers -- 2. Understanding the BERT model -- 3. Getting hands-on with BERT -- 4. BERT variants I -- 5. BERT variants II -- 6. Exploring BERTSUM for text summarization -- 7. Applying BERT to other languages -- 8. Exploring sentence and domain-specific BERT -- 9. Working with VideoBERT, BART, and more.</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Getting Started with Google BERT will help you become well-versed with the BERT model from scratch and learn how to create interesting NLP applications. You'll understand several variants of BERT such as ALBERT, RoBERTa, DistilBERT, ELECTRA, VideoBERT, and many others in detail.</subfield></datafield><datafield tag="545" ind1="0" ind2=" "><subfield code="a">Sudharsan Ravichandiran is a data scientist and artificial intelligence enthusiast. He holds a Bachelors in Information Technology from Anna University. His area of research focuses on practical implementations of deep learning and reinforcement learning including natural language processing and computer vision. He is an open-source contributor and loves answering questions on Stack Overflow.</subfield></datafield><datafield tag="588" ind1="0" ind2=" "><subfield code="a">Print version record.</subfield></datafield><datafield tag="650" ind1=" " ind2="0"><subfield code="a">Natural language processing (Computer science)</subfield><subfield code="0">http://id.loc.gov/authorities/subjects/sh88002425</subfield></datafield><datafield tag="650" ind1=" " ind2="6"><subfield code="a">Traitement automatique des langues naturelles.</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">Natural language processing (Computer science)</subfield><subfield code="2">fast</subfield></datafield><datafield tag="758" ind1=" " ind2=" "><subfield code="i">has work:</subfield><subfield code="a">Getting started with Google BERT (Text)</subfield><subfield code="1">https://id.oclc.org/worldcat/entity/E39PCH4gJ9QT7J8Tc6qWp4yt83</subfield><subfield code="4">https://id.oclc.org/worldcat/ontology/hasWork</subfield></datafield><datafield tag="776" ind1="0" ind2="8"><subfield code="i">Print version:</subfield><subfield code="t">Getting started with Google BERT.</subfield><subfield code="d">Birmingham, UK : PACKT, 2021</subfield><subfield code="z">1838821597</subfield><subfield code="w">(OCoLC)1201656421</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="l">FWS01</subfield><subfield code="p">ZDB-4-EBA</subfield><subfield code="q">FWS_PDA_EBA</subfield><subfield code="u">https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=2736061</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="938" ind1=" " ind2=" "><subfield code="a">Askews and Holts Library Services</subfield><subfield code="b">ASKH</subfield><subfield 
code="n">AH38265577</subfield></datafield><datafield tag="938" ind1=" " ind2=" "><subfield code="a">ProQuest Ebook Central</subfield><subfield code="b">EBLB</subfield><subfield code="n">EBL6463518</subfield></datafield><datafield tag="938" ind1=" " ind2=" "><subfield code="a">EBSCOhost</subfield><subfield code="b">EBSC</subfield><subfield code="n">2736061</subfield></datafield><datafield tag="994" ind1=" " ind2=" "><subfield code="a">92</subfield><subfield code="b">GEBAY</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">ZDB-4-EBA</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-863</subfield></datafield></record></collection> |
id | ZDB-4-EBA-on1235595724 |
illustrated | Illustrated |
indexdate | 2024-11-27T13:30:12Z |
institution | BVB |
isbn | 1838826238 9781838826239 |
language | English |
oclc_num | 1235595724 |
open_access_boolean | |
owner | MAIN DE-863 DE-BY-FWS |
owner_facet | MAIN DE-863 DE-BY-FWS |
physical | 1 online resource (vii, 324 pages) : illustrations |
psigel | ZDB-4-EBA |
publishDate | 2021 |
publishDateSearch | 2021 |
publishDateSort | 2021 |
publisher | Packt Publishing Ltd., |
record_format | marc |
spelling | Ravichandiran, Sudharsan, author. http://id.loc.gov/authorities/names/no2020004233 Getting started with Google BERT : build and train state-of-the-art natural language processing models using BERT / Sudharsan Ravichandiran. Birmingham, UK : Packt Publishing Ltd., 2021. ©2021 1 online resource (vii, 324 pages) : illustrations text txt rdacontent still image sti rdacontent computer c rdamedia online resource cr rdacarrier Includes bibliographical references and index. 1. A primer on transformers -- 2. Understanding the BERT model -- 3. Getting hands-on with BERT -- 4. BERT variants I -- 5. BERT variants II -- 6. Exploring BERTSUM for text summarization -- 7. Applying BERT to other languages -- 8. Exploring sentence and domain-specific BERT -- 9. Working with VideoBERT, BART, and more. Getting Started with Google BERT will help you become well-versed with the BERT model from scratch and learn how to create interesting NLP applications. You'll understand several variants of BERT such as ALBERT, RoBERTa, DistilBERT, ELECTRA, VideoBERT, and many others in detail. Sudharsan Ravichandiran is a data scientist and artificial intelligence enthusiast. He holds a Bachelors in Information Technology from Anna University. His area of research focuses on practical implementations of deep learning and reinforcement learning including natural language processing and computer vision. He is an open-source contributor and loves answering questions on Stack Overflow. Print version record. Natural language processing (Computer science) http://id.loc.gov/authorities/subjects/sh88002425 Traitement automatique des langues naturelles. Natural language processing (Computer science) fast has work: Getting started with Google BERT (Text) https://id.oclc.org/worldcat/entity/E39PCH4gJ9QT7J8Tc6qWp4yt83 https://id.oclc.org/worldcat/ontology/hasWork Print version: Getting started with Google BERT. Birmingham, UK : PACKT, 2021 1838821597 (OCoLC)1201656421 FWS01 ZDB-4-EBA FWS_PDA_EBA https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=2736061 Volltext |
spellingShingle | Ravichandiran, Sudharsan Getting started with Google BERT : build and train state-of-the-art natural language processing models using BERT / 1. A primer on transformers -- 2. Understanding the BERT model -- 3. Getting hands-on with BERT -- 4. BERT variants I -- 5. BERT variants II -- 6. Exploring BERTSUM for text summarization -- 7. Applying BERT to other languages -- 8. Exploring sentence and domain-specific BERT -- 9. Working with VideoBERT, BART, and more. Natural language processing (Computer science) http://id.loc.gov/authorities/subjects/sh88002425 Traitement automatique des langues naturelles. Natural language processing (Computer science) fast |
subject_GND | http://id.loc.gov/authorities/subjects/sh88002425 |
title | Getting started with Google BERT : build and train state-of-the-art natural language processing models using BERT / |
title_auth | Getting started with Google BERT : build and train state-of-the-art natural language processing models using BERT / |
title_exact_search | Getting started with Google BERT : build and train state-of-the-art natural language processing models using BERT / |
title_full | Getting started with Google BERT : build and train state-of-the-art natural language processing models using BERT / Sudharsan Ravichandiran. |
title_fullStr | Getting started with Google BERT : build and train state-of-the-art natural language processing models using BERT / Sudharsan Ravichandiran. |
title_full_unstemmed | Getting started with Google BERT : build and train state-of-the-art natural language processing models using BERT / Sudharsan Ravichandiran. |
title_short | Getting started with Google BERT : |
title_sort | getting started with google bert build and train state of the art natural language processing models using bert |
title_sub | build and train state-of-the-art natural language processing models using BERT / |
topic | Natural language processing (Computer science) http://id.loc.gov/authorities/subjects/sh88002425 Traitement automatique des langues naturelles. Natural language processing (Computer science) fast |
topic_facet | Natural language processing (Computer science) Traitement automatique des langues naturelles. |
url | https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=2736061 |
work_keys_str_mv | AT ravichandiransudharsan gettingstartedwithgooglebertbuildandtrainstateoftheartnaturallanguageprocessingmodelsusingbert |