Getting Started with Google BERT: Build and train state-of-the-art natural language processing models using BERT

Packt Publishing Ltd · 4.0 · 3 reviews · E-book · 352 pages

About this ebook

Kickstart your NLP journey by exploring BERT and its variants such as ALBERT, RoBERTa, DistilBERT, VideoBERT, and more with Hugging Face's transformers library.

Key Features
- Explore the encoder and decoder of the transformer model
- Become well-versed with BERT along with ALBERT, RoBERTa, and DistilBERT
- Discover how to pre-train and fine-tune BERT models for several NLP tasks

Book Description

BERT (Bidirectional Encoder Representations from Transformers) has revolutionized the world of natural language processing (NLP) with promising results. This book is an introductory guide that will help you get to grips with Google's BERT architecture. With a detailed explanation of the transformer architecture, this book will help you understand how the transformer's encoder and decoder work.

You'll explore the BERT architecture by learning how the BERT model is pre-trained and how to use it for downstream NLP tasks such as sentiment analysis and text summarization by fine-tuning it with the Hugging Face transformers library. As you advance, you'll learn about different variants of BERT such as ALBERT, RoBERTa, and ELECTRA, and look at SpanBERT, which is used for NLP tasks like question answering. You'll also cover simpler and faster BERT variants based on knowledge distillation, such as DistilBERT and TinyBERT. The book takes you through MBERT, XLM, and XLM-R in detail and then introduces you to Sentence-BERT, which is used for obtaining sentence representations. Finally, you'll discover domain-specific BERT models such as BioBERT and ClinicalBERT, and explore an interesting variant called VideoBERT.
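To give a feel for the workflow the book walks through, here is a minimal illustrative sketch (not taken from the book) that runs inference with the Hugging Face transformers pipeline API; the default checkpoint the pipeline downloads when no model is named is an assumption of this example:

    # Minimal sketch: sentiment analysis with an already fine-tuned
    # BERT-style model, via the Hugging Face transformers pipeline API.
    from transformers import pipeline

    # With no model specified, the pipeline downloads a default checkpoint
    # fine-tuned for sentiment analysis; any BERT checkpoint fine-tuned for
    # the task can be swapped in with the model=... argument.
    classifier = pipeline("sentiment-analysis")

    print(classifier("I love Paris"))
    # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

Fine-tuning itself, that is, updating the pre-trained weights on a labelled dataset, is what the book teaches; the sketch above only runs inference with a model someone has already fine-tuned.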

By the end of this BERT book, you’ll be well-versed with using BERT and its variants for performing practical NLP tasks.

What you will learn
- Understand the transformer model from the ground up
- Find out how BERT works and pre-train it using the masked language model (MLM) and next sentence prediction (NSP) tasks
- Get hands-on with BERT by learning to generate contextual word and sentence embeddings (see the sketch after this list)
- Fine-tune BERT for downstream tasks
- Get to grips with the ALBERT, RoBERTa, ELECTRA, and SpanBERT models
- Get the hang of the BERT models based on knowledge distillation
- Understand cross-lingual models such as XLM and XLM-R
- Explore Sentence-BERT, VideoBERT, and BART
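For the contextual-embeddings item above, here is a short illustrative sketch (again an assumption of this listing, not the book's own code) of pulling contextual word embeddings out of a pre-trained BERT model:

    # Illustrative sketch: contextual word embeddings from pre-trained BERT.
    import torch
    from transformers import BertModel, BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")

    inputs = tokenizer("Paris is a beautiful city", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # One embedding per token; for bert-base the shape is
    # [batch_size, sequence_length, 768].
    print(outputs.last_hidden_state.shape)

Unlike static word vectors, each token's embedding here depends on the whole sentence, which is what "contextual" means in the list above.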

Who this book is for

This book is for NLP professionals and data scientists looking to simplify NLP tasks to enable efficient language understanding using BERT. A basic understanding of NLP concepts and deep learning is required to get the most out of this book.

About the author

Sudharsan Ravichandiran is a data scientist, researcher, and bestselling author. He completed his bachelor's degree in information technology at Anna University. His research focuses on practical implementations of deep learning and reinforcement learning, including natural language processing and computer vision. He is an open-source contributor and loves answering questions on Stack Overflow. He also authored the bestseller Hands-On Reinforcement Learning with Python, published by Packt Publishing.

