Transformers for Natural Language Processing: Build, train, and fine-tune deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, and GPT-3, 2nd Edition
- Length: 564 pages
- Edition: 2
- Language: English
- Publisher: Packt Publishing
- Publication Date: 2022-03-25
- ISBN-10: 1803247339
- ISBN-13: 9781803247335
- Sales Rank: #322898
Under-the-hood workings of transformers, fine-tuning GPT-3 models, DeBERTa, vision models, and the start of the Metaverse, using a variety of NLP platforms: Hugging Face, OpenAI API, Trax, and AllenNLP
Key Features
- Implement models, such as BERT, Reformer, and T5, that outperform classical language models
- Compare NLP applications using GPT-3, GPT-2, and other transformers
- Analyze advanced use cases, including polysemy, cross-lingual learning, and computer vision
Book Description
Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence.
Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translations, speech-to-text, text-to-speech, language modeling, question-answering, and many more NLP domains with transformers.
An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is not enough anymore. Different platforms have different benefits depending on the application, whether it’s cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP.
This book takes transformers’ capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. Also, see how transformers can create code using just a brief description.
By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.
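The "under-the-hood workings" the book promises center on scaled dot-product attention, the core operation of every transformer layer covered (BERT, GPT, T5, and the rest). As a minimal illustration of that mechanism (a NumPy sketch for orientation, not code from the book):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for a single attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row is a distribution over keys
    return weights @ V, weights                     # context vectors, attention weights

# Toy example: 3 query tokens attending over 4 key/value tokens, d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
output, weights = scaled_dot_product_attention(Q, K, V)
print(output.shape)   # one 8-dimensional context vector per query token
```

Each chapter of the book builds on this primitive, from the original Transformer's multi-head variant to Reformer's efficiency-oriented reformulations.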
What you will learn
- Discover new ways of performing NLP techniques with the latest pretrained transformers
- Grasp the workings of the original Transformer, GPT-3, BERT, T5, DeBERTa, and Reformer
- Find out how ViT and CLIP label images (including blurry ones!) and reconstruct images using DALL-E
- Carry out sentiment analysis, text summarization, causal language analysis, machine translations, and more using TensorFlow, PyTorch, and GPT-3
- Measure the productivity of key transformers to define their scope, potential, and limits in production
Who this book is for
If you want to learn about and apply transformers to your natural language (and image) data, this book is for you.
A good understanding of NLP, Python, and deep learning is required to benefit most from this book. Many platforms covered in this book provide interactive user interfaces, which allow readers with a general interest in NLP and AI to follow several chapters of this book.
How to download the source code
1. Go to: https://github.com/PacktPublishing
2. In the Find a repository… box, search for the book title: Transformers for Natural Language Processing: Build, train, and fine-tune deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, and GPT-3, 2nd Edition. If the full title returns no results, search for the main title only.
3. Click the book title in the search results.
4. Click Code to download.
1. Disable any ad-blocking plugin; otherwise, the download links may not appear.
2. Solve the CAPTCHA.
3. Click the download link.
4. You will be taken to the download server to complete the download.