Description: This volume provides a thorough introduction to transformer condition monitoring for the assessment of power transformers. The fundamental theories are discussed, along with the most up-to-date research in this rapidly changing field.
Author: Kamath, Uday; Graham, Kenneth; Emara, Wael Title: Transformers for Machine Learning ISBN: 0367771659 ISBN-13(EAN): 9780367771652 Publisher: Taylor & Francis Rating: Price: 16078.00 RUB Availability: Available from supplier; delivered to order.
Description: Transformers are becoming a core part of many neural network architectures, employed in a wide range of applications such as NLP, speech recognition, time series, and computer vision. Transformers have gone through many adaptations and alterations, resulting in newer techniques and methods. This is the first comprehensive book on transformers.
Author: Jain Title: Introduction to Transformers for NLP ISBN: 1484288432 ISBN-13(EAN): 9781484288436 Publisher: Springer Rating: Price: 4611.00 RUB Availability: Available from supplier; delivered to order.
Description: Get a hands-on introduction to the Transformer architecture using the Hugging Face library. This book explains how Transformers are changing the AI domain, particularly in the area of natural language processing, and covers the Transformer architecture and its relevance in NLP. It starts with an introduction to NLP and the progression of language models from n-grams to Transformer-based architectures. Next, it offers basic Transformer examples using the Google Colab engine. It then introduces the Hugging Face ecosystem and the different libraries and models it provides. Moving forward, it explains language models such as Google BERT with examples, before providing a deep dive into the Hugging Face API, using different language models to address tasks such as sentence classification, sentiment analysis, summarization, and text generation. After completing Introduction to Transformers for NLP, you will understand Transformer concepts and be able to solve problems using the Hugging Face library.
What You Will Learn
Understand language models and their importance in NLP and NLU (Natural Language Understanding)
Master the Transformer architecture through practical examples
Use the Hugging Face library with Transformer-based language models
Create a simple code generator in Python based on the Transformer architecture
Who This Book Is For
Data scientists and software developers interested in developing their skills in NLP and NLU (Natural Language Understanding)
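The progression "from n-grams to Transformers" that this book traces can be illustrated with a minimal bigram language model. This is a toy sketch in plain Python, not from the book itself; the corpus and function name are invented for illustration:

```python
from collections import defaultdict

def train_bigram_lm(corpus):
    """Count bigram transitions and normalize them into
    conditional probabilities P(next_word | previous_word)."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for prev, cur in zip(tokens, tokens[1:]):
            counts[prev][cur] += 1
    return {
        prev: {cur: c / sum(nxt.values()) for cur, c in nxt.items()}
        for prev, nxt in counts.items()
    }

corpus = ["the cat sat", "the cat ran", "the dog sat"]
lm = train_bigram_lm(corpus)
print(lm["the"]["cat"])  # probability of "cat" after "the": 2/3
```

Transformer-based models replace these fixed-window counts with learned attention over the whole context, which is the shift the book's opening chapters describe.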
Author: Lin Title: Pretrained Transformers for Text Ranking ISBN: 3031010531 ISBN-13(EAN): 9783031010538 Publisher: Springer Rating: Price: 11179.00 RUB Availability: Available from supplier; delivered to order.
Description: The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in response to a query. Although the most common formulation of text ranking is search, instances of the task can also be found in many natural language processing (NLP) applications. This book provides an overview of text ranking with neural network architectures known as transformers, of which BERT (Bidirectional Encoder Representations from Transformers) is the best-known example. The combination of transformers and self-supervised pretraining has been responsible for a paradigm shift in NLP, information retrieval (IR), and beyond. This book provides a synthesis of existing work as a single point of entry for practitioners who wish to gain a better understanding of how to apply transformers to text ranking problems and researchers who wish to pursue work in this area. It covers a wide range of modern techniques, grouped into two high-level categories: transformer models that perform reranking in multi-stage architectures and dense retrieval techniques that perform ranking directly. Two themes pervade the book: techniques for handling long documents, beyond the typical sentence-by-sentence processing in NLP, and techniques for addressing the tradeoff between effectiveness (i.e., result quality) and efficiency (e.g., query latency, model and index size). Although transformer architectures and pretraining techniques are recent innovations, many aspects of how they are applied to text ranking are relatively well understood and represent mature techniques. However, there remain many open research questions, and thus in addition to laying out the foundations of pretrained transformers for text ranking, this book also attempts to prognosticate where the field is heading.
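The "dense retrieval techniques that perform ranking directly" mentioned above reduce, at query time, to scoring documents by the inner product of query and document embedding vectors. A minimal sketch, with toy two-dimensional vectors standing in for the BERT-style encoder outputs a real system would use (all names and numbers are invented):

```python
def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def dense_rank(query_vec, doc_vecs):
    """Score each document by inner product with the query
    and return document ids sorted by descending score."""
    scores = {doc_id: dot(query_vec, vec) for doc_id, vec in doc_vecs.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy embeddings; in practice these come from a trained encoder.
docs = {"d1": [0.9, 0.1], "d2": [0.2, 0.8], "d3": [0.5, 0.5]}
query = [1.0, 0.0]
print(dense_rank(query, docs))  # ['d1', 'd3', 'd2']
```

The efficiency/effectiveness tradeoff the book discusses shows up here directly: the scoring is cheap, but result quality depends entirely on how well the encoder places queries near relevant documents.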
One-stop solution for NLP practitioners, ML developers, and data scientists to build effective NLP systems that can perform complicated real-world tasks
Key Features
Implement deep learning algorithms such as BiLSTMs, CRFs, and many more using TensorFlow 2
Explore classical NLP techniques and libraries including parts-of-speech tagging and tokenization
Learn practical applications of NLP at the forefront of the field, such as sentiment analysis and text generation
Book Description
In the last couple of years, there have been tremendous advances in natural language processing, which is now moving from research labs into practical applications. Advanced Natural Language Processing offers a blend of both the theoretical and practical aspects of trending and complex NLP techniques.
This book is focused on innovative applications in the field of NLP, language generation, and dialogue systems. It goes into the details of applying text pre-processing techniques such as tokenization, parts-of-speech tagging, and lemmatization using popular libraries such as Stanford NLP and spaCy. Named Entity Recognition (NER), a cornerstone of task-oriented bots, is built from scratch using Conditional Random Fields and Viterbi decoding on top of RNNs.
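The Viterbi decoding used for NER above picks the single best tag sequence by dynamic programming over per-token emission scores and tag-to-tag transition scores. A minimal sketch in plain Python; the toy scores stand in for RNN/CRF outputs, and all names and numbers are invented for illustration:

```python
def viterbi(emissions, transitions, tags):
    """Dynamic-programming decode: best[t] holds the top score and path
    for any tag sequence so far ending in tag t (log-space scores)."""
    best = {t: (emissions[0][t], [t]) for t in tags}
    for em in emissions[1:]:
        layer = {}
        for cur in tags:
            # Best predecessor for tag `cur`, including the transition score.
            prev = max(tags, key=lambda p: best[p][0] + transitions[(p, cur)])
            score, path = best[prev]
            layer[cur] = (score + transitions[(prev, cur)] + em[cur], path + [cur])
        best = layer
    return max(best.values(), key=lambda sp: sp[0])[1]

# Toy two-token sentence with two tags.
tags = ["O", "PER"]
emissions = [{"O": 0.1, "PER": 2.0}, {"O": 1.5, "PER": 0.2}]
transitions = {("O", "O"): 0.5, ("O", "PER"): 0.0,
               ("PER", "O"): 0.5, ("PER", "PER"): 0.0}
print(viterbi(emissions, transitions, tags))  # ['PER', 'O']
```

In the book's setting, the emission scores come from an RNN and the transition scores are CRF parameters learned jointly with it.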
Taking a practical and application-focused perspective, the book covers key emerging areas such as generating text for use in sentence completion and text summarization, bridging images and text by generating captions for images, and managing dialogue aspects of chatbot design. It also covers one of the most important reasons behind recent advances in NLP - applying transfer learning and fine-tuning using TensorFlow 2.
Further, it covers practical techniques that can simplify the labelling of textual data, which otherwise proves to be a costly affair. The book also provides working code for each technique so that you can adapt it to your use cases.
By the end of this TensorFlow book, you will have an advanced knowledge of the tools, techniques and deep learning architecture used to solve complex NLP problems.
What You Will Learn
Grasp important pre-steps in building NLP applications like POS tagging
Deal with vast amounts of unlabelled data and small labelled datasets in NLP
Apply transfer learning and weakly supervised learning using libraries such as Snorkel
Perform sentiment analysis using BERT
Apply encoder-decoder NN architectures and beam search for summarizing text
Use transformer models with attention to bring images and text together
Build applications that generate captions and answer questions about images
Use advanced TensorFlow techniques like learning rate annealing, custom layers, and custom loss functions to build the latest deep NLP models
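The beam search mentioned above for summarization keeps only the few highest-probability partial sequences at each decoding step instead of expanding all of them. A minimal sketch in plain Python, with a toy lookup table standing in for the encoder-decoder network's next-token distribution (all names and probabilities are invented):

```python
import math

def beam_search(step_fn, start, beam_width, max_len):
    """Expand each kept sequence with every candidate next token,
    then keep only the beam_width highest log-probability sequences."""
    beams = [([start], 0.0)]
    for _ in range(max_len):
        candidates = []
        for seq, logp in beams:
            for token, p in step_fn(seq).items():
                candidates.append((seq + [token], logp + math.log(p)))
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:beam_width]
    return beams[0][0]

# Toy next-token distributions conditioned on the last token only;
# a real decoder would condition on the full sequence and the input text.
table = {"<s>": {"the": 0.6, "a": 0.4},
         "the": {"cat": 0.9, "dog": 0.1},
         "a": {"dog": 0.9, "cat": 0.1},
         "cat": {"</s>": 1.0}, "dog": {"</s>": 1.0}}
step_fn = lambda seq: table[seq[-1]]
print(beam_search(step_fn, "<s>", 2, 3))  # ['<s>', 'the', 'cat', '</s>']
```

With beam_width=1 this degenerates to greedy decoding; wider beams trade compute for a better chance of finding a high-probability summary.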
Who this book is for
This is not an introductory book; it assumes the reader is familiar with the basics of NLP, has fundamental Python skills, and has basic knowledge of machine learning and undergraduate-level calculus and linear algebra.
The readers who can benefit the most from this book include:
Intermediate ML developers who are familiar with the basics of supervised learning and deep learning techniques
Professionals who already use TensorFlow/Python for purposes such as data science, ML, research, and analysis
Author: Minkner Title: The Technology of Instrument Transformers ISBN: 3658348623 ISBN-13(EAN): 9783658348625 Publisher: Springer Rating: Price: 9083.00 RUB Availability: Delivered to order.
Description: This book describes existing instrument transformer technologies as well as new measuring principles for current and voltage measurement. Drawing on the authors' long experience, the properties and dimensioning of conventional current and voltage transformers are discussed in detail, with particular attention to dielectric dimensioning and the materials used. In addition, an overview of modern measuring principles is given, and the technology of low-power instrument transformers and RC dividers is presented.
OOO "Logosfera" Tel: +7 (495) 980-12-10 www.logobook.ru