One-stop solution for NLP practitioners, ML developers, and data scientists to build effective NLP systems that can perform complicated real-world tasks
Key Features
Implement deep learning algorithms such as BiLSTMs, CRFs, and many more using TensorFlow 2
Explore classical NLP techniques and libraries, including part-of-speech tagging and tokenization
Learn practical applications of NLP at the forefront of the field, such as sentiment analysis and text generation
Book Description
In the last couple of years, there have been tremendous advances in natural language processing, and these techniques are now moving from research labs into practical applications. Advanced Natural Language Processing blends the theoretical and practical aspects of trending and complex NLP techniques.
This book focuses on innovative applications in the field of NLP, language generation, and dialogue systems. It details text pre-processing using techniques such as tokenization, part-of-speech tagging, and lemmatization with popular libraries such as Stanford NLP and spaCy. Named Entity Recognition (NER), a cornerstone of task-oriented bots, is built from scratch using Conditional Random Fields and Viterbi decoding on top of RNNs.
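To illustrate the decoding step mentioned above: the following is a minimal pure-Python sketch of Viterbi decoding over per-token label scores and label-transition scores. It is not the book's code; the `emissions`/`transitions` input format is an assumption chosen for clarity, standing in for the scores an RNN + CRF layer would produce.

```python
# Illustrative Viterbi decoder for sequence labelling (not from the book).
# Assumes: emissions = list of {label: score} dicts, one per token;
#          transitions = {(prev_label, label): score}.

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring label sequence via dynamic programming."""
    labels = list(emissions[0])
    scores = [dict(emissions[0])]   # best score per label at each position
    back = []                       # backpointers for path recovery
    for emit in emissions[1:]:
        step, ptr = {}, {}
        for lab in labels:
            # best previous label to transition from
            prev = max(labels, key=lambda p: scores[-1][p] + transitions[(p, lab)])
            step[lab] = scores[-1][prev] + transitions[(prev, lab)] + emit[lab]
            ptr[lab] = prev
        scores.append(step)
        back.append(ptr)
    # trace back from the best final label
    best = max(labels, key=lambda l: scores[-1][l])
    path = [best]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

# Toy two-token example with labels "B" and "O"
emissions = [{"B": 2.0, "O": 1.0}, {"B": 0.5, "O": 1.5}]
transitions = {("B", "B"): -1.0, ("B", "O"): 0.5,
               ("O", "B"): 0.0, ("O", "O"): 0.2}
print(viterbi_decode(emissions, transitions))  # → ['B', 'O']
```

In a CRF layer the emission scores come from the RNN's per-token outputs and the transition scores are learned parameters; the decoding logic is the same.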
Taking a practical and application-focused perspective, the book covers key emerging areas such as generating text for use in sentence completion and text summarization, bridging images and text by generating captions for images, and managing dialogue aspects of chatbot design. It also covers one of the most important reasons behind recent advances in NLP - applying transfer learning and fine-tuning using TensorFlow 2.
Further, it covers practical techniques that can simplify the labelling of textual data, which otherwise proves to be a costly affair. The book also provides working code for each technique so that you can adapt it to your use cases.
By the end of this TensorFlow book, you will have an advanced knowledge of the tools, techniques, and deep learning architectures used to solve complex NLP problems.
What You Will Learn
Grasp important preprocessing steps in building NLP applications, such as POS tagging
Deal with vast amounts of unlabeled data and small labelled datasets in NLP
Apply transfer learning and weakly supervised learning using libraries such as Snorkel
Perform sentiment analysis using BERT
Apply encoder-decoder NN architectures and beam search for summarizing text
Use transformer models with attention to bring images and text together
Build applications that generate captions and answer questions about images
Use advanced TensorFlow techniques like learning rate annealing, custom layers, and custom loss functions to build the latest deep NLP models
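The summarization bullet above relies on beam search over an encoder-decoder model. The following is a minimal pure-Python sketch of the pruning idea, not the book's implementation: the `next_scores` callback (a hypothetical stand-in for a decoder's next-token log-probabilities) and the toy model are assumptions for illustration.

```python
# Illustrative beam search for text generation (not from the book).
# next_scores(seq) -> {token: log_prob} stands in for a decoder step.
import math

def beam_search(next_scores, beam_width=2, max_len=5, eos="</s>"):
    beams = [([], 0.0)]  # (token sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq and seq[-1] == eos:       # finished hypotheses carry over
                candidates.append((seq, score))
                continue
            for tok, logp in next_scores(seq).items():
                candidates.append((seq + [tok], score + logp))
        # keep only the top-k scoring hypotheses
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
        if all(seq and seq[-1] == eos for seq, _ in beams):
            break
    return beams[0][0]

# Toy "model": fixed next-token distributions, purely for illustration.
def toy_model(seq):
    if not seq:
        return {"the": math.log(0.6), "a": math.log(0.4)}
    if seq[-1] == "the":
        return {"cat": math.log(0.9), "</s>": math.log(0.1)}
    if seq[-1] == "a":
        return {"dog": math.log(0.99), "</s>": math.log(0.01)}
    return {"</s>": 0.0}

print(beam_search(toy_model))  # → ['the', 'cat', '</s>']
```

With `beam_width=1` this degenerates to greedy decoding; wider beams trade compute for better global sequence scores, which is why summarizers use them.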
Who this book is for
This is not an introductory book: it assumes the reader is familiar with the basics of NLP, has fundamental Python skills, and has basic knowledge of machine learning and undergraduate-level calculus and linear algebra.
The readers who can benefit the most from this book include:
Intermediate ML developers who are familiar with the basics of supervised learning and deep learning techniques
Professionals who already use TensorFlow/Python for purposes such as data science, ML, research, and analysis
Author: Kamath, Uday. Title: Transformers for Machine Learning. ISBN: 0367767341, ISBN-13 (EAN): 9780367767341. Publisher: Taylor & Francis. Price: 6889.00 RUB. Availability: Available from supplier, delivered to order.
Description: Transformers are becoming a core part of many neural network architectures, employed in a wide range of applications such as NLP, speech recognition, time series, and computer vision. Transformers have gone through many adaptations and alterations, resulting in newer techniques and methods. This is the first comprehensive book on transformers.
Description: This volume provides a thorough introduction to transformer condition monitoring for the assessment of power transformers. The fundamental theories are discussed, in addition to the most up-to-date research in this rapidly changing field.
Description: The first book on the market to dive deep into Transformers, this step-by-step guide helps data and AI practitioners enhance the performance of language understanding and gain expertise through hands-on implementation of transformers with Python, PyTorch, and TensorFlow.
Author: Steffen Rendle. Title: Context-Aware Ranking with Factorization Models. ISBN: 3642423973, ISBN-13 (EAN): 9783642423970. Publisher: Springer. Price: 15672.00 RUB. Availability: Available from supplier, delivered to order.
Description: Context-aware ranking is an important task in search engine ranking. This book presents a generic method for context-aware ranking as well as its applications, applying the general theory to three scenarios: item, tag, and sequential-set recommendation.
Author: Kamath, Uday; Graham, Kenneth; Emara, Wael. Title: Transformers for Machine Learning. ISBN: 0367771659, ISBN-13 (EAN): 9780367771652. Publisher: Taylor & Francis. Price: 16078.00 RUB. Availability: Available from supplier, delivered to order.
Description: Transformers are becoming a core part of many neural network architectures, employed in a wide range of applications such as NLP, speech recognition, time series, and computer vision. Transformers have gone through many adaptations and alterations, resulting in newer techniques and methods. This is the first comprehensive book on transformers.
Author: Tunstall, Lewis; Von Werra, Leandro; Wolf, Thomas. Title: Natural Language Processing with Transformers, Revised Edition. ISBN: 1098136799, ISBN-13 (EAN): 9781098136796. Publisher: Wiley. Price: 7602.00 RUB. Availability: Available from supplier, delivered to order.
Description: If you're a data scientist or coder, this practical book (now revised in full color) shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.
Author: Paula Orzechowska. Title: Complexity in Polish Phonotactics. ISBN: 9811372985, ISBN-13 (EAN): 9789811372988. Publisher: Springer. Price: 13974.00 RUB. Availability: Available from supplier, delivered to order.
Description:
This book provides a refreshing perspective on the description, study and representation of consonant clusters in Polish. What are the sources of phonotactic complexity? What properties or principles motivate the phonological structure of initial and final consonant clusters? In answering these questions, a necessary turning point consists in investigating sequences of consonants at their most basic level, namely in terms of phonological features. The analysis is exploratory: it leads to discovering prevalent feature patterns in clusters from which new phonotactic generalizations are derived.
A recurring theme in the book is that phonological features vary in weight depending on (1) their distribution in a cluster, (2) their position in a word, and (3) language domain. Positional feature weight reflects the relative importance of place, manner and voice features (e.g. coronal, dorsal, strident, continuant) in constructing cluster inventories, minimizing cognitive effort, facilitating production and triggering specific casual speech processes. Feature weights give rise to previously unidentified positional preferences. Rankings of features and preferences are a testing ground for principles of sonority, contrast, clarity of perception and ease of articulation.
This volume addresses practitioners in the field seeking new methods of phonotactic modelling and approaches to complexity, as well as students interested in an overview of current research directions in the study of consonant clusters. Sequences of consonants in Polish are certainly among the most remarkable ones that readers will ever encounter in their linguistic explorations. In this volume, they will come to realise that hundreds of unusually long, odd-looking, sonority-violating, morphologically complex and infrequent clusters are in fact well-motivated and structured according to well-defined tactic patterns of features.
Author: Jain. Title: Introduction to Transformers for NLP. ISBN: 1484288432, ISBN-13 (EAN): 9781484288436. Publisher: Springer. Price: 4611.00 RUB. Availability: Available from supplier, delivered to order.
Description: Get a hands-on introduction to Transformer architecture using the Hugging Face library. This book explains how Transformers are changing the AI domain, particularly in the area of natural language processing, and covers the Transformer architecture and its relevance to NLP. It starts with an introduction to NLP and the progression of language models from n-grams to Transformer-based architectures. Next, it offers some basic Transformer examples using Google Colab. Then, it introduces the Hugging Face ecosystem and the different libraries and models it provides. Moving forward, it explains language models such as Google BERT with some examples before providing a deep dive into the Hugging Face API, using different language models to address tasks such as sentence classification, sentiment analysis, summarization, and text generation. After completing Introduction to Transformers for NLP, you will understand Transformer concepts and be able to solve problems using the Hugging Face library.
What You Will Learn
Understand language models and their importance in NLP and NLU (Natural Language Understanding)
Master Transformer architecture through practical examples
Use the Hugging Face library in Transformer-based language models
Create a simple code generator in Python based on Transformer architecture
Who This Book Is For
Data scientists and software developers interested in developing their skills in NLP and NLU (Natural Language Understanding)
Logosfera LLC. Tel: +7 (495) 980-12-10. www.logobook.ru