Contacts: +7 (495) 980-12-10
Mon-Fri: 10:00-18:00; Sat-Sun: 11:00-18:00
shop@logobook.ru

Pretrained Transformers for Text Ranking, Lin


Purchase options
Price: 11179.00 RUB
Quantity:
Availability: Supplied to order. In stock at the supplier's warehouse.
US warehouse: In stock
Order by: 2025-07-28
Estimated delivery: August to early September
Subject to the book's availability at the supplier.

Add to cart
Add to wish list

Author: Lin
Title: Pretrained Transformers for Text Ranking
ISBN: 9783031010538
Publisher: Springer
Classification:


ISBN-10: 3031010531
Cover/Format: Soft cover
Pages: 307
Weight: 0.62 kg
Publication date: 12.11.2021
Series: Synthesis Lectures on Human Language Technologies
Language: English
Illustrations: XVII, 307 p.
Dimensions: 235 x 191 mm
Readership: Professional & vocational
Main subject: Computer Science
Subtitle: BERT and Beyond
Publisher link: Link
Rating:
Ships from: Germany
Description: The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in response to a query. Although the most common formulation of text ranking is search, instances of the task can also be found in many natural language processing (NLP) applications. This book provides an overview of text ranking with neural network architectures known as transformers, of which BERT (Bidirectional Encoder Representations from Transformers) is the best-known example. The combination of transformers and self-supervised pretraining has been responsible for a paradigm shift in NLP, information retrieval (IR), and beyond.

The book provides a synthesis of existing work as a single point of entry for practitioners who wish to gain a better understanding of how to apply transformers to text ranking problems, and for researchers who wish to pursue work in this area. It covers a wide range of modern techniques, grouped into two high-level categories: transformer models that perform reranking in multi-stage architectures, and dense retrieval techniques that perform ranking directly. Two themes pervade the book: techniques for handling long documents, beyond typical sentence-by-sentence processing in NLP, and techniques for addressing the tradeoff between effectiveness (i.e., result quality) and efficiency (e.g., query latency, model and index size).

Although transformer architectures and pretraining techniques are recent innovations, many aspects of how they are applied to text ranking are relatively well understood and represent mature techniques. However, many open research questions remain, and thus, in addition to laying out the foundations of pretrained transformers for text ranking, this book also attempts to prognosticate where the field is heading.
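The multi-stage architecture the description mentions (a cheap first-stage retriever followed by a more expensive reranker) can be sketched in plain Python. The scoring functions below are toy stand-ins, not the BM25 or cross-encoder transformer models the book actually covers, and all names here are illustrative:

```python
from collections import Counter

def first_stage(query, corpus, k=3):
    """Cheap candidate generation: rank documents by query-term frequency."""
    q_terms = set(query.lower().split())
    def score(doc):
        counts = Counter(doc.lower().split())
        return sum(counts[t] for t in q_terms)
    ranked = sorted(corpus, key=score, reverse=True)
    return [doc for doc in ranked[:k] if score(doc) > 0]

def rerank(query, candidates):
    """Expensive second stage: score each (query, doc) pair jointly.
    Jaccard overlap here is a toy stand-in for a cross-encoder like monoBERT."""
    q_terms = set(query.lower().split())
    def score(doc):
        d_terms = set(doc.lower().split())
        return len(q_terms & d_terms) / len(q_terms | d_terms)
    return sorted(candidates, key=score, reverse=True)

corpus = [
    "transformers for text ranking",
    "cooking with cast iron",
    "dense retrieval with transformers",
    "ranking text with neural networks",
]
candidates = first_stage("text ranking with transformers", corpus)
print(rerank("text ranking with transformers", candidates)[0])
# prints: transformers for text ranking
```

The design point is the one the book makes: the first stage only has to be recall-oriented and fast over the whole corpus, while the reranker can afford a costlier per-pair computation over a few candidates.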
Additional description: Preface.- Acknowledgments.- Introduction.- Setting the Stage.- Multi-Stage Architectures for Reranking.- Refining Query and Document Representations.- Learned Dense Representations for Ranking.- Future Directions and Conclusions.- Bibliography.- Authors'



Advanced Natural Language Processing with TensorFlow 2: Build real-world effective NLP applications using NER, RNNs, seq2seq models, Transformers, and

Author: Bansal Ashish
Title: Advanced Natural Language Processing with TensorFlow 2: Build real-world effective NLP applications using NER, RNNs, seq2seq models, Transformers, and
ISBN-10: 1800200935, ISBN-13 (EAN): 9781800200937
Publisher: Unknown
Rating:
Price: 8091.00 RUB
Availability: In stock at the supplier; supplied to order.

Description:

One-stop solution for NLP practitioners, ML developers, and data scientists to build effective NLP systems that can perform complicated real-world tasks


Key Features

  • Implement deep learning algorithms such as BiLSTMs, CRFs, and many more using TensorFlow 2
  • Explore classical NLP techniques and libraries including parts-of-speech tagging and tokenization
  • Learn practical applications of NLP covering the forefronts of the field like sentiment analysis and generating text


Book Description

In the last couple of years, there have been tremendous advances in natural language processing, and we are now moving from research labs into practical applications. Advanced Natural Language Processing comes with a perfect blend of both the theoretical and practical aspects of trending and complex NLP techniques.

This book is focused on innovative applications in the field of NLP, language generation, and dialogue systems. It goes into the details of text pre-processing using techniques such as tokenization, parts-of-speech tagging, and lemmatization, with popular libraries such as Stanford NLP and spaCy. Named Entity Recognition (NER), a cornerstone of task-oriented bots, is built from scratch using Conditional Random Fields and Viterbi decoding on top of RNNs.
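Viterbi decoding, mentioned above as the inference step for CRF-based NER, finds the highest-scoring tag sequence given per-token emission scores and tag-to-tag transition scores via dynamic programming. A minimal sketch with hand-picked toy scores (illustrative numbers, not learned CRF weights):

```python
def viterbi(emissions, transitions, tags):
    """Return the best-scoring tag sequence.
    emissions: one {tag: score} dict per token.
    transitions: {(prev_tag, tag): score}, missing pairs score 0."""
    scores = {t: emissions[0].get(t, 0.0) for t in tags}
    backptrs = []  # one {tag: best_prev_tag} dict per later position
    for em in emissions[1:]:
        bp, new_scores = {}, {}
        for t in tags:
            prev, s = max(
                ((p, scores[p] + transitions.get((p, t), 0.0)) for p in tags),
                key=lambda x: x[1],
            )
            new_scores[t] = s + em.get(t, 0.0)
            bp[t] = prev
        scores = new_scores
        backptrs.append(bp)
    # Backtrack from the best final tag.
    tag = max(scores, key=scores.get)
    path = [tag]
    for bp in reversed(backptrs):
        tag = bp[tag]
        path.append(tag)
    return list(reversed(path))

tags = ["O", "PER"]
# Toy scores for the two tokens of "John runs":
# "John" looks like a person name, "runs" does not.
emissions = [{"O": 0.1, "PER": 0.9}, {"O": 0.8, "PER": 0.2}]
transitions = {("PER", "O"): 0.1, ("O", "O"): 0.1}
print(viterbi(emissions, transitions, tags))  # ['PER', 'O']
```

In a real CRF the emission scores would come from an RNN (or transformer) and the transition scores would be learned parameters; the decoding step is the same.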

Taking a practical and application-focused perspective, the book covers key emerging areas such as generating text for use in sentence completion and text summarization, bridging images and text by generating captions for images, and managing dialogue aspects of chatbot design. It also covers one of the most important reasons behind recent advances in NLP - applying transfer learning and fine-tuning using TensorFlow 2.

Further, it covers practical techniques that can simplify the labelling of textual data, which otherwise proves to be a costly affair. The book also provides working code for each technique so that you can adapt it to your own use cases.

By the end of this TensorFlow book, you will have advanced knowledge of the tools, techniques, and deep learning architectures used to solve complex NLP problems.


What You Will Learn

  • Grasp important pre-steps in building NLP applications like POS tagging
  • Deal with vast amounts of unlabeled data and small labelled datasets in NLP
  • Use transfer and weakly supervised learning using libraries like Snorkel
  • Perform sentiment analysis using BERT
  • Apply encoder-decoder NN architectures and beam search for summarizing text
  • Use transformer models with attention to bring images and text together
  • Build applications that generate captions and answer questions about images
  • Use advanced TensorFlow techniques like learning rate annealing, custom layers, and custom loss functions to build the latest deep NLP models
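Beam search, listed above for text summarization, keeps only the top-k highest cumulative log-probability partial sequences at each decoding step instead of enumerating every continuation. A minimal sketch in which a hard-coded toy next-token table stands in for a real encoder-decoder model (all names and probabilities are illustrative):

```python
import math

def beam_search(next_probs, start, beam_width=2, max_len=4):
    """Decode by keeping the beam_width best partial sequences.
    next_probs(seq) -> {token: prob} is a stand-in for one decoder step."""
    beams = [([start], 0.0)]  # (sequence, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for seq, logp in beams:
            if seq[-1] == "<eos>":       # finished hypotheses are carried over
                candidates.append((seq, logp))
                continue
            for tok, p in next_probs(seq).items():
                candidates.append((seq + [tok], logp + math.log(p)))
        beams = sorted(candidates, key=lambda x: x[1], reverse=True)[:beam_width]
    return beams[0][0]

def toy_model(seq):
    """Toy 'decoder': next-token distribution depends only on the last token."""
    table = {
        "<s>": {"the": 0.6, "a": 0.4},
        "the": {"cat": 0.7, "dog": 0.3},
        "a":   {"cat": 0.5, "dog": 0.5},
        "cat": {"<eos>": 1.0},
        "dog": {"<eos>": 1.0},
    }
    return table[seq[-1]]

print(beam_search(toy_model, "<s>"))  # ['<s>', 'the', 'cat', '<eos>']
```

With beam_width=1 this reduces to greedy decoding; widening the beam trades extra computation for a better chance of finding a high-probability summary that greedy search would miss.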


Who this book is for

This is not an introductory book; it assumes the reader is familiar with the basics of NLP and has fundamental Python skills, as well as basic knowledge of machine learning and undergraduate-level calculus and linear algebra.


The readers who can benefit the most from this book include:

Intermediate ML developers who are familiar with the basics of supervised learning and deep learning techniques

Professionals who already use TensorFlow/Python for purposes such as data science, ML, research, and analysis

Transformers for Machine Learning:

Author: Kamath, Uday,
Title: Transformers for Machine Learning:
ISBN-10: 0367767341, ISBN-13 (EAN): 9780367767341
Publisher: Taylor & Francis
Rating:
Price: 6889.00 RUB
Availability: In stock at the supplier; supplied to order.

Description: Transformers are becoming a core part of many neural network architectures, employed in a wide range of applications such as NLP, speech recognition, time series, and computer vision. Transformers have gone through many adaptations and alterations, resulting in newer techniques and methods. This is the first comprehensive book on transformers.

Condition Monitoring and Assessment of Power Transformers Using Computational Intelligence

Author: W.H. Tang; Q.H. Wu
Title: Condition Monitoring and Assessment of Power Transformers Using Computational Intelligence
ISBN-10: 1447126262, ISBN-13 (EAN): 9781447126263
Publisher: Springer
Rating:
Price: 19589.00 RUB
Availability: In stock at the supplier; supplied to order.

Description: This volume provides a thorough introduction to transformer condition monitoring for the assessment of power transformers. The fundamental theories are discussed, in addition to the most up-to-date research in this rapidly changing field.

Transformers for Natural Language Processing - Second Edition: Build, train, and fine-tune deep neural network architectures for NLP with Python, PyTo

Author: Rothman Denis
Title: Transformers for Natural Language Processing - Second Edition: Build, train, and fine-tune deep neural network architectures for NLP with Python, PyTo
ISBN-10: 1803247339, ISBN-13 (EAN): 9781803247335
Publisher: Unknown
Rating:
Price: 16551.00 RUB
Availability: In stock at the supplier; supplied to order.

Description: The first book on the market to dive deep into transformers, this step-by-step guide helps data and AI practitioners enhance the performance of language understanding and gain hands-on expertise implementing transformers with Python, PyTorch, and TensorFlow.

Context-Aware Ranking with Factorization Models

Author: Steffen Rendle
Title: Context-Aware Ranking with Factorization Models
ISBN-10: 3642423973, ISBN-13 (EAN): 9783642423970
Publisher: Springer
Rating:
Price: 15672.00 RUB
Availability: In stock at the supplier; supplied to order.

Description: Context-aware ranking is an important task in search-engine ranking. This book presents a generic method for context-aware ranking as well as its applications, applying the general theory to three scenarios: item, tag, and sequential-set recommendation.

Transformers for Machine Learning

Author: Kamath, Uday; Graham, Kenneth; Emara, Wael
Title: Transformers for Machine Learning
ISBN-10: 0367771659, ISBN-13 (EAN): 9780367771652
Publisher: Taylor & Francis
Rating:
Price: 16078.00 RUB
Availability: In stock at the supplier; supplied to order.

Description: Transformers are becoming a core part of many neural network architectures, employed in a wide range of applications such as NLP, speech recognition, time series, and computer vision. Transformers have gone through many adaptations and alterations, resulting in newer techniques and methods. This is the first comprehensive book on transformers.

Natural language processing with transformers, revised edition

Author: Tunstall, Lewis; Von Werra, Leandro; Wolf, Thomas
Title: Natural Language Processing with Transformers, Revised Edition
ISBN-10: 1098136799, ISBN-13 (EAN): 9781098136796
Publisher: Wiley
Rating:
Price: 7602.00 RUB
Availability: In stock at the supplier; supplied to order.

Description: If you're a data scientist or coder, this practical book, now revised in full color, shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.

Complexity in Polish Phonotactics

Author: Paula Orzechowska
Title: Complexity in Polish Phonotactics
ISBN-10: 9811372985, ISBN-13 (EAN): 9789811372988
Publisher: Springer
Rating:
Price: 13974.00 RUB
Availability: In stock at the supplier; supplied to order.

Description:

This book provides a refreshing perspective on the description, study and representation of consonant clusters in Polish. What are the sources of phonotactic complexity? What properties or principles motivate the phonological structure of initial and final consonant clusters? In answering these questions, a necessary turning point consists in investigating sequences of consonants at their most basic level, namely in terms of phonological features. The analysis is exploratory: it leads to discovering prevalent feature patterns in clusters from which new phonotactic generalizations are derived.
A recurring theme in the book is that phonological features vary in weight depending on (1) their distribution in a cluster, (2) their position in a word, and (3) language domain. Positional feature weight reflects the relative importance of place, manner and voice features (e.g. coronal, dorsal, strident, continuant) in constructing cluster inventories, minimizing cognitive effort, facilitating production and triggering specific casual speech processes. Feature weights give rise to previously unidentified positional preferences. Rankings of features and preferences are a testing ground for principles of sonority, contrast, clarity of perception and ease of articulation.
This volume addresses practitioners in the field seeking new methods of phonotactic modelling and approaches to complexity, as well as students interested in an overview of current research directions in the study of consonant clusters. Sequences of consonants in Polish are certainly among the most remarkable ones that readers will ever encounter in their linguistic explorations. In this volume, they will come to realise that hundreds of unusually long, odd-looking, sonority-violating, morphologically complex and infrequent clusters are in fact well-motivated and structured according to well-defined tactic patterns of features.
Introduction to Transformers for NLP

Author: Jain
Title: Introduction to Transformers for NLP
ISBN-10: 1484288432, ISBN-13 (EAN): 9781484288436
Publisher: Springer
Rating:
Price: 4611.00 RUB
Availability: In stock at the supplier; supplied to order.

Description: Get a hands-on introduction to Transformer architecture using the Hugging Face library. This book explains how Transformers are changing the AI domain, particularly in the area of natural language processing, and covers Transformer architecture and its relevance in NLP. It starts with an introduction to NLP and a progression of language models from n-grams to a Transformer-based architecture. Next, it offers some basic Transformer examples using the Google Colab engine. Then, it introduces the Hugging Face ecosystem and the different libraries and models it provides. Moving forward, it explains language models such as Google BERT with some examples, before providing a deep dive into the Hugging Face API using different language models to address tasks such as sentence classification, sentiment analysis, summarization, and text generation. After completing Introduction to Transformers for NLP, you will understand Transformer concepts and be able to solve problems using the Hugging Face library.

What You Will Learn

  • Understand language models and their importance in NLP and NLU (Natural Language Understanding)
  • Master Transformer architecture through practical examples
  • Use the Hugging Face library in Transformer-based language models
  • Create a simple code generator in Python based on Transformer architecture

Who This Book Is For

Data scientists and software developers interested in developing their skills in NLP and NLU (Natural Language Understanding)


OOO "Logosfera"  Tel: +7 (495) 980-12-10  www.logobook.ru