Embeddings in Natural Language Processing

by Mohammad Taher Pilehvar


ISBN: 9781636390239
Binding: Hardcover
Pages: 175
Dimensions: 191 x 235 mm

Embeddings have undoubtedly been one of the most influential research areas in Natural Language Processing (NLP). Encoding information into a low-dimensional vector representation, which is easily integrated into modern machine learning models, has played a central role in the development of NLP. Embedding techniques initially focused on words, but attention soon shifted to other forms: from graph structures, such as knowledge bases, to other types of textual content, such as sentences and documents.

This book provides a high-level synthesis of the main embedding techniques in NLP, in the broad sense. The book starts by explaining conventional word vector space models and word embeddings (e.g., Word2Vec and GloVe) and then moves to other types of embeddings, such as word sense, sentence and document, and graph embeddings. The book also provides an overview of recent developments in contextualized representations (e.g., ELMo and BERT) and explains their potential in NLP.
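The core idea behind the word embeddings mentioned above is that words are mapped to dense vectors, and semantic relatedness is measured geometrically, most often by cosine similarity. The sketch below illustrates this with toy 3-dimensional vectors; the values are made up for illustration and are not drawn from any actual Word2Vec or GloVe model, where vectors typically have hundreds of dimensions.

```python
from math import sqrt

# Toy "word embeddings" -- illustrative values only, not from a trained model.
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.75, 0.70, 0.15],
    "apple": [0.10, 0.20, 0.90],
}

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v (1.0 = same direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Related words end up with more similar vectors than unrelated ones.
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(f"king~queen: {sim_royal:.3f}, king~apple: {sim_fruit:.3f}")
```

In a real model the vectors are learned from large corpora rather than hand-set, but the similarity computation is the same.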

Throughout the book, the reader can find both essential information for understanding a certain topic from scratch and a broad overview of the most successful techniques developed in the literature.
