- Natural Language Processing with TensorFlow
- Thushan Ganegedara
Chapter 4. Advanced Word2vec
In Chapter 3, Word2vec – Learning Word Embeddings, we introduced you to Word2vec, the basics of learning word embeddings, and the two common Word2vec algorithms: skip-gram and CBOW. In this chapter, we will discuss several topics related to Word2vec, focusing on these two algorithms and their extensions.
First, we will explore how the original skip-gram algorithm was implemented and how it compares to its more modern variant, which we used in Chapter 3, Word2vec – Learning Word Embeddings. We will examine the differences between skip-gram and CBOW and look at how the loss of each approach behaves over time. We will also discuss which method works better, drawing on both our own observations and the available literature.
We will then discuss several extensions to the existing Word2vec methods that boost performance. These extensions include using more effective sampling techniques to pick negative examples for negative sampling and ignoring uninformative words during learning, among others. You will also learn about a novel word embedding learning technique known as Global Vectors (GloVe) and the specific advantages that GloVe has over skip-gram and CBOW.
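To preview one of these sampling extensions: the original Word2vec implementation draws negative examples not uniformly, but in proportion to each word's unigram frequency raised to the 3/4 power, which boosts the chance of sampling rarer words. The following is a minimal sketch of that idea; the function and variable names are illustrative, not from the book's code:

```python
import numpy as np

def sample_negatives(word_freqs, num_samples, rng=None):
    """Sample negative-example word IDs with probability proportional
    to unigram frequency ** 0.75 (the distortion used in the original
    Word2vec implementation)."""
    rng = rng or np.random.default_rng(42)
    probs = np.asarray(word_freqs, dtype=np.float64) ** 0.75
    probs /= probs.sum()  # normalize to a valid distribution
    return rng.choice(len(word_freqs), size=num_samples, p=probs)

# Hypothetical corpus counts for a 5-word vocabulary.
freqs = [100, 50, 20, 5, 1]
negatives = sample_negatives(freqs, num_samples=10)
```

Compared with raw frequencies, the 3/4 power flattens the distribution, so very frequent words are sampled somewhat less often and rare words somewhat more often.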
Finally, you will learn how to use Word2vec to solve a real-world problem: document classification. We will do this with a simple trick for obtaining document embeddings from word embeddings.
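One common version of this trick is to average the embeddings of the words in a document, yielding a single fixed-length vector that can be fed to a classifier. The sketch below assumes a lookup table from words to row indices of an embedding matrix; all names are illustrative:

```python
import numpy as np

def document_embedding(doc_tokens, word_to_id, embedding_matrix):
    """Average the word embeddings of a document's tokens to produce
    one fixed-length document vector; out-of-vocabulary tokens are
    simply skipped."""
    ids = [word_to_id[w] for w in doc_tokens if w in word_to_id]
    if not ids:  # no known words: fall back to a zero vector
        return np.zeros(embedding_matrix.shape[1])
    return embedding_matrix[ids].mean(axis=0)

# Toy vocabulary and random embeddings for illustration.
vocab = {"cats": 0, "dogs": 1, "play": 2}
emb = np.random.default_rng(0).normal(size=(3, 4))
doc_vec = document_embedding(["cats", "play", "unknown"], vocab, emb)
```

The resulting `doc_vec` has the same dimensionality as the word embeddings, so documents of any length map to vectors of a fixed size.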