Word2Vec is a method that learns mathematical/vectorial representations of words (word embeddings) and supports algebraic operations between those vectors to uncover relationships between words (for example, king − man + woman ≈ queen).
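As a quick illustration of that vector algebra, here is a minimal sketch with hand-picked 3-dimensional toy vectors (an assumption for demonstration; a real Word2Vec model learns its embeddings from text):

```python
import numpy as np

# Hypothetical 3-dimensional embeddings chosen by hand to illustrate the idea;
# a trained Word2Vec model would learn these vectors from a corpus.
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
}

def cosine(u, v):
    # Cosine similarity: 1.0 means same direction, 0.0 means orthogonal.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# The classic analogy: king - man + woman should land near queen.
target = embeddings["king"] - embeddings["man"] + embeddings["woman"]
closest = max(embeddings, key=lambda w: cosine(embeddings[w], target))
print(closest)  # → queen
```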
👉 Xiperia offers business consulting that transforms data into actionable knowledge to achieve your business goals. Learn more at https://www.xiperia.com
To cite this educational resource, use the following reference:
Gutiérrez-García, J.O. [Código Máquina]. (2025, January 27). Word2Vec:
[Video]. YouTube. [Include the video URL here]
*********************************************
To guide your learning, this link (https://youtu.be/lomJnbN5Wnk) provides a sequential path to learn:
1. Basic Programming with Python;
2. Data Management;
3. Data Visualization;
4. Data Analysis; and
5. Machine Learning and Data Science.
**********************************************
Video Index:
0:00 Introduction
0:41 Vectorizing words with semantics
4:20 Algebraic operations
5:50 Mathematical references
6:40 Context in Word2Vec
8:03 Self-supervised Word2Vec
12:17 CBOW: Continuous bag of words
12:40 Skip-gram
14:01 Skip-gram neural network
24:02 Word embeddings in skip-gram
25:39 CBOW neural network
29:03 Word embeddings in CBOW
29:22 Python modules for Word2Vec
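Building on the skip-gram chapters above, here is a minimal NumPy sketch of skip-gram training with a full softmax (no negative sampling); the toy corpus, dimensions, and hyperparameters are assumptions for illustration, and a library such as gensim is the practical choice for real data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus (an assumption for illustration, not from the video).
corpus = "the king rules the realm the queen rules the realm".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 8, 2  # vocabulary size, embedding dim, context window

# Skip-gram training pairs: (center word, one context word) within the window.
pairs = [(idx[corpus[i]], idx[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - window), min(len(corpus), i + window + 1))
         if i != j]

W_in = rng.normal(0, 0.1, (V, D))   # input weights: rows are the word embeddings
W_out = rng.normal(0, 0.1, (D, V))  # output weights: predict context words
lr = 0.05

for epoch in range(200):
    for c, o in pairs:
        h = W_in[c]                      # hidden layer = center word's vector
        scores = h @ W_out
        p = np.exp(scores - scores.max())
        p /= p.sum()                     # softmax over the vocabulary
        grad = p.copy()
        grad[o] -= 1.0                   # gradient of cross-entropy w.r.t. scores
        W_in[c] -= lr * (W_out @ grad)   # update center word's embedding
        W_out -= lr * np.outer(h, grad)  # update output weights

# After training, each row of W_in is a word embedding.
king_vec = W_in[idx["king"]]
print(king_vec.shape)  # → (8,)
```

CBOW follows the same structure but averages the context vectors at the hidden layer and predicts the center word instead.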
⭐ Support Código Máquina by giving a Like, Commenting, Sharing or with a Super Thanks.
⭐ From the co-founder of Código Máquina, SINHAKI natural cosmetics products:
https://www.amazon.com.mx/stores/sinHaki/page/1BD34FBC-C0F9-44F5-AC69-520634334C61?ref_=ast_bln
#NLP #neuralnetworks #MachineLearning #DataScience #IA #AI #ArtificialIntelligence