Background (1) Basic Concepts Overview - (1-1) Information Theory - Understanding Quantity of Information, Entropy, Information Gain, KL Divergence, and Cross-Entropy (Lecture: YouTube)

Topics Covered: Quantity of Information, Entropy, Information Gain, KL Divergence, Cross-Entropy

These topics are fundamental to understanding how information is processed and measured in machine learning and deep learning models. By studying and organizing these concepts, I aim to improve my understanding of the underlying principles and their applications in AI models.
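As a quick sketch of how these quantities relate, here is a minimal Python illustration over discrete distributions (the function names and example distributions are my own, not from the lecture):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum p_i * log2(q_i): expected code
    length when encoding samples from p with a code optimized for q."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """KL divergence D_KL(p || q) = H(p, q) - H(p): the extra bits
    paid for using q's code instead of p's optimal code."""
    return cross_entropy(p, q) - entropy(p)

p = [0.5, 0.5]  # fair coin (hypothetical example distribution)
q = [0.9, 0.1]  # biased coin

print(entropy(p))           # 1.0 bit: maximum uncertainty for two outcomes
print(cross_entropy(p, q))  # larger than H(p), since q mismatches p
print(kl_divergence(p, q))  # > 0; it is 0 only when q equals p
```

The decomposition `cross_entropy(p, q) = entropy(p) + kl_divergence(p, q)` is why minimizing cross-entropy loss against fixed labels is equivalent to minimizing the KL divergence between the label distribution and the model's predictions.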