Posts by Tags

information theory

Understanding Mutual Information

9 minute read

Published:

For this post I hope to accomplish a few different things. I will review a brilliant document put together by Erik G. Learned-Miller, which I found while trying to better understand the concept of ‘Mutual Information’; it has been by far the most influential document in my understanding of entropy and mutual information. It is a short document, only three pages, intended as an introduction to entropy and mutual information for discrete random variables. Erik does something that so many teachers miss when introducing students to new concepts: he pairs the formulas with real-world, easy-to-understand examples. I hope to add some intuition for what entropy, joint entropy, and mutual information actually represent, and to review some simple and more complex examples that I am currently working on in my research.
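
For readers who want the formulas in front of them, here is a minimal sketch (my own illustration, not taken from Erik's note) that computes entropy, joint entropy, and mutual information for a made-up joint distribution over two binary variables, using the standard discrete definitions H(X) = −Σ p(x) log₂ p(x) and I(X; Y) = H(X) + H(Y) − H(X, Y):

```python
# Minimal sketch of the standard discrete-case definitions of entropy,
# joint entropy, and mutual information (illustrative, not from the post).
import numpy as np

# Hypothetical joint distribution p(x, y) over a 2x2 discrete space.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def entropy(p):
    """Shannon entropy in bits: H(p) = -sum p log2 p, with 0 log 0 := 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_x = entropy(p_x)             # H(X)
H_y = entropy(p_y)             # H(Y)
H_xy = entropy(p_xy.ravel())   # joint entropy H(X, Y)

# Mutual information via I(X; Y) = H(X) + H(Y) - H(X, Y)
I_xy = H_x + H_y - H_xy
print(f"H(X)={H_x:.3f}, H(Y)={H_y:.3f}, H(X,Y)={H_xy:.3f}, I(X;Y)={I_xy:.3f}")
```

With this joint distribution the marginals are uniform, so H(X) = H(Y) = 1 bit, and the dependence between the two variables shows up as a mutual information of roughly 0.28 bits.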

mutual information

Understanding Mutual Information

representation learning

Understanding Mutual Information

self-supervised learning

Understanding Mutual Information
