Information Theory, Lecture 1: Defining Entropy and Information - Oxford Mathematics 3rd Yr Lecture

Oxford Mathematics

In this lecture from Sam Cohen’s 3rd year ‘Information Theory’ course, one of eight we are showing, Sam asks: how do we measure the amount of information we learn by seeing the outcome of a random variable? Answer: it can be measured by the variable’s entropy (and related quantities), which this lecture introduces.

You can watch the eight lectures from the course as they appear via the playlist: https://www.youtube.com/playlist?list=PL4d5ZtfQonW3iAhXvTYCnoGEeRhxhKHMc

You can also watch many other student lectures via our main Student Lectures playlist (also check out the specific student lectures playlists): https://www.youtube.com/playlist?list=PL4d5ZtfQonW0A4VHeiY0gSkX1QEraaacE

All first and second year lectures are followed by tutorials where students meet their tutor in pairs to go through the lecture and the associated problem sheet, and to talk and think more about the maths. Third and fourth year lectures are followed by classes.
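The lecture's definitions are not reproduced in this description, but the quantity it introduces is the standard Shannon entropy of a discrete random variable, H(X) = -Σ p(x) log₂ p(x), measured in bits. A minimal sketch (the function name and examples are illustrative, not taken from the course):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 are skipped, using the convention 0 * log 0 = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: exactly 1 bit
print(entropy([0.9, 0.1]))  # biased coin: about 0.469 bits
```

Intuitively, the fair coin is maximally unpredictable, so observing its outcome conveys the most information (1 bit), while a heavily biased coin's outcome is less surprising on average and so conveys less.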
