Tuesday, 29 January 2013

Information Theory

Recently I've started watching David MacKay's lecture series on Information Theory on videolectures.net (http://videolectures.net/course_information_theory_pattern_recognition/). Back in September, when choosing courses for the upcoming semester, I had very little idea of what Information Theory actually was, and since there were other more obvious courses to take, I barely gave it a second thought. But having had some facet of it pop up in every course I took that semester, even if only a passing mention of Shannon entropy, I began to think I had missed something worthwhile.

I've just finished lecture 4, and so far MacKay has taken me on a tour of Shannon's source coding theorem and convinced me of what a marvel it really is. I don't really have time to do all the surrounding reading (although I did read the chapter on Information Theory in Theoretical Neuroscience), but the lectures have been excellent at building an intuition for how the foundations of the field were constructed. The exercises he takes the class through connect neatly together once you start to think about them, and it's really satisfying when you can anticipate what is coming next, which happens more often than I expected.
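Since the theorem is easy to state but slippery to feel, here's a minimal sketch (my own illustration, not from the lectures) of the central quantity, the Shannon entropy H(X) = -Σ p(x) log2 p(x), which the source coding theorem says is the limit on the average bits per symbol achievable by any lossless code. The four-symbol distribution below is chosen dyadic so that an optimal prefix code hits the entropy exactly:

    import math

    def shannon_entropy(probs):
        # H(X) = -sum_x p(x) log2 p(x), in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Toy four-symbol source (the probabilities are my own example)
    probs = [0.5, 0.25, 0.125, 0.125]
    print(shannon_entropy(probs))  # 1.75 bits/symbol

    # For dyadic probabilities the ideal code lengths -log2 p(x) are
    # integers (1, 2, 3, 3), so a prefix code achieves the entropy exactly:
    lengths = [1, 2, 3, 3]
    print(sum(p * l for p, l in zip(probs, lengths)))  # 1.75, matching H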

Even though my MSc thesis topic appears to have changed (I'll now be looking at developing a Wiener kernel to predict variability in spike trains), information theory still looks useful for understanding neural encoding, so I'll stick with the lectures for now.
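As an aside on where this connects: for a neuron driven by Gaussian white noise, the first-order Wiener kernel reduces (up to a variance normalisation) to the spike-triggered average, the reverse-correlation result covered in Theoretical Neuroscience. A rough sketch of that estimate, with entirely made-up synthetic data, just to fix ideas:

    import numpy as np

    def first_order_kernel(stimulus, spikes, window):
        # Reverse correlation: average the `window` stimulus samples
        # preceding each spike, then normalise by the stimulus variance
        # (valid when the stimulus is white noise).
        spike_times = np.nonzero(spikes)[0]
        spike_times = spike_times[spike_times >= window]
        snippets = np.array([stimulus[t - window:t] for t in spike_times])
        sta = snippets.mean(axis=0)[::-1]  # index 0 = most recent lag
        return sta / stimulus.var()

    # Synthetic demo: white-noise stimulus, spikes drawn from a thresholded
    # filtered version of it (the filter and rates are arbitrary choices).
    rng = np.random.default_rng(0)
    stim = rng.standard_normal(10000)
    drive = np.convolve(stim, np.exp(-np.arange(20) / 5.0))[:10000]
    spikes = (rng.random(10000) < 0.05 * np.clip(drive, 0, None)).astype(int)
    kernel = first_order_kernel(stim, spikes, window=20)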
