Monday, October 31, 2011

Unsupervised learning 1-17

I'm a bit tired, but want to finish this off tonight.

Do you notice how defensive S. is getting about his quizzes? Some of the criticism hit home.

Finally, we have some technology to animate the algorithms.  That should really help.

"for the sake of this class, let's just care about it."  That one made me laugh.

I'm excited about Expectation Maximization. It's a cool algorithm, but generally pretty hard to understand. I wonder if he'll just give the Gaussian version... yep.

That Gaussian looks like it crosses the x axis. Be careful: the tails approach the x axis but never actually cross it.

Anyway, if you don't already know about Gaussians, I would recommend just hitting the "I believe" button regarding the formula (but do pay attention to the derivative).
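If you do want to see the formula written out, here's a quick Python sketch of the one-dimensional density. The function name and the example numbers are mine, not from the lecture:

import numpy as np

def gaussian_pdf(x, mu, sigma):
    """1-D Gaussian density: exp(-(x - mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))."""
    return np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# The tails get arbitrarily close to zero but stay strictly positive:
print(gaussian_pdf(10.0, 0.0, 1.0))  # tiny, but never exactly zero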

It should be noted that what he's showing and calling EM is not the general EM algorithm, but a special version just for these "mixtures of Gaussians" (see the sketch below). There are many other versions, depending on the underlying distribution you use.
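For reference, here's roughly what that mixture-of-Gaussians EM looks like as Python. This is my own toy sketch, not the lecture's code; the function name, the initialization, and the test data are all mine:

import numpy as np

def em_gmm_1d(x, k, n_iter=50, seed=0):
    """Toy EM for a 1-D mixture of k Gaussians.

    E-step: compute responsibilities p(cluster j | x_i) from current parameters.
    M-step: re-estimate weights, means, and variances from those responsibilities.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    weights = np.full(k, 1.0 / k)
    means = rng.choice(x, size=k, replace=False)  # naive init: k random data points
    variances = np.full(k, np.var(x))

    for _ in range(n_iter):
        # E-step: per-point, per-cluster responsibilities (n x k matrix)
        dens = np.exp(-(x[:, None] - means) ** 2 / (2 * variances)) / np.sqrt(2 * np.pi * variances)
        resp = weights * dens
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step: weighted maximum-likelihood updates
        nk = resp.sum(axis=0)
        weights = nk / n
        means = (resp * x[:, None]).sum(axis=0) / nk
        variances = (resp * (x[:, None] - means) ** 2).sum(axis=0) / nk

    return weights, means, variances

# Two well-separated clusters; EM should recover means near 0 and 5.
data = np.concatenate([np.random.normal(0, 1, 200), np.random.normal(5, 1, 200)])
print(em_gmm_1d(data, k=2))

Swapping the Gaussian density for some other distribution (and its matching M-step updates) is how you'd get those other versions of EM.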

I'm stopping here because I'm not paying very good attention.  I'll finish in the morning.
