Online courses recommended by Hacker News users. [about]

Probabilistic Graphical Models 1: Representation

Coursera · Stanford University · 15 HN citations

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large ...
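The core idea in the blurb above — factoring a joint distribution over many variables — can be shown in a few lines. This is a toy chain-structured Bayesian network with made-up numbers, purely for illustration: three binary variables need 2³ − 1 = 7 free parameters as a full joint table, but only 1 + 2 + 2 = 5 under the factorization P(A)·P(B|A)·P(C|B).

```python
from itertools import product

# Hypothetical conditional probability tables for a chain A -> B -> C.
p_a = {True: 0.3, False: 0.7}
p_b_given_a = {True: {True: 0.8, False: 0.2}, False: {True: 0.1, False: 0.9}}
p_c_given_b = {True: {True: 0.5, False: 0.5}, False: {True: 0.25, False: 0.75}}

def joint(a, b, c):
    """Joint probability under the factorization P(A) P(B|A) P(C|B)."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# The factors multiply out to a valid joint distribution: it sums to 1.
total = sum(joint(a, b, c) for a, b, c in product([True, False], repeat=3))
```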

View on Coursera
The vast majority of the courses listed here on HN.Academy are available from their providers for free. Many offer a completion certificate for a fee, and a few courses and specializations require an enrollment fee. HN.Academy receives a referral commission when you visit course pages through links on this site and then purchase courses or completion certificates. The commission does not increase the cost of the course, and it helps support the continued existence of HN.Academy, which is much appreciated.

Hacker News Comments about Probabilistic Graphical Models 1: Representation

All the comments and stories posted to Hacker News that reference this course.
Apr 06, 2018 nl on Berkeley offers its data science course online for free
The Georgia Tech online masters?

Depending on what stats you want to do, there are some pretty decent MOOCs. No one is going to claim that Daphne Koller's PGM course is weak in any way, for example [1].


Aug 22, 2017 iamkeyur on Ask HN: What are the best MOOCs for Data Science and Machine Learning?
Nowadays, there are a number of really excellent online lectures to get you started.

The list is too long to include them all. Every one of the major MOOC sites offers not only one but several good Machine Learning classes, so please check [coursera]( ), [edX]( ), [Udacity]( ) yourself to see which ones are interesting to you.

However, there are a few that stand out, either because they're very popular or are done by people who are famous for their work in ML. Roughly in order from easiest to hardest, those are:

* Andrew Ng's [ML-Class at coursera]( ): Focused on application of techniques. Easy to understand, but mathematically very shallow. Good for beginners!

* Hastie/Tibshirani's [Elements of Statistical Learning]( ): Also aimed at beginners and focused more on applications.

* Yaser Abu-Mostafa's [Learning From Data]( ): Focuses a lot more on theory, but is also doable for beginners.

* Geoff Hinton's [Neural Nets for Machine Learning]( ): As the title says, this is almost exclusively about Neural Networks.

* Hugo Larochelle's [Neural Net lectures]( ): Again mostly on Neural Nets, with a focus on Deep Learning

* Daphne Koller's [Probabilistic Graphical Models]( ): A very challenging class, but it covers a lot of good material that few of the other courses do.

May 14, 2017 allenleein on Ask HN: Why there is no Codecademy for ML or AI?
Yes, I did my research, but there is no interactive tutorial online like Treehouse or Codecademy. There are plenty of tutorials, but none of them shows you the whole path.

Here are the resources I found useful:

========================================== Advice from OpenAI and Facebook AI leaders

Courses You MUST Take: Machine Learning by Andrew Ng ( ) /// Class notes: ( )

Yaser Abu-Mostafa's Machine Learning course, which focuses much more on theory than the Coursera class but is still relevant for beginners. ( )

Neural Networks and Deep Learning (Recommended by Google Brain Team) ( )

Probabilistic Graphical Models ( )

Computational Neuroscience ( )

Statistical Machine Learning ( )

From OpenAI CTO Greg Brockman on Quora

Deep Learning Book ( ) ( Also Recommended by Google Brain Team )

It contains essentially all the concepts and intuition needed for deep learning engineering (except reinforcement learning). by Greg

2. If you’d like to take courses: Linear Algebra — Stephen Boyd’s EE263 (Stanford) ( ) or Linear Algebra (MIT)( )

Neural Networks for Machine Learning — Geoff Hinton (Coursera)

Neural Nets — Andrej Karpathy’s CS231N (Stanford)

Advanced Robotics (the MDP / optimal control lectures) — Pieter Abbeel’s CS287 (Berkeley)

Deep RL — John Schulman’s CS294–112 (Berkeley)

From Director of AI Research at Facebook and Professor at NYU Yann LeCun on Quora

In any case, take Calc I, Calc II, Calc III, Linear Algebra, Probability and Statistics, and as many physics courses as you can. But make sure you learn to program.

Dec 29, 2016 davmre on Machine Learning Crash Course: Part 2
You're welcome! FWIW I mostly agree with argonaut's point elsewhere in this thread - very few people successfully self-teach ML from a textbook alone. So whichever book(s) you choose, it might also be worth working through some course materials. I've already suggested Stanford's CS229 for solid foundations, but depending on your interests in bioinformatics, Daphne Koller's Coursera course on probabilistic graphical models ( ) might be especially relevant. Koller literally wrote the book on PGMs, has done a lot of work in comp bio, and her MOOC is apparently the real deal: very intense but well-reviewed by the people that make it through.
Oct 15, 2016 hrzn on Ask HN: How to get started with machine learning?
Gain background knowledge first; it will make your life much easier. It will also make the difference between just running black-box libraries and understanding what's happening. Make sure you're comfortable with linear algebra (matrix manipulation) and probability theory. You don't need advanced probability theory, but you should be comfortable with the notions of discrete and continuous random variables and probability distributions.
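The discrete/continuous distinction mentioned above is easy to make concrete. A minimal sketch, using only the standard library: a discrete variable has a probability mass function whose values sum to 1, while a continuous variable assigns probability 0 to individual points and is instead described by a density (here we just check the sample mean of a Gaussian).

```python
import random

# Discrete random variable: a fair die.
# Its distribution is a pmf: P(X = k) = 1/6 for k in 1..6, summing to 1.
pmf = {k: 1 / 6 for k in range(1, 7)}
pmf_total = sum(pmf.values())

# Continuous random variable: a standard Gaussian N(0, 1).
# Any single point has probability 0; the density integrates to 1,
# and the sample mean of many draws should be close to 0.
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
sample_mean = sum(samples) / len(samples)
```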

Khan Academy looks like a good beginning for linear algebra:

MIT 6.041SC seems like a good beginning for probability theory:

Then, for machine learning itself, pretty much everyone agrees that Andrew Ng's class on Coursera is a good introduction:

If you like books, "Pattern Recognition and Machine Learning" by Chris Bishop is an excellent reference of "traditional" machine learning (i.e., without deep learning).

"Machine Learning: a Probabilistic Perspective" book by Kevin Murphy is also an excellent (and heavy) book:

This online book is a very good resource to gain intuitive and practical knowledge about neural networks and deep learning:

Finally, I think it's very beneficial to spend time on probabilistic graphical models. Here is a good resource:

Have fun!

Oct 15, 2016 allenleein on Ask HN: How to get started with machine learning?
Courses You MUST Take:

1. Machine Learning by Andrew Ng ( ) /// Class notes: ( )

2. Yaser Abu-Mostafa’s Machine Learning course which focuses much more on theory than the Coursera class but it is still relevant for beginners.( )

3. Neural Networks and Deep Learning (Recommended by Google Brain Team) ( )

4. Probabilistic Graphical Models ( )

4. Computational Neuroscience ( )

5. Statistical Machine Learning ( )

If you want to learn AI:

Mar 15, 2016 hiddencost on Enrollment Is Surging in Machine Learning Classes
Sure, a couple things.

(I'm assuming you're comfortable with multivariable calculus.)

Andrew Ng's coursera course is good.

PRML (Pattern Recognition and Machine Learning) by Bishop is good, and has a useful introduction to probability theory.

You also want a good grounding in linear algebra. Strang is basically the authority on linear algebra:

You want a strong grounding in probability theory and statistics. (This is the basic language and intuition of the entire field.) I don't have as many preferences here (although it's the most important); someone in this thread pointed to a course on statistical learning at Stanford that's good.

A good understanding of optimization is helpful. Here's a link that leads to a useful MOOC for that:

There's a lot of other stuff (Markov decision processes, Gaussian processes, Monte Carlo methods come to mind) that is useful that I'm not pointing to, but if you've hit the other stuff here then you'll probably be able to find those things out.

If you're into it, this is good but not vital.

You may want to know about reinforcement learning. This answer does better than I can:

Deep learning seems popular these days :) ( )

Otherwise, it depends on the domain.

For NLP, there's a great Stanford course on deep learning + NLP ( ), but there's a ton of domain knowledge for most NLP work (and a lot of it really centers on data preparation).

For speech, theoretical computer science matters (weighted finite state transducers, formal languages, etc.)

For vision, again, Stanford: ( )

For other applications, well, ask someone else? :)


EDIT: unfortunately, there's also a lot of practitioner's dark art; I picked a lot up as a research assistant, and then my first year in industry felt like being strapped to a rocket.

Jan 13, 2016 Eridrus on Introduction to Statistical Learning, with Applications in R
There's a coursera course on Probabilistic Graphical Models:

I would guess that's more approachable than a textbook, but who knows.

Nov 30, 2015 anoopelias on Ask HN: What's your favorite online course?

Andrew Ng's ML Class -

Daphne Koller's PGM Class -

Dan Jurafsky's and Christopher Manning's NLP Class -

Oct 26, 2015 Schiphol on List of the Most Popular MOOCs
Thanks for this. Incidentally, the second paper you link to is co-authored by, among others, Daphne Koller, who teaches [this great course on probabilistic graphical models]( ), and Andrew Ng, who teaches [the best-known intro MOOC in machine learning]( ).
Oct 23, 2015 draven on Obstacles on the Path to AI
The title sounds familiar, it's also a course on coursera:

Last session was in 2013 though.

May 12, 2015 shogunmike on Ask HN: What are some good Machine Learning resources?
Some good books on Machine Learning:

Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Flach):

Machine Learning: A Probabilistic Perspective (Murphy):

Pattern Recognition and Machine Learning (Bishop):

There are some great resources/books for Bayesian statistics and graphical models. I've listed them in (approximate) order of increasing difficulty/mathematical complexity:

Think Bayes (Downey):

Bayesian Methods for Hackers (Davidson-Pilon et al):

Doing Bayesian Data Analysis (Kruschke), aka "the puppy book":

Bayesian Data Analysis (Gelman):

Bayesian Reasoning and Machine Learning (Barber):

Probabilistic Graphical Models (Koller et al):

If you want a more mathematical/statistical take on Machine Learning, then the two books by Hastie/Tibshirani et al are definitely worth a read (plus, they're free to download from the authors' websites!):

Introduction to Statistical Learning:

The Elements of Statistical Learning:

Obviously there is the whole field of "deep learning" as well! A good place to start is with:

Apr 11, 2015 aylons on Deep Learning vs. Probabilistic Graphical Models vs. Logic
I don't know how he learned it, but I studied it through the very demanding (and worth every second) course from Coursera:

Mar 29, 2015 guru_meditation on Monte Carlo Methods, Stochastic Optimization
I heartily recommend the notebooks published in this course as excellent applied reference material to estimation and optimization.

I love how code and coursework are intermingled, reminding me of Knuth's Literate Programming [1].

My beef with many other courses offered (including on Coursera) is that they use Matlab when it's clearly advantageous to use IPython Notebook as a better experimenting environment. For example, Daphne Koller's PGM course [2] is still in Matlab, and no matter what you do the code looks extremely clumsy and hard to read. N.B. I've written tens of thousands of lines of Matlab code, including GUI programs, but that doesn't mean it's a good language to use, especially in cases like this.



Jul 15, 2014 platz on Why Probabilistic Programming Matters
Something like PGM [1] (this is not a lightweight class) helps to understand the concepts. But it still seems like more of a niche domain right now than a general programming technique.

When one can apply it though, it really shines.

I understand the current implementation of matchmaking for Xbox Live is a big mess of imperative code; this is one area where knowledge of math can actually simplify the programming [2].

"Online gaming systems such as Microsoft’s Xbox Live rate relative skills of players playing online games so as to match players with comparable skills for game playing. The problem is to come up with an estimate of the skill of each player based on the outcome of the games each player has played so far. A Bayesian model for this has been proposed..." [3]
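The Bayesian skill model the quote refers to is Microsoft's TrueSkill. A minimal sketch of its two-player, no-draw update is below; the formulas are the standard truncated-Gaussian moment updates, but the parameter values (μ = 25, σ = 25/3, β = 25/6) are just common defaults, and this is an illustration, not the production Xbox Live code.

```python
import math

def _phi(t):
    """Standard normal pdf."""
    return math.exp(-0.5 * t * t) / math.sqrt(2 * math.pi)

def _Phi(t):
    """Standard normal cdf."""
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

def update_skills(winner, loser, beta=25 / 6):
    """One TrueSkill-style update after `winner` beats `loser`.

    Each player is a (mu, sigma) Gaussian belief over skill; beta is the
    per-game performance noise. Returns the two updated (mu, sigma) pairs.
    """
    (mu_w, s_w), (mu_l, s_l) = winner, loser
    c = math.sqrt(2 * beta ** 2 + s_w ** 2 + s_l ** 2)
    t = (mu_w - mu_l) / c
    v = _phi(t) / _Phi(t)   # mean shift of the truncated Gaussian
    w = v * (v + t)         # variance-shrinkage factor, in (0, 1)
    new_w = (mu_w + s_w ** 2 / c * v,
             s_w * math.sqrt(max(1 - s_w ** 2 / c ** 2 * w, 1e-12)))
    new_l = (mu_l - s_l ** 2 / c * v,
             s_l * math.sqrt(max(1 - s_l ** 2 / c ** 2 * w, 1e-12)))
    return new_w, new_l

# Two unrated players: after one game the winner's mean moves up,
# the loser's moves down, and both beliefs become more certain.
p1, p2 = update_skills((25.0, 25 / 3), (25.0, 25 / 3))
```

Because the outcome is a single "A beat B" observation, the update is exactly the kind of message passing a PGM course teaches: condition a Gaussian prior on a truncated-Gaussian likelihood and read off the posterior moments.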




HN.Academy is an independent project and is not managed or owned by Y Combinator, Coursera, edX, or any of the universities and other institutions providing courses.
~ [email protected]
~ Privacy Policy ~