I already wrote a review for the first part of this course, so I won't repeat the main points here (namely, that the course is great and of very good quality). The topics covered in the second part are also extremely common and useful (e.g. string processing, regular expressions, data compression), but there are more advanced topics as well (e.g. graph processing, intractability). Graph theory takes just three weeks, so it may seem like a lot of new material to learn in such a short period, but there's a one-week break to catch up (which is nice). Overall this course took me more time to complete than the first part, and in the end I decided to take a break from algorithms for a while :) For me there are almost no dislikes. Some algorithms (like linear programming and Manber–Myers) were covered only briefly, but there really wasn't enough time for all of them. So I guess those are topics for more advanced courses (e.g. on linear and integer programming) or for self-study.
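To give a flavor of the suffix-array topic behind the Manber–Myers mention above, here is a naive construction sketch (my own illustration, not course code; the actual Manber–Myers algorithm achieves O(n log n) via prefix doubling, while this simple version sorts full suffixes in O(n² log n)):

```python
def suffix_array(s):
    """Return the starting indices of all suffixes of s in lexicographic order.

    Naive construction: sort suffix start positions by comparing the
    suffixes themselves. Fine for small strings; Manber-Myers replaces
    the full-suffix comparison with rank-pair prefix doubling.
    """
    return sorted(range(len(s)), key=lambda i: s[i:])
```

For "banana", the suffixes sorted lexicographically are "a", "ana", "anana", "banana", "na", "nana", which start at indices 5, 3, 1, 0, 4, 2.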
Well, I really love courses that are challenging, that make you think, that leave you feeling you know little indeed, and that give you a sense of victory after periods of frustration. Of course, there are people who hate that, so while I give it 5/5, some will give it 1. I quickly forget things I learn, and if I don't have to think hard about a problem it's gone in a week or a month; not so with the problems in this course.

The course is roughly divided into three parts: an intro to probability, learning HMMs, and inference. The intro to probability was very steep and left me with the feeling I should take that other great MIT probability course mentioned in reviews here (6.041x). I liked the instructor and the videos; he was genuinely passionate about the subject and active in the forums (I was lucky to catch the start of the session), which made it much more fun to learn.

The exercises were challenging and there was a great deal of code to write, so I often had to stay up late at night; there were also quizzes where one had to calculate by hand. I usually don't read MOOC forums, as it's often easy to complete exercises myself, but during this course I used the forums a lot because very often I had no clue what I should do in the exercises or which direction to take (and it seems I was not alone). Yes, it was often frustrating, but I learned and understood much more in the end.

There was also an interesting time series analysis project at the end of the course, but it was optional and I didn't have much time (it was holiday season), so I just verified that the forward-backward algorithm didn't work on that data and didn't push it through to the end; still, it was an interesting challenge. To sum up, I'm very grateful to MIT, the instructors, and all the course participants for this wonderful opportunity to learn.
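For readers unfamiliar with the forward-backward algorithm mentioned above, here is a minimal sketch for a discrete HMM with known parameters (my own illustration, not the course's code; variable names are assumptions):

```python
import numpy as np

def forward_backward(obs, pi, A, B):
    """Posterior state probabilities for a discrete HMM.

    obs: sequence of observation indices
    pi:  initial state distribution, shape (S,)
    A:   transitions, A[i, j] = P(next=j | cur=i), shape (S, S)
    B:   emissions, B[i, k] = P(obs=k | state=i), shape (S, K)
    """
    T, S = len(obs), len(pi)
    alpha = np.zeros((T, S))   # scaled forward messages
    beta = np.zeros((T, S))    # scaled backward messages
    scale = np.zeros(T)

    # Forward pass, rescaling each step to avoid numerical underflow.
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]

    # Backward pass, reusing the forward scaling factors.
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]

    gamma = alpha * beta                       # proportional to P(state_t | obs)
    gamma /= gamma.sum(axis=1, keepdims=True)  # normalize each time step
    return gamma
```

The returned matrix gives, for each time step, the posterior probability of each hidden state given the whole observation sequence.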
This is course #4 (out of 4) in the Machine Learning specialization from the University of Washington on Coursera. Yes, there were initially 5 courses plus a capstone project, but the last two were removed. It seems to me this was related to the Dato (Turi) acquisition by Apple, but I don't know if there was any announcement. If I were rating the whole specialization, I would take one star off for that. However, I liked this course as much as the other three. I've already written reviews for #1 and #2, so perhaps it makes more sense to review the specialization as a whole rather than just #4. The instructors are very cool and engaging, know what they are talking about, and can explain the material reasonably well. It's not a very math-heavy course, but it's not watered down either. I would say it's just about right, so one can gain intuition and an understanding of what's going on and then go read more complex books and papers. I also liked that some usually-omitted topics were covered (like LSH, mixtures of Gaussians, and kd-tree pruning for nearest neighbors). I encourage you to read the syllabi before you start, to see what you will learn. One thing I didn't like was how little code was written in the exercises: I hardly had to write any code at all, and when I did, it was always very clear what I should do. If I were starting the specialization today I would use the "sklearn way" (there are very detailed instructions for sklearn) and wouldn't use GraphLab at all. In any case, the exercises are very helpful for making sure you really understand what you've learned, so I purchased course #4 (I finished #1-3 before Coursera switched to the new pricing model; I would have paid for #2 and #3 but would have skipped the exercises and quizzes in #1, as it's a very easy introductory course, and if you plan on taking further courses you will be doing similar exercises anyway). Overall, a 5/5 specialization, very helpful.
Part 2 of a 6-part series on Machine Learning from the University of Washington and Dato. This course is much, much better than the first one: it's more challenging, and more theoretical explanation is given, so you can understand regression inside and out. Coverage is quite good (lasso feature selection, kernel methods, etc.), and the instructors explicitly list the points they omitted at the end, so you can go take a look. I also appreciate the choice between scikit-learn and SFrames with Dato GraphLab in the exercises; building everything from the ground up is an excellent exercise, while using Dato and SFrames is a good option for those with less time (though in this part of the course you build more things from scratch, so it's almost an equal choice either way). I also like the way Emily and Carlos teach (though there's no Carlos in this part) and how engaged and excited they are about MOOCs. One minus for me was that the forums are community-driven and I didn't see any replies from the professors there. Overall, highly recommended.
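As a taste of the "build it from scratch" style of exercise mentioned above, here is a minimal linear regression fit by batch gradient descent (my own sketch under assumed names, not the course's assignment code):

```python
import numpy as np

def fit_linear_regression(X, y, lr=0.1, epochs=2000):
    """Fit y ~ X @ w + b by batch gradient descent on mean squared error."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        resid = X @ w + b - y           # prediction errors
        w -= lr * (X.T @ resid) / n     # gradient of MSE w.r.t. weights
        b -= lr * resid.mean()          # gradient of MSE w.r.t. intercept
    return w, b
```

Lasso adds an L1 penalty on `w` on top of this objective (typically optimized by coordinate descent rather than plain gradient steps), which is what drives some coefficients exactly to zero for feature selection.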
First of all, this is an entry-level, very easy introductory course. It should give you general awareness of the topic, an overview of things you can achieve with machine learning, etc. Don't expect math or even explanations of the algorithms here. Nonetheless, even though I have some deeper knowledge of the topic, the course was very enjoyable, and I look forward to the more detailed courses in the Machine Learning specialization on Coursera. The instructors were amazing, very enthusiastic, and went to great lengths to explain concepts and get across the ideas they wanted to convey (I'm not new to the subject, so I might be wrong here, but my impression was that they were very cool). The assignments were interesting and in Python notebook format, which I like; I also find doing assignments in Python more fun than in MATLAB. Things I didn't like: the forum was not very active (though I didn't monitor it closely), and the Android version of the course didn't have a button to download videos.
I'm taking this as a follow-up to the Apache Spark for Big Data course. It's still an introductory course and a bit light on the subject, but then, if you had to cover Spark, machine learning, and scaling techniques in detail, that would be quite a course! I took Mining Massive Datasets from Jeff Ullman, and it was much, much heavier on math and theory; in fact, there was only one optional real-world programming assignment there, as far as I remember (and I tried for a long time to solve it; it was really, really hard for me). So this course strikes a good balance between theory, scalability, and programming assignments, and was thus able to teach me more, I guess. The IPython notebook format is quite good, though the assignments are a bit harder here than in the Big Data course, and I had to scroll up and down quite often to look up variable names, etc. Anyway, a very good introductory course. I would love to see more advanced and difficult material to come.
Please set your expectations right: it's a very good course to get started with Apache Spark, but not that great if you want a deep dive into statistics and mathematics. The instructor explains everything really well, the assignments are easy to do and moderately interesting, and the assignment format (IPython notebooks) really lets you focus on the problem rather than the setup, though I sometimes felt I could solve problems almost mechanically because they were too polished. Nonetheless, this course is good value if you want to get started with Spark concepts, and it doesn't require too much time to complete.
First of all, I didn't see a button to leave a review on CourseTalk, so if you have the same problem, try the link http://www.coursetalk.com/coursera/mining-massive-datasets/review.

I found this course very challenging (about 10 hours of work a week, perhaps more) and pretty theoretical. You'll get tons of useful theory and very little practice. All the quizzes and the final are about doing math calculations, not about coding systems that do the job. Of course, it would be very nice to have both, but that would require more time. There was only one (!) programming assignment, on LSH, and it was entirely optional.

The content is superb. There's a free book available at mmds.org, which is very cool. I called this a theoretical course above, but I've got to say this theory is extremely useful (or should be, at least) in practice. The course deals with tough subjects, like what to do if you're implementing a recommender system, for instance, and have lots and lots of data that won't fit into memory. How do you avoid disk thrashing? This IS tricky, isn't it? And things like that. You know what statistics is, for sure; you know (or have at least heard of) SVMs, NNs, CF, and the like, but you probably haven't tried to implement them at web scale, have you? :) For absolute beginners in machine learning (experience ~ none), this course might be useful as well (perhaps with more work to do), because brief introductions to the corresponding topics are given.

Now, a few BUTs. I can't give this course 5 stars, and I was doubtful even about 4, because the organization was terrible: missing pages on Coursera, missing links (e.g., to see exam results you had to guess the right link!!), and a course structure that was not intuitive at all. I don't know why, but they changed the order of the videos completely, with odd references to future lectures all the time (only they didn't say those were future lectures).
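For anyone curious what the LSH assignment topic is about, here is a minimal MinHash sketch (my own illustration, not the course's assignment; function names are assumptions). MinHash signatures let you estimate Jaccard similarity between large sets cheaply, and locality-sensitive hashing then bands these signatures to find candidate similar pairs without comparing everything to everything:

```python
import random

def minhash_signature(items, num_hashes=100, seed=0):
    """Compute a MinHash signature for a set of hashable items.

    The fraction of matching positions between two signatures is an
    unbiased estimate of the Jaccard similarity of the two sets.
    """
    rng = random.Random(seed)
    # Random salts stand in for independent hash functions; real systems
    # use proper hash families.
    salts = [rng.getrandbits(32) for _ in range(num_hashes)]
    return [min(hash((salt, x)) for x in items) for salt in salts]

def estimate_jaccard(sig_a, sig_b):
    """Estimate Jaccard similarity from two MinHash signatures."""
    matches = sum(1 for a, b in zip(sig_a, sig_b) if a == b)
    return matches / len(sig_a)
```

Identical sets produce identical signatures (estimate 1.0), while disjoint sets almost never agree in any position.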
The instructors are great and know the stuff they're teaching, but some of them just can't teach. Jeff Ullman is a cool guy, to be sure, but his lectures were particularly dull; how to put it: the content was superb, but the lecturer's flat delivery and droning voice put me to sleep in no time :) In this regard I liked Jure Leskovec best; he seemed passionate at least. So: great content, terrible organization, learned a lot (but not in depth, because to learn these things in depth you need more than 10 hours of work a week for 3 months, believe me), and the lecturers were so-so.
I've taken a number of computer science courses, and this one is definitely among the top. Robert Sedgewick speaks a bit slowly, but his explanations are great and easy to understand. The drill exercises are a bit boring, but the programming assignments and final exams were very engaging and of high quality. The programming interview questions were also an interesting addition to the course, though not enough help was provided to solve them (only small hints, and not very active sections in the discussion forums). I had to google for answers :) and some still remain unsolved for me. The first part focuses on basic topics like searching and sorting, and also covers basic data structures, the stuff, really, that every programmer should know. So, if you don't know how to implement a hash table, you may consider taking this course :) There's not much math in this course; the focus is on learning different algorithms and data structures and using them to solve day-to-day problems, not on developing mathematical models to analyze algorithms and the like. Java basics are required if you want to get feedback from the autograder, but I think programmers in other languages might also find this course useful (if they don't mind some Java :). Finally, there's also a book and an excellent website, so you can really get "it".
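On the "implement a hash table" point above, a minimal separate-chaining version might look like this (my own sketch, not the course's Java code; the course itself covers this and more, including resizing and linear probing):

```python
class ChainedHashTable:
    """Minimal hash table using separate chaining (a list per bucket)."""

    def __init__(self, num_buckets=16):
        self.buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        # Map the key's hash to one of the fixed buckets.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # new key: append to the chain

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)
```

A production version would also track the load factor and resize the bucket array when chains grow too long, which is what keeps operations O(1) on average.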