More on Udacity and Massive Open Online Courses

I’ve posted previously about Udacity’s Programming a Robotic Vehicle course, and I think that online courses may come to revolutionize the college experience. However, I recently read a thorough and well-written critique that reaches the opposite conclusion, and thought it worth revisiting the topic.

Since taking that course, I’ve also started Udacity’s statistics course, mostly as a refresher, and I’m about two-thirds of the way through.  Thus, I read with interest the critique by AngryMath, a college-level statistics instructor in New York City.  He has a very harsh assessment of the course and, by extension, the whole MOOC model.  He highlights 10 major problems, and I think he makes a lot of good points, but he over-generalizes in his criticism.  You can read his full critique on his site; here I list some of his points and offer my thoughts:

  1. Lack of Planning: I agree.  The original syllabus seems to have been written in advance of the course itself; it’s not particularly well organized and doesn’t introduce material in the best order.  It seems much poorer in this respect than the robotic vehicles class.  However, that’s an indictment of this particular course, and poor planning can occur in online and conventional courses alike.
  2. Sloppy Writing: Again, I agree.  AngryMath also cites the lack of a textbook or written material.  I don’t think a textbook is needed, but course notes similar to those developed for the robotics course would be helpful; I certainly found them helpful in that course.  AngryMath has only sampled the statistics course, so he hasn’t seen that approach.
  3. Quizzes: He thinks they aren’t in the best places, and he criticizes when they’re used to introduce material.  I DON’T agree that they come out of the blue to introduce material, and I like that approach occasionally anyway: try things out on your own first.
  4. Population and Sample not properly distinguished: Again, AGREE!  I’m taking this as a refresher, so I knew a distinction was being overlooked and researched it again on my own.  More generally, this is a criticism I have of both Udacity courses I’ve taken: it’s fine to simplify and present material at a high level, but let the student know you’re doing it and where to get more information.
  5. Final Exam Certification: AngryMath criticizes the certificates as meaningless, because you can repeatedly answer the same static questions until you get them right.  Well, yes, at the present time, for this course, BUT: 1) Udacity makes no claims about the value of the current certificate, and is already working to set up standardized, monitored testing through existing testing companies to offer “real” certification.  If you’ve ever taken a test at one of these facilities, it’s more heavily monitored than a typical college final.  2) While a few multiple-choice questions can be taken again and again until you get them right without true understanding, it’s easy to add variation to the questions, even if that wasn’t done in this course, and for courses with programming assignments, random guessing won’t work.  And finally, 3) the retake-until-you-get-it-right approach can be a good learning model if you eliminate the other issues and vary the questions: it indicates what you’ve learned by the end, rather than how much you picked up the first time.

So, while I’m learning from the class, and intend to complete it, I agree with AngryMath that the course is rather poorly done, has errors, and could be much better.  Where I disagree is with his generalization:

“Some of these shortcomings may be overcome by a more dedicated teacher. But others seem endemic to the massive-online project as a whole, and I suspect that the industry as a whole will turn out to be an over-inflating bubble that bursts at some point, much like other internet sensations of the recent past.”

I believe MANY of the problems in this class are specific to the class, especially having taken the much better Programming a Robotic Car class.  While the online model has many shortcomings compared to live teaching, unlike AngryMath I believe many of them can be adequately worked around.  For example, I found far more information in the crowd-sourced robotic vehicle course forum and wiki than I can recall getting from office hours and recitation sections in a live class.  Other shortcomings will clearly remain.  BUT, and this is where I part company totally, I don’t believe the shortcomings outweigh the savings.  If I can get 80% as good a class at a tenth or less of the current, and rapidly rising, cost of a traditional college course, then this is clearly the future.  Perhaps not entirely, and I hope not, as the college experience is often one to be treasured.  But I can easily see competition from MOOCs forcing a new model: perhaps just one or two years on campus, with the rest done through far cheaper MOOCs.

Also, online courses offer a chance to “teach the long tail.”  Small colleges can’t have the breadth of faculty to cover all topics at advanced undergraduate or graduate levels, and maybe only 1 or 2 students at a college are interested in a given topic.  Many times, colleges and universities in physical proximity will offer the opportunity to get credit for classes taught at nearby schools (e.g., the Five College Consortium in western Massachusetts).  Imagine “virtual consortia” of hundreds of schools throughout the country (or even the world).  Through this model, the 1-2 interested students per college can be virtually assembled into a sizable class, taught by a well-qualified professor from one of the consortium’s schools.

Bottom line: AngryMath has a strong, valid critique of Udacity’s Statistics 101 course, but it’s dangerous to infer the quality of other current classes from a sample of one, and equally dangerous to extrapolate to the future.

For yet another take, there’s a good article in Forbes: Massive Open Online Courses — A Threat Or Opportunity To Universities?

Compass Trials and Tribulations

I finally got back to working on my first robotic vehicle, and it’s been one step forward, two steps back, with one of the back steps being self-inflicted.  My first go at a robotic vehicle used only wheel encoders for dead reckoning, which worked all right for measuring distance traveled but suffered the well-known lack of both precision and accuracy in heading.  So for determining heading, I decided to add an electronic compass, in particular the Devantech CMPS10 tilt-compensated compass.  I don’t really need the tilt compensation, since I’m running over flat ground and flooring, but I figured I may want to reuse the compass later in another project, and it’s inexpensive for a tilt-compensated compass.
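As background, wheel-encoder dead reckoning for a differential-drive robot boils down to a pose update along these lines.  This is a sketch in Python rather than my actual robot code, and the constants TICKS_PER_METER and WHEEL_BASE are made-up illustrative values:

```python
import math

TICKS_PER_METER = 1000.0   # encoder ticks per meter of wheel travel (illustrative)
WHEEL_BASE = 0.25          # distance between the wheels in meters (illustrative)

def dead_reckon(x, y, theta, left_ticks, right_ticks):
    """Update pose (x, y, heading in radians) from encoder tick counts."""
    d_left = left_ticks / TICKS_PER_METER
    d_right = right_ticks / TICKS_PER_METER
    d_center = (d_left + d_right) / 2.0            # distance the center moved
    d_theta = (d_right - d_left) / WHEEL_BASE      # change in heading
    # Small-motion approximation: move along the average heading.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta
```

Notice that the heading update depends on the small *difference* between two noisy tick counts, which is exactly why heading from encoders alone drifts so badly, and why a compass looked attractive.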

So I added the compass and ran a simple test program.  The first issue: there was clearly interference from the metal, motors, and/or electronics.  Mounting the compass on an aluminum mast seems to have cleared that up.  One problem down.

Next I modified my robot code and descended into a several-hour debugging nightmare.  As is often the case, the bug is obvious in hindsight, with the symptoms pointing right at it.  My robot ran forward the set distance, then, when it should have turned left, it spun right in endless circles, bringing to mind the Tommy Roe song Dizzy for those of us of a certain age.  In debugging, I noticed that while the simple compass-reading test program worked fine, when I loaded the full robot code, the bearing jumped in large discrete increments of about 22 degrees.  Curious, and obviously a clue, but I couldn’t figure out what it meant.  Only after a couple hours of staring at code and trying small incremental changes did I spot the problem.  Where I should have typed lowerByte = compass.read(), I had instead typed lowerByte - compass.read().  The higher-precision result from the compass is sent in two bytes, and I was never actually setting the lowerByte value, resulting in the large discrete jumps.  One self-inflicted problem solved.
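For the curious, the CMPS10’s high-resolution bearing arrives as two bytes that get combined into a 16-bit value in tenths of a degree (as I recall from the module’s documentation).  A Python sketch of the idea, since my robot code itself isn’t Python:

```python
def combine_bearing(high_byte, low_byte):
    """Combine the two CMPS10 bearing bytes into degrees (0.0-359.9)."""
    raw = (high_byte << 8) | low_byte   # 16-bit value in tenths of a degree
    return raw / 10.0
```

With the low byte stuck at a constant (my "-" for "=" typo), only the high byte changes, so the reported bearing can only move in steps of 256/10 = 25.6 degrees, which is in the ballpark of the discrete jumps I was seeing.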

Why did the robot turn right instead of left?  Either the code was erroneously jumping into the middle of the obstacle avoidance routine, where it tries turning right to avoid obstacles, or something else was going on.  This was relatively easy to isolate: when the robot is hooked up to the PC with debugging serial.print statements on, it dumps its current state and parameters each time through the loop, so I quickly saw it wasn’t a bad state change.  The problem was self-inflicted wound number two: a sign error.  A turn that’s positive in my coordinate system can be a negative change in terms of compass bearing (e.g., turning from heading 90 degrees to heading 0 degrees).  So, flip the sign and I’m in business.  An easy one.
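Sign handling with compass bearings is easy to get wrong because of the wrap-around at 0/360.  One standard way to compute the signed shortest turn between two bearings, sketched in Python (again, not my actual robot code):

```python
def bearing_error(target, current):
    """Signed shortest turn from current to target compass bearing, in degrees.

    Result is in [-180, 180): positive means turn clockwise
    (increasing bearing), negative means turn counterclockwise.
    """
    return (target - current + 180.0) % 360.0 - 180.0
```

The modulo trick handles the wrap-around, so a robot at bearing 10 aiming for 350 turns 20 degrees counterclockwise rather than 340 degrees clockwise.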

Now it moves straight to the first waypoint, turns in the proper direction, but not to the correct heading.  There’s a lag in the compass reading that I need to account for.  At least that’s not a silly mistake, and I know the source of the problem.

Here’s a picture of my initial compass test (without the robot), and then MARV-1 with the compass mounted on the mast:

Tilt Compensated Compass Test displaying bearing


MARV-1 robotic vehicle with electronic compass on mast

Recommended: Udacity’s CS373: Programming a Robotic Vehicle

I haven’t posted recently because I haven’t had time to mess with my robots.  Instead, my free time has been taken up with learning more theory, via Udacity’s free 7-week class.  There will be a new session starting next month, and I highly recommend that you check it out.

The class gives a broad but hands-on introduction to key robotics concepts and algorithms.  It covered localization, filtering (Monte Carlo, Kalman, and particle filters), pathfinding (intro to A*, dynamic programming, etc.), PID (proportional-integral-derivative) control, and something called graph SLAM (Simultaneous Localization and Mapping).  If you’re already well-versed in one or more of these, you probably won’t learn anything new on that subject, but if you have only a passing familiarity, or none, the course is great.
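The PID unit was the most directly useful one for my own robot.  The textbook form of the controller, sketched in Python (the course’s language; this is the standard formulation, not code from the course, and the gains in the usage note are made-up values):

```python
def make_pid(kp, ki, kd):
    """Return a stateful PID step function mapping (error, dt) -> control output."""
    state = {"integral": 0.0, "prev_error": None}

    def step(error, dt):
        state["integral"] += error * dt            # accumulate the I term
        if state["prev_error"] is None:
            derivative = 0.0                       # no slope on the first sample
        else:
            derivative = (error - state["prev_error"]) / dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative

    return step
```

For a steering controller you might create something like steer = make_pid(0.5, 0.001, 2.0) and call steer(crosstrack_error, dt) each loop; the course spends a full unit on how each gain shapes the behavior (the D term damps oscillation, the I term removes steady-state bias).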

The format of Udacity’s courses is what really stands out: short 5-10 minute videos, each with a question or short programming assignment at the end.  It’s a slightly higher-tech Khan Academy: mostly an electronic whiteboard and pen, plus some videos from Google’s autonomous vehicle project and the DARPA challenges.  The programming is done in Python and submitted directly from the web page (although I recommend an IDE for the weekly homework programming).

Beyond the robotics course, I think this is the start of the path to the future of college education.  I think it’s much like newspapers and the print Encyclopaedia Britannica: they have valuable features that the online experience can’t duplicate, but the cost differential is just too great to sustain the old model.  When you can offer a college-level course to thousands of students at once, online, and crowd-source support to partially make up for the lack of direct, one-on-one help, it’s hard to imagine that, in 10-30 years, this won’t be the future of a college degree.  Now, it’s not there yet.  This was a beta run, and the numerous glitches and automated grading problems made that abundantly clear.  In addition, there’s a lot to work out, especially for non-tech courses.  But, compared to $10-$50K per year for a resident college degree?  I think I may have seen the future of education.

UPDATE 2/8/2013: I’m more convinced than ever that some sort of blended mix of low-cost online courses and a reduced “residency requirement” will be at least one model for future degrees.  Private tuition costs have been rising faster than healthcare costs, while college credit is already becoming available for some online courses.  The University of California system has partnered with Udacity to offer a couple of lower-division and remedial courses for credit, online, for $150.  And now, the American Council on Education (ACE) has approved five Coursera courses for “credit equivalency.”  Personally, I loved the college experience, but at today’s prices it’s becoming unaffordable for too many, and/or imposing a huge debt.