Monday, May 04, 2009

Improving movie recommendations

If you haven't yet checked out the new CACM blogs, you need to soon. One of the posts that caught my attention was Greg Linden's What is a Good Recommendation Algorithm? Linden wonders whether Netflix's million-dollar prize for a better recommendation engine is a little short-sighted. The goal of their recommendation system is simply to predict how much people might like a movie. But Linden points out:
However, this might not be what we want. Even in a feature that shows people how much they might like any particular movie, people care a lot more about misses at the extremes. For example, it could be much worse to say that you will be lukewarm (a prediction of 3 1/2 stars) on a movie you love (an actual of 4 1/2 stars) than to say you will be slightly less lukewarm (a prediction of 2 1/2 stars) on a movie you are lukewarm about (an actual of 3 1/2 stars). Moreover, what we often want is not to make a prediction for any movie, but find the best movies. (emphasis mine)
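Linden's point can be made concrete with a toy metric. Below is a minimal sketch (entirely hypothetical; neither Netflix nor Linden uses this exact weighting) contrasting the Netflix Prize's plain RMSE, which treats both of his one-star misses identically, with a variant that penalizes misses on movies the user loved:

```python
def rmse(predictions, actuals):
    """Plain root-mean-squared error, the Netflix Prize metric."""
    n = len(actuals)
    return (sum((p - a) ** 2 for p, a in zip(predictions, actuals)) / n) ** 0.5

def weighted_rmse(predictions, actuals):
    """Hypothetical variant: errors on movies the user loved
    (actual rating of 4 stars or more) count double."""
    n = len(actuals)
    num = sum((2.0 if a >= 4.0 else 1.0) * (p - a) ** 2
              for p, a in zip(predictions, actuals))
    return (num / n) ** 0.5

# Linden's two example misses, each off by exactly one star:
# a lukewarm prediction on a loved movie (3.5 vs 4.5), and a
# slightly-off prediction on a lukewarm movie (2.5 vs 3.5).
print(rmse([3.5], [4.5]), rmse([2.5], [3.5]))                    # same error: 1.0, 1.0
print(weighted_rmse([3.5], [4.5]), weighted_rmse([2.5], [3.5]))  # first miss now costs more
```

Under the plain metric the two mistakes are indistinguishable; the weighted version makes the miss on the loved movie hurt more, which is closer to how a customer actually experiences it.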

Shifting gears a little, I want to talk about a couple of small fixes to an existing movie recommendation system that could make customers a lot happier.

I haven't used Netflix, but I've been using Blockbuster Online for over a year, and I've played with their recommendation feature a lot. I would assume their recommender is on par with Netflix (hint: someone needs to compare the two).

One feature Blockbuster offers allows you to select "Do not show me this movie again", a little icon on the side of each movie's ratings. I've clicked this icon a lot (is it just me, or is there a lot of garbage out there?), hoping Blockbuster would stop recommending these specific movies to me, and others like them. However, the screen shot below is what I saw this morning when I logged into my account:

Note how I was recommended "Zack" and "Quarantine" despite having clicked on the no-show icon weeks ago. They also recommend "Changeling", a movie I've already rated (and therefore obviously seen). But since I didn't rent "Changeling" directly from Blockbuster, they still offer it as a movie I "might have missed."

These movies do not appear in my formal set of recommendations (the screen that results from clicking on the Recommendations link), so my guess is Blockbuster uses a different set of algorithms to populate the might-have-missed list than it uses for the formal recommendation list. However, I suggest that the might-have-missed list should take advantage of previous ratings to improve overall customer satisfaction.

This should be common sense: Do not suggest a movie that a user has already marked "do not show me this movie again". Especially not on the first page the user sees when logging into your site.
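The fix really is a one-line filter. Here is a minimal sketch (the movie titles and data structures are illustrative; Blockbuster's internals are obviously unknown to me):

```python
def might_have_missed(candidates, blocked, rated):
    """Drop movies the user has blocked ("do not show me this movie
    again") or has already rated, before building the front-page list."""
    return [m for m in candidates if m not in blocked and m not in rated]

# Illustrative data: "Up" stands in for any movie the user hasn't acted on.
candidates = ["Zack", "Quarantine", "Changeling", "Up"]
blocked = {"Zack", "Quarantine"}   # clicked the no-show icon
rated = {"Changeling"}             # already rated, so already seen

print(might_have_missed(candidates, blocked, rated))  # ['Up']
```

Two set-membership checks per candidate movie is cheap by any standard, so there is no performance excuse for skipping this step on the login page.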

One more point. Below is a screen shot from the first page of recommendations made by Blockbuster. None of the movies below appeal to me, but I can see how they might have been recommended based on my viewing history and ratings.

But one movie really stands out as a bad recommendation: "Swing" (bottom-left). Note how it has only received two stars on average, equivalent to "I didn't like this movie".

Why would Blockbuster think I would like this movie when most people don't?

I know my taste in movies is probably not typical, but I don't think I've ever given a movie with an average rating of two stars a rating better than two stars. Even if Blockbuster thinks this movie matches my tastes, it would make much more sense to put movies with higher overall ratings on the first result page and bump lower rated movies back a few pages.
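This re-ranking is also simple to sketch. The code below is a hypothetical illustration (the threshold, titles, and average ratings are made up): keep the recommender's own ordering, but push movies whose average rating falls below a cutoff toward the back pages.

```python
def order_for_display(recs, avg_rating, min_front_page=3.0):
    """Hypothetical re-ranking: preserve the recommender's order, but
    move movies with a low average rating behind the rest, so they land
    on later result pages instead of the first one."""
    front = [m for m in recs if avg_rating[m] >= min_front_page]
    back = [m for m in recs if avg_rating[m] < min_front_page]
    return front + back

# Illustrative data: "Swing" averages two stars ("I didn't like this movie").
recs = ["Swing", "The Fall", "Moon"]
avg = {"Swing": 2.0, "The Fall": 4.0, "Moon": 4.5}

print(order_for_display(recs, avg))  # ['The Fall', 'Moon', 'Swing']
```

Note that nothing is discarded: a two-star movie the model insists matches my taste can still appear, just not on the very first page I see.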

My experience in general has been that Blockbuster's recommendations don't really work. I've found one recommended movie in the past year that I thought looked interesting. Then again, I don't often try iffy movie recommendations because I'm not ready to gamble on two hours of a nice evening.

I'm looking forward to a time when the recommendation system really works well, but until then, I'll be consulting with my friends and family who have a much better idea of what I really like to see.