Check it out: Todd Beck has calculated the effectiveness of various college hoops computer rankings. Actually, he did this in real time during the season. Some observations…

Looking at the second-half results, most systems fall on the correct side of the spread 50-52% of the time. That isn’t very good in the grand scheme of things, but most systems weren’t designed with beating the spread in mind. Still, any system advertising itself as predictive might aspire to do a little better (myself included).
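For context, a quick back-of-the-envelope calculation (assuming the usual -110 juice, which the tracker itself doesn’t spell out) shows why 50-52% against the spread isn’t enough to mean much:

```python
# Break-even rate against the spread at standard -110 odds (risk 110 to win 100).
# At that price you need to win 110 / (110 + 100) ~= 52.4% of your bets just to
# break even, so a system in the 50-52% range is still losing to the vig.
def breakeven_win_rate(risk: float = 110.0, win: float = 100.0) -> float:
    """Fraction of spread bets you must win to break even at the given odds."""
    return risk / (risk + win)

if __name__ == "__main__":
    print(f"Break-even vs. the spread at -110: {breakeven_win_rate():.1%}")
```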

Jeff Sagarin’s predictive ratings were the worst of his three systems at predicting. I think the intent was for them to be his best system at predicting, hence the name “Sagarin Predictive.” Instead, they were the worst of any system against the betting public and darn near last at predicting actual winners. So while my system has produced pretty lame results, I don’t feel so bad, considering Sagarin’s ratings are usually the most respected.

I think this means it’s time for a new system. Maybe I can’t do much better than I already have (which isn’t very good); basketball is so unpredictable that any system is going to look foolish a certain amount of the time. A start will be to ditch raw scores and use something along the lines of the offensive and defensive efficiency ratings I’ve discussed previously. We’ll see what happens.
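As a rough sketch of what that would look like, efficiency ratings boil down to points scored and allowed per 100 possessions, with possessions estimated from the box score. This is only a sketch, assuming the common possession estimate (FGA - OR + TO + 0.475 * FTA) and made-up stat values:

```python
# Tempo-free building blocks: possessions estimated from box-score totals,
# then points per 100 possessions. The 0.475 free-throw coefficient and the
# example numbers below are assumptions for illustration.
def possessions(fga: int, oreb: int, to: int, fta: int) -> float:
    """Estimate a team's possessions from its box-score totals."""
    return fga - oreb + to + 0.475 * fta

def efficiency(points: int, poss: float) -> float:
    """Points per 100 possessions: offensive if the team scored them,
    defensive if the opponent did."""
    return 100.0 * points / poss

if __name__ == "__main__":
    # Hypothetical box score: 75 points on 60 FGA, 12 OR, 14 TO, 22 FTA.
    poss = possessions(fga=60, oreb=12, to=14, fta=22)
    print(f"Possessions: {poss:.1f}")
    print(f"Offensive efficiency: {efficiency(75, poss):.1f}")
```

The point of rating teams this way instead of by raw scores is that it strips out pace: a 75-70 game in 60 possessions says something very different about the two offenses than the same score in 80 possessions.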