The other week I couldn’t remember the name of a particularly nice bottle of wine I had had a while back. Eventually I figured out that it was this one.
Joining Cork’d three years ago was supposed to solve all this. Cork’d is a website for sharing wine reviews. In theory all your friends join up as well and recommend wines to each other, with all of this wrapped up in a gentle seasoning of Web 2.0 goodness.
It hasn’t really worked like that. Very few people I know ever joined. Few New Zealand or Australian wines, the kind that I was likely to find in shops, were ever reviewed there. And so I fell out of the habit of using the site.
So the other week, when I was thinking about this bottle of wine, I didn’t even think to visit Cork’d (which was a pity, because it’s one of the few wines I actually have reviewed there).
Anyway, the whole experience got me thinking about how useful the iPhone could be for keeping track of this sort of thing. Cork’d doesn’t have an iPhone app or mobile website (though according to their CTO they’re working on one), so for the last while I’ve been trying out lots of different apps.
I’ve got a blog post about all this in the works. But in the meantime, I can’t say it’s been a uniformly great experience, and I’ve come to doubt the whole social approach. It turns out that:
- The universe is large. There are more wines in the world than there are enthusiastic people to review them and share their reviews. You will almost never find a review of the NZ wine in front of you, unless it’s high-end stuff you can’t afford, or cheap swill sold in export job lots likely to have been run across by Europeans or North Americans.
Some sites and apps address this lack of content by obtaining information from wherever they can find it. While some might see ethical problems with that, the site with the largest review database will unfortunately tend to have the advantage. Perhaps quality and consistency of reviews could be a selling point instead?
- User reviews are uneven. They’re uneven not only in the amount and quality of data entered about the wine, but also in the ratings themselves. Some users don’t write any notes, just a score. Others don’t even get that far. Why freaking bother, I ask myself.
I’m beginning to think that consistent, professional reviews might be the answer. Here in NZ, people like Michael Cooper and Keith Stewart review thousands of wines a year. An accessible database of those would be fantastic, and a really useful adjunct, at the very least, to user-generated reviews.
- Rating systems vary. Different sites have different rating systems, often with no guide as to how to use them to create reviews that are broadly compatible with other people’s. Some sites use the 100 point system, but don’t explain how to use it.
I certainly have no idea how to score a wine on a 100-point basis – so how can I be expected to enter such a score? And given that my fellow users are likely to be as amateur as I am, how can I trust their scores? How does a 7-point scale on one site compare to a 5-glass scale on another?
Anyway, enough ranting. I’m hopeful I’ve found an answer that works for me (I need a couple more weekends to find out) and I’ll write it all up later.
On the other hand, maybe someone can hack up a mobile-friendly front-end to Bob’s Wine Reviews. Then I can (almost) call it a day.