Scoring wine

By Jamie Goode | 16th December 2021

‘How was your stay? Help us to do better. Please fill in this satisfaction survey!’ I hate getting these pieces of paper, and I never fill them in. I suspect that one of the problems with the likes of TripAdvisor is sampling bias. I don’t want to upset any readers who like this sort of survey, but I wonder whether the subset of people who actually fill in these sorts of forms, or who report back on their restaurant experience, is the sort of people you’d want rating your performance. When I had a proper job as a science editor, they brought in performance appraisals. My boss hated them (he recognized that they rewarded the box-tickers who met the measurable sub-goals, without necessarily excelling at their main job) and so he gave me 100% on all counts. I loved him for it.

But we are used to rating things. The already-mentioned TripAdvisor covers hospitality of all kinds, and then there’s Rotten Tomatoes for films. And for wine, we have the wine critics. Back in the 1980s, Robert Parker, a Baltimore-based lawyer with a love for wine, changed wine forever by bringing in a 100-point scale and popularizing it. A self-proclaimed consumer champion, he used the rather inflated US school grading system as a model for scoring wines. Here in the UK, I remember a score of 70% was enough to get you a first-class degree, and not many of these were given out. In Parker’s world, 70 is pretty much a fail. For him, wines got interesting at 85 points, and 90 was a really good score. 95 was pretty rare: an exceptional wine.

The scoring system took off and became the norm for wine critics who followed in his wake. Now there are lots of wine critics, and pretty much all of them score out of 100. But in a bid to be noticed, the critics have awarded ever-higher scores. Now 90 isn’t an exceptional score at all, and 95 seems to be the new 90. Such is score inflation.

A score sounds like a data point, but it isn’t. It’s an attempt to capture in a number something rather intangible. However much critics like to make it seem that a score is a property of a wine, and that they as professionals have some great insight, there’s no real science behind scoring. It’s just a way of saying, in a number, how much you liked the wine and where you think it stands in relation to its peers. Critics try hard not to let their own personal tastes, or even their own biology (we all have slightly different sensitivities to tastes and smells), get in the way, but it’s very hard not to have some stylistic bias that will affect scoring. Indeed, if you are going to make use of critic ratings, it’s wise to pick a critic whose palate and preferences seem to track yours most closely.

While professional critics fulfil a valuable role, they leave quite a lot of wines unrated. Look at the South African section in your local supermarket. I suspect you’d be hard-pressed to find critic ratings for most of the wines there. The wine world is just so diverse that even the hardest-working specialist critic can’t rate all the wines, and so they tend to focus on the top end of the market. Most people, however, buy less expensive wines in supermarkets and chain restaurants, where the critics haven’t got round to scoring the bottles.

This is the gap in the market spotted by Vivino and other apps, which rely not on critic ratings but on user ratings.

Vivino was founded just over a decade ago by Heini Zachariassen, and with its latest round of fundraising it’s valued at between $600 m and $800 m. He saw that the wine world was complicated, and that there was a need for an app to make it easier to navigate. But this would only work if the data was filled in by users. The gap in the market was created by the emergence of smartphones, the fact that people were getting more used to crowd-sourced reviews (as opposed to the limited number of expert reviews), and the fact that no one else was addressing this issue. When the Vivino app was initially rolled out, there were many competitors, but Vivino’s approach was to release fast, learn fast, and keep focused.

Vivino took 12 months to get to 1,000 app downloads, and 30 months to get to 1 million. It took 5 years to get to 100 million scans, but going from 1 billion scans to 1.1 billion, the same numerical growth, took just 2 months. The app has 53 million users and 1.7 billion pictures of wines, with 14.2 million different wines on it, and it’s free to use. How does it make money? Through wine sales. In 2020 these were $265 million, up 103% from 2019 ($131 m).
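To put that acceleration in perspective, here is a quick back-of-the-envelope sketch in Python, using only the figures quoted above (the variable names are mine, and the monthly rates are simple averages rather than anything Vivino has published):

    # Vivino's scan growth, using the figures quoted above:
    # 5 years for the first 100 million scans, versus 2 months
    # to go from 1 billion to 1.1 billion (also 100 million).
    scans = 100_000_000

    early_months = 5 * 12   # first 100M scans took 5 years
    recent_months = 2       # latest 100M scans took 2 months

    early_rate = scans / early_months     # roughly 1.7M scans a month
    recent_rate = scans / recent_months   # 50M scans a month

    print(f"Early average: {early_rate / 1e6:.1f}M scans/month")
    print(f"Recent average: {recent_rate / 1e6:.0f}M scans/month")
    print(f"Speed-up: about {recent_rate / early_rate:.0f}x")

On those averages, the recent scan rate is about 30 times the early one.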

Do Vivino’s wisdom-of-the-crowd ratings mean critics are no longer needed? No: if you look at critic ratings for ‘fine’ wines – in the case of South Africa, the sorts of wines we like to talk about and which geeks are interested in – then the critics are largely doing a solid job, and are reliable guides. If you looked at Vivino’s top 20 user-rated South African wines, I don’t think it would do such a good job at all. In fact, I just did this search, and it came up with a slightly old-fashioned list of mainly big, sturdy, heavy-bottled ego wines, missing out on the sorts of bottles that make South African wine such an exciting scene right now. If a critic came up with such a list, I’d tell you to avoid them. But where Vivino comes in useful is in helping users navigate a supermarket shelf, or a wine list in an unambitious restaurant.

Ultimately, scores and ratings can be very helpful, in the right context and from the right source. But they are guides only. The best way to find new wines is through personal recommendations, from someone who knows their stuff but who also knows where you are on your own journey with wine, and a little about your own preferences.