Tuesday, October 21, 2014

The Problem With Conde Nast's 2014 Top 100 Hotels

A large part of my job involves building and maintaining complicated financial models. Anyone who does a lot of work in Excel knows how easy it is to fat-finger a number or link to the wrong cell. With all the interdependencies within a model, a single mistake can wreak havoc and produce some strange results.

That's why, as part of the review process, we always perform the "smell test" before sending anything off to the client. What's the smell test? Basically, you ask yourself whether what you're looking at feels right; if your gut is telling you something is off, more often than not it is.

So what does this have to do with hotels? Let me explain.

2014 Conde Nast list of top hotels smells funny

Conde Nast just released the results of its 2014 Readers' Choice Awards, ranking the top 100 hotels and resorts in the world. It's an annual list compiled from over a million votes cast by nearly 77,000 readers.

It's certainly a credible data set purely based on size, but in my opinion a heavily biased one. How and why did I come to that conclusion?

Because the rankings simply don't pass the smell test. Here are the US properties on the list:

92. Dunton Hot Springs, Dolores, Colorado
90. River Inn of Harbor Town, Memphis, Tennessee
88. Ocean House, Watch Hill, Rhode Island
86. Montage Laguna Beach, California
85. Chateau du Sureau, Oakhurst, California
84. Colonial Houses - Historic Lodgings, Williamsburg, Virginia
80. Thompson Chicago, Illinois
75. Winvian, Litchfield, Connecticut
74. Hotel Sorella CITYCENTRE, Houston, Texas
70. Stein Eriksen Lodge, Park City, Utah
66. The Wauwinet, Nantucket, Massachusetts
65. The Sanctuary Hotel at Kiawah Island, South Carolina
64. The Cloister, Sea Island, Georgia
62. Weekapaug Inn, Rhode Island
61. Twin Farms, Barnard, Vermont
60. Canoe Bay Hotel, Chetek, Wisconsin
59. The Pitcher Inn, Warren, Vermont
56. Wequassett Resort and Golf Club, Cape Cod, Massachusetts
53. Madden's on Gull Lake, Brainerd, Minnesota
51. Wailea Beach Villas, Maui, Hawaii
49. Waldorf Astoria, Chicago, Illinois
45. Old Edwards Inn and Spa, Highlands, North Carolina
35. The Point, Saranac Lake, New York
33. Primland Resort, Meadows of Dan, Virginia
31. C Lazy U Ranch, Granby, Colorado
30. The Langham, Chicago, Illinois
29. 21C Museum Hotel, Cincinnati, Ohio
27. Fairmont Heritage Place, Franz Klammer Lodge, Telluride, Colorado
25. Rancho Valencia, Rancho Santa Fe, California
22. XV Beacon, Boston, Massachusetts
21. Lake Austin Spa Resort, Texas
20. Mii Amo, Sedona, Arizona
16. The Lodge, Sea Island, Georgia
14. Casa Palmero, Pebble Beach, California
13. Cal-a-Vie, Vista, California
12. Gateway Canyons Resort, Colorado
5. Triple Creek Ranch, Darby, Montana
2. Lodge and Spa at Brush Creek Ranch, Saratoga, Wyoming

By my count, according to this list, 38 of the top 100 hotels in the world are found in the US, including 10 of the top 25.

As much as I love the US, what would my reaction be if you told me that roughly 40% of the best hotels in the world are located in the US?

"B.S., I just don't believe it."

Law of large numbers

Here's the Wikipedia definition of the law of large numbers:

In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times. According to the law, the average of the results obtained from a large number of trials should be close to the expected value, and will tend to become closer as more trials are performed.
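Put in hotel terms: the more readers who score a property, the closer its average should sit to its "true" quality. Here's a quick illustration in Python with my own toy numbers (nothing from Conde Nast's data): scores are drawn uniformly between 80 and 100, so the expected value is 90, and the sample average drifts toward 90 as the number of scores grows.

```python
# A toy illustration of the law of large numbers (made-up scores, not Conde
# Nast data): draw reader scores uniformly between 80 and 100 and watch the
# sample average approach the true expected value of 90 as the sample grows.
import random

random.seed(1)

true_mean = 90  # expected value of a score drawn uniformly from [80, 100]

for n in (10, 100, 1_000, 10_000):
    scores = [random.uniform(80, 100) for _ in range(n)]
    avg = sum(scores) / n
    print(f"{n:>6} scores -> average {avg:6.2f} (off by {abs(avg - true_mean):.2f})")
```

With 10 scores the average can easily land a point or two away from 90; with 10,000 it's pinned to it. Keep that in mind for what follows.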

How does this apply to the rankings? Let's take a look at the US hotels that made the top 25, along with the number of reviews available on TripAdvisor for each property, shown in parentheses. In this case, I'm using a property's TripAdvisor review count as a proxy for the number of guests passing through its doors. That may or may not be appropriate, but let's assume that it is for now.

Other than #22. XV Beacon, each US property has been reviewed fewer than 200 times, with #13. Cal-a-Vie having as few as 23 reviews.

25. Rancho Valencia, Rancho Santa Fe, California (122)
22. XV Beacon, Boston, Massachusetts (729)
21. Lake Austin Spa Resort, Texas (198)
20. Mii Amo, Sedona, Arizona (120)
16. The Lodge, Sea Island, Georgia (129)
14. Casa Palmero, Pebble Beach, California (39)
13. Cal-a-Vie, Vista, California (23)
12. Gateway Canyons Resort, Colorado (164)
5. Triple Creek Ranch, Darby, Montana (152)
2. Lodge and Spa at Brush Creek Ranch, Saratoga, Wyoming (81)

Let's contrast that with the review counts for a handful of the international properties in the top 100. As you can see, these international properties have review counts that are several multiples higher than those of their US counterparts.

95. Mandarin Oriental Tokyo, Japan (847)
83. Four Seasons Resort Bora Bora, French Polynesia (1,373)
38. St. Regis Punta Mita Resort, Mexico (1,139)
28. Burj Al Arab, Dubai, UAE (1,992)
17. Qualia, Hamilton Island, Great Barrier Reef, Australia (719)

Given that the #1 property on the list, Londolozi in South Africa, won with a near-perfect score of 98.958, the margins between the scores of the hotels on the list are tiny. That means every additional review that doesn't award a perfect score drags down a property's average. So, given the grade inflation going on here and the near-perfect score needed to win, it's actually beneficial to have fewer readers voting on your property.
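I obviously don't have access to the raw ballots, so here's a minimal Python sketch with made-up numbers: assume every reader awards 100 nine times out of ten and 80 otherwise, so every hotel's "true" expected score is an identical 98. The only difference between hotels is how many readers scored them.

```python
# A hedged sketch (invented numbers, not Conde Nast's methodology) of why a
# lightly-reviewed property is far more likely to post a chart-topping average.
import random

random.seed(42)

def simulate_average(num_reviews):
    """Average score for one hotel: readers give 100 nine times out of ten, 80 otherwise."""
    scores = [100 if random.random() < 0.9 else 80 for _ in range(num_reviews)]
    return sum(scores) / len(scores)

def prob_chart_topping(num_reviews, cutoff=99.0, trials=10_000):
    """Chance that a hotel with num_reviews respondents averages at least `cutoff`."""
    hits = sum(simulate_average(num_reviews) >= cutoff for _ in range(trials))
    return hits / trials

for n in (25, 100, 1_000):
    print(f"{n:>5} respondents -> P(average >= 99) ~ {prob_chart_topping(n):.1%}")
```

With these toy numbers every hotel is equally good, yet the 25-respondent hotel posts a 99+ average roughly a quarter of the time, the 100-respondent hotel only a few percent of the time, and the 1,000-respondent hotel essentially never. Small sample sizes and a winner-takes-all scoring scale are a bad combination.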

Problems with the methodology

This is the most US-centric list of top hotels in the world that I've ever seen. Conde Nast explains the methodology behind the results, and here are three reasons why I believe the methodology heavily influenced them:

1. Without knowing much about the Conde Nast readership, I'm assuming it is predominantly US-based. Therefore, it's very likely that the number of US properties being voted on outnumbered that of any other country, causing US properties to be over-represented. In some ways this is unavoidable given the demographics of the readership, but combining it with #2 causes issues.

2. While more readers may have visited US properties than those of any other country, each individual US property likely wasn't visited as frequently as the top international hotels. The methodology states that "Candidates must receive a required minimum number of responses to be eligible for a Readers' Choice Award." If that minimum was set too low, there is a strong possibility that scores were skewed by a handful of outliers. It would be akin to someone winning the MLB batting title by hitting 1.000 over just five at-bats for the season. (A sketch of one possible correction follows this list.)

3. For the first time, Conde Nast included "value" as a category for 2014. To me, this makes the high prevalence of US-based properties an even bigger red flag. Based on my own experiences, the value you can get in Southeast Asia, for example, is unmatched anywhere in the world and should have propelled more of those properties into the rankings.
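To be clear, the following is my own sketch with toy numbers, not anything Conde Nast describes. One common fix for point #2 is to require a meaningful minimum number of responses and shrink each property's raw average toward the overall mean in proportion to how few responses it received, in the spirit of the weighted ratings popularized by sites like IMDb. The min_responses value of 200 below is an arbitrary assumption on my part.

```python
def weighted_score(avg, num_responses, overall_avg, min_responses=200):
    """Blend a property's raw average with the overall average.

    Properties with many responses keep a score close to their raw average;
    properties with only a handful get pulled toward the crowd mean.
    """
    v, m = num_responses, min_responses
    return (v / (v + m)) * avg + (m / (v + m)) * overall_avg

# Toy numbers only: a near-perfect average built on 23 responses no longer
# outranks a slightly lower average built on roughly 2,000 responses.
overall = 90.0
print(round(weighted_score(99.0, 23, overall), 1))    # -> 90.9
print(round(weighted_score(96.5, 1992, overall), 1))  # -> 95.9
```

Something along these lines would take care of the five-at-bat batting champion problem without throwing away the small properties' votes entirely.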

How do you feel about the latest Conde Nast hotel rankings - do they pass your smell test?

2 comments:

  1. I love seeing quantitative analyses on these things. Conde Nast needs to hire a math-minded consultant! ;)

    What you're saying is like how, on Amazon, when you sort items by rating in certain product categories, the top few results often have only 1-2 reviews, while the actually popular products with 1,000+ reviews end up at the bottom of the page.

    1. Maybe we can moonlight as internet math consultants =)

      I think most sites have acknowledged this as an issue and have begun filtering out products that don't meet a minimum threshold of reviews. I also know that sites like Yelp will filter out reviews from users who haven't written a certain number of reviews themselves.
