There’s been a lot of talk about the Airline Quality Rating (AQR) lately since the 2012 rankings just came out. I’ve generally avoided talking about this and every other survey out there, except for one missive back in 2007. It seems like this might be a good time to revisit and explain why I don’t like these things. Today, I’ll just focus on the AQR.
You’ll hear the AQR referred to as a study or a survey, but that’s being pretty generous. What the guys behind it do is take all the information the government releases in the monthly Air Travel Consumer Report (on time performance, mishandled bags, complaints, and denied boardings) and smash it up into a single rating. How do they smash it up? Well, they assign a weight to each metric and then combine them all.
Where did those weights come from? They were “established by surveying 65 airline industry experts regarding their opinion as to what consumers would rate as important.” A while later, they surveyed 766 frequent fliers but didn’t find much of a difference, so they just kept the weights the same.
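As a sketch of the mechanics (the four metrics are the real DOT categories, but the weight values and sample numbers below are placeholders, not the AQR’s actual figures), the “smashing up” is just a weighted sum:

```python
# Sketch of an AQR-style weighted score. The four metrics are the ones the
# DOT publishes; the weights here are made-up placeholders, not the AQR's.
WEIGHTS = {
    "on_time_pct": +1.0,       # share of flights on time; higher is better
    "denied_boardings": -1.0,  # per 10,000 passengers; lower is better
    "mishandled_bags": -1.0,   # per 1,000 passengers; lower is better
    "complaints": -1.0,        # per 100,000 passengers; lower is better
}

def aqr_score(metrics: dict) -> float:
    """Weighted combination of the four DOT metrics, normalized by total weight."""
    total = sum(abs(w) for w in WEIGHTS.values())
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS) / total

# Hypothetical airline: 80% on time, 1 denied boarding per 10,000,
# 3 mishandled bags per 1,000, 1 complaint per 100,000.
example = {"on_time_pct": 0.80, "denied_boardings": 1.0,
           "mishandled_bags": 3.0, "complaints": 1.0}
print(round(aqr_score(example), 3))  # -1.05
```

The absolute score means nothing; only the relative ranking does. And because the four metrics live on very different scales, the choice of weights quietly decides which metric dominates the ranking.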
So what we have is really just a rehashing of what the government tells us each month. Is that useful? Not particularly. Let’s go over a few of the problems. Keep in mind, most of these complaints are about the underlying data as collected by the government. But since that’s what the AQR uses, no distinction is needed.
- Operating Airline – The biggest problem of all is that this is based on operating airline and not marketing airline. What I mean by that is this. When you buy a ticket on United, how often are you flying on United Express? There are more United Express flights than mainline United flights, so the chances are very good that you’ll end up on an Express partner rather than on mainline United. But the “United” we see in this report covers mainline only.
Can you add up the airlines by learning which ones operate for which big airline? No. Think about SkyWest, an airline that flies for Alaska, Delta, United, and US Airways. There’s no way to break apart which of its flights belong to which marketing airline.
- The 1% – We should feel lucky that we have the opaque SkyWest data at all, because many of the regionals don’t report. Airlines with more than 1 percent of domestic scheduled passenger revenue have to report, and that leaves out many of the little guys. There’s no Comair, no Pinnacle, no Trans States, etc. So it’s a very incomplete picture. Don’t go looking for Spirit or Allegiant either. (Virgin America just got big enough to start reporting.)
- Bags per People – People often like to look at trends with this report, but that’s hard to do when it doesn’t adjust for business changes. The most glaring example is in the baggage department. Mishandled bags are measured as the number of bags mishandled per 1,000 passengers. In other words, it doesn’t matter how many bags are actually checked; the rate depends only on how many people fly. This is hugely problematic now that airlines have started charging for bags over the last few years. People check fewer bags, so you’d expect fewer mishandled bags. But this report can’t catch that, because it only looks at the number of mishandled bags per total passengers.
- The Few, The Loud – Complaints are another issue here. When people complain, most complain to the airlines directly in the hope of getting something in return. The number of people who get so annoyed with the airlines that they bother complaining to the DOT is minuscule. There were 7,465 complaints recorded by the DOT against US airlines last year. That may sound like a lot until you realize there were more than 630 million passengers enplaned. That means roughly 0.001% of passengers complained. With numbers so small, it’s very hard to find any true trend when comparing year over year.
Delta, for example, had the highest number of complaints of any airline last year with 1,406. One angry 777 full of passengers could band together, complain to the DOT, and that would show up as a 20 percent increase if everything else held the same. If you’d like to read more about this, take a look at my post ripping into the Lufthansa catering union for really bad analysis.
- Too Broad – One other issue here (and with most ranking efforts) is the broadness of the results. Hawaiian shows up as #2 in the survey, and a lot of that is thanks to its stellar on time performance. But as I’ve written before, that on time performance rides on the coattails of the interisland operation while the long haul flying lags significantly.
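The complaint arithmetic a couple of paragraphs up is easy to check (the 777 seat count below is my assumption of a roughly 280-seat configuration):

```python
# Checking the complaint-rate arithmetic from the post.
dot_complaints = 7465        # DOT complaints against US airlines last year
passengers = 630_000_000     # enplaned passengers
rate = dot_complaints / passengers
print(f"{rate:.4%}")         # 0.0012% of passengers complained

# One angry widebody's worth of complaints vs. Delta's annual total
# (280 seats is an assumed 777 configuration, not a published figure):
delta_complaints = 1406
planeload = 280
print(f"{planeload / delta_complaints:.0%}")  # 20%
```

With a base rate that tiny, a single coordinated planeload really can swing an airline’s year-over-year complaint “trend” by a fifth.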
These are just a few reasons why I don’t like the AQR, but I don’t want to make you think that there’s another survey I like better. There isn’t. I hate all these types of forced rankings because they’re rarely indicative of what you’re going to get when you fly.
22 comments on “Why I Put Very Little Stock in the Airline Quality Rating”
You can never go by a survey as the info will never be accurate. We have like 6 billion people on the planet, but a survey of 1,000 people is supposed to represent how 6 billion feel… no!
If company ABC123 hires a firm to conduct a survey, well golly gee the result is going to favor that company. One company can use one scale to get results, while another company can use a different scale and get a different set of results to the same questions.
Surveys are worthless and the results can give people the wrong idea about something. And since most people more than likely lie when taking a survey to benefit themselves if they think it matters, a survey can never be taken as valuable.
I’ll admit to not understanding the exact details, but there is proof that if you do an appropriately controlled survey, you can survey several thousand people and accurately predict results for a population of several million.
It’d be possible to craft an airline survey that would be statistically accurate; there are just holes galore in the AQR’s underlying data.
In addition to what Nicholas said, which is true, you have to remember that the statistics describe a large population. So they don’t necessarily hold true down to the individual level, but they can give a general view for large subsets of the population.
That all being said, the history of aviation surveys is extremely questionable. The statistical results will only be as good as the underlying data, which so often just isn’t all that great.
To quote Mark Twain, “There are three kinds of lies: lies, damned lies, and statistics.”
There’s also “Figures never lie, but liars figure.”
So if you read the report on baggage handling, any increase in mishandled bags would be a very bad sign in terms of operational efficiency. As fees go up and fewer bags are checked, it should be easier to get the bags to the correct places. Except for American mainline flying, baggage handling seemed to get worse. Even Southwest took a hit, although they may have had more bags checked per 1,000 passengers since people checking multiple bags may be more inclined to book with them.
Shane – It’s hard to say how big of an issue it is. Between 2010 and 2011, I’m not sure how many bag fee policy changes there were, so the number of bags may have been relatively flat. But I could be wrong on that – I don’t recall the last time there was a change. Also, bag handling is directly tied to on time performance, so it’s good to see if that had an impact year over year.
OMG so true. Don’t even get me started on Skytrax….
The biggest issue is what do all these surveys and rankings actually mean?
I buy tickets based on the lowest price and fewest connections; I can’t recall a single time in my life that I have based a ticket purchase on one of these “quality rankings.” I guess it may be a meaningful “performance metric” for airline execs who want to apply what they learned during their MBA studies, but I would bet it has zilch impact on the bottom line.
Agree completely. Who outside of industry insiders patting themselves on the back gets anything out of this? Everyone I know either buys on price or is a slave to their favorite rewards program. Yawn, this is a non-story for 99.9999% of us.
Such a waste. Get the DOT out of this quality-rating business, or at least make the airlines pay for the DOT to collect and publish this garbage. When the industry today is such a hodge-podge of “operating,” “operated-by,” and “code-share” entities, the data is useless.
Surveys can be useful, and any one of us here (with Cranky numero uno) would make a wonderful respondent for any airline, should it really care how we think about things. None of us represents all travelers, but we probably have a better understanding of more travelers than most others do.
SURVEY:
First, explain what kind of traveler you are, today, and what you may have been, in the past. (Oh, by the way, here is what we think we already know about you.)
Second, here are some questions about us and how we operate:
1. What are we doing you like? Why?
2. What are we doing you don’t like? Why?
3. What aren’t we doing you would like us to do?
4. How come you didn’t mention anything about this?
5. What would you think if we did this? Why?
Third, please explain your travel on other airlines.
1. What are they doing you like? Why?
2. Should we adopt what they are doing that you like? Why?
Just a stab at this. Others surely have ideas as good, if not better than mine.
So I was under the impression that most of these statistics were collected by the airlines and reported to the DOT. (A few of them, like complaints to the DOT, are DOT generated.) CF, do you know who does the primary collection of most of this data?
The data is reported by the airlines to the DOT, except of course the complaints, which come in directly.
In addition to all the problems already mentioned, most of which I agree with, the methodology is bad. Years back the AQR factored in meaningless stats like fleet size in trying to judge “quality.” In more recent years they watered it down to using the DOT stats, but I believe they developed their methodology by rigging it until they came out with the “right” results: LCCs are good, majors are not so good, and regional airlines are awful. Even if you believe that the DOT stats themselves are beautifully representative of the air travel experience (which, as others have pointed out, they are sometimes not), the way the study authors use them is rigged.
On-time is heavily underweighted. If you crunch the raw data you can come out to the very numbers they do – numbers which put AirTran on top. Part of AirTran’s success is ranking #3 in on time with 88.4% of flights on time. But let’s say AirTran instead posted the worst on-time score of anybody, 73.3%. They’d still rank #1. As a matter of fact, AirTran could have posted a dismal 48% on time for 2011 and the AQR would still have put them on top. That’s how little they weight on time performance.
On the other hand, bag mishandling is overweighted. AirTran ranked #1 in bags in 2011 with an enviable 1.63 mishandles per 1,000 passengers. Had they posted a slightly worse 2.07 per 1,000 they still would have been tops in that category, but overall they would have fallen to #2 in the AQR. Had they posted the industry-average mishandle rate of 3.39, they would have been tied for 7th place in the AQR in spite of high scores in the other categories. Or, to put it the other way, American Eagle could have been #1 in on time, #1 in fewest complaints, and #1 in fewest involuntary denied boardings, yet they still would have ranked 14th in the AQR based on poor baggage alone. That’s how heavily baggage mishandling is weighted.
I believe the authors felt they had to underweight on-time and complaints because regional airlines sometimes look too good in these two areas. Sometimes regional airlines (such as SkyWest in past years) have posted rather good on-time results. And complaints about a United flight from Appleton to Orlando probably get attributed to United Airlines for the mainline segment more often than to SkyWest for the feeder segment. On the flip side, regional airlines are certain to do poorly when it comes to bag mishandles because so much more of their traffic involves a connection, which is when bags get lost the most. LCCs have somewhat fewer connections and thus tend to have fewer mishandles than the majors, and certainly fewer than the regionals. (Southwest, by not charging for bags, may well be hurting its ranking because it may handle more bags per passenger than airlines that discourage checked bags.)
So by heavily penalizing airlines for bag mishandles compared to other factors, the survey nicely puts LCCs high, legacies in the middle, and regionals in the basement, just as everyone “knows” airline quality really is. It takes rigged methodology, but it gets them headlines far and wide every year.
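The weighting argument above can be sanity-checked with a rough sketch. The weights below are my best recollection of the published AQR values (treat the exact numbers as assumptions); the real culprit is the scale mismatch: on-time enters as a 0–1 decimal while mishandled bags run 1–5 per 1,000 passengers, so a realistic swing in the bag rate moves the score roughly ten times as much as the worst realistic on-time swing.

```python
# Why baggage swamps on-time in an AQR-style score.
# Weights approximate the published AQR values (treat as assumptions).
W_OT, W_DB, W_MB, W_CC = 8.63, -8.03, -7.92, -7.17
TOTAL = abs(W_OT) + abs(W_DB) + abs(W_MB) + abs(W_CC)

def score(on_time, denied, bags, complaints):
    # on_time is a 0-1 decimal; the other three are per-passenger rates
    return (W_OT * on_time + W_DB * denied + W_MB * bags + W_CC * complaints) / TOTAL

base     = score(0.884, 0.01, 1.63, 1.0)  # AirTran-like 2011 numbers
bad_ot   = score(0.733, 0.01, 1.63, 1.0)  # worst-in-industry on-time
bad_bags = score(0.884, 0.01, 3.39, 1.0)  # industry-average bag rate

print(round(base - bad_ot, 3))    # big on-time drop barely moves the score
print(round(base - bad_bags, 3))  # an average bag rate costs ten times as much
```

On these assumed inputs the baggage swing dents the score by about an order of magnitude more than the on-time swing, which is consistent with the "AirTran could post 48% on time and still win" observation.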
As further proof of how the survey really works, check out how JetBlue did. They ranked a weak 10th in complaints and dead last (16th) in on time. Yet by virtue of #1 in bumping and #2 in bags, they ranked 3rd best in the AQR survey, only a hair’s breadth out of the #2 spot held by Hawaiian.
I don’t say this to slam the top airlines – perhaps a more worthwhile quantification of quality would put FL, HA, B6, and F9 on top. But the AQR is garbage science.
At the risk of getting deleted by Cranky for self-promotion …
Rigged? You haven’t seen “rigged” airline report cards until you read my book about Kate Hanni and the stranger-than-fiction story behind the airline passengers’ bill of rights — Diverted: High Flyers and Frequent Liars available on Amazon.com.
The prologue and first chapter can be found at http://markmogel.blogspot.com
… perhaps he will make an exception in this case.
I never really cared about AQR, and if I really needed to know something about an airline I’d just ask people who had used its services in the past. Besides, different strokes for different folks, right? Not everyone is looking for the same things in an airline.
Excellent! Most ‘surveys’ and summaries are much the same. Before looking at the results, always look at the method. If the method is not described, run like hell.
I never really take much notice of reviews or ratings purely because people’s tastes and expectations vary so much. I just prefer to make my own mind up and chalk bad experiences as just that.
For the real story of Mark Mogel’s participation in the airline passengers rights movement, go to http://www.MarkMogel.com. Illuminating. Whether you believe in Kate Hanni’s efforts for airline passengers or not, the claims made by a few who participated for a short period of time should be debunked.
The MarkMogel.com website referred to above was purchased and operated by Kate Hanni — a website she has since been forced to take down. My official website is http://markmogel.blogspot.com.
As inaccurate as these surveys are, when you’re an angry passenger, complaining to the DOT is a much more meaningful way to punish an airline for bad behavior than calling the airline.