Why I Put Very Little Stock in the Airline Quality Rating

There’s been a lot of talk about the Airline Quality Rating (AQR) lately since the 2012 rankings just came out. I’ve generally avoided talking about this and every other survey out there, except for one missive back in 2007. It seems like this might be a good time to revisit and explain why I don’t like these things. Today, I’ll just focus on the AQR.

You’ll hear the AQR referred to as a study or a survey, but that’s being pretty generous. What the guys behind this do is take all the information the government releases in the monthly Air Travel Consumer Report (on time performance, mishandled bags, complaints, and denied boardings) and then smash it up into a rating. How do they smash it up? Well, they assign a weight to each metric and then combine them all.

Where did those weights come from? They were “established by surveying 65 airline industry experts regarding their opinion as to what consumers would rate as important.” A while later, they surveyed 766 frequent fliers but didn’t find much of a difference, so they just kept the weights the same.
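Mechanically, a composite like this is nothing more than a weighted sum of the reported metrics. Here’s a minimal sketch of that kind of calculation; the metric values and weights below are invented for illustration and are not the actual AQR figures.

```python
# Minimal sketch of a weighted composite rating in the AQR style.
# All metric values and weights here are invented for illustration;
# they are NOT the actual AQR weights.

def composite_score(metrics, weights):
    """Weighted average: positive weights reward a metric, negative punish it."""
    total = sum(weights[name] * value for name, value in metrics.items())
    return total / sum(abs(w) for w in weights.values())

# Hypothetical monthly numbers for one airline.
metrics = {
    "on_time_pct": 82.0,        # % of flights arriving on time
    "mishandled_bags": 3.1,     # per 1,000 passengers
    "complaints": 1.2,          # per 100,000 passengers
    "denied_boardings": 0.9,    # per 10,000 passengers
}

# Hypothetical weights; negative means "more of this is worse".
weights = {
    "on_time_pct": 8.63,
    "mishandled_bags": -7.92,
    "complaints": -7.17,
    "denied_boardings": -8.03,
}

print(round(composite_score(metrics, weights), 2))
```

The point is that the score is only as meaningful as the weights: change them and the airlines reorder without any airline changing how it operates.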

So what we have is really just a rehashing of what the government tells us each month. Is that useful? Not particularly. Let’s go over a few of the problems. Keep in mind, most of these criticisms are about the underlying data as collected by the government. But since that’s what the AQR uses, no distinction is needed.

  1. Operating Airline – The biggest problem of all is that this is based on operating airline and not marketing airline. What I mean by that is this. When you buy a ticket on United, how often are you flying on United Express? There are more United Express flights than mainline United flights, so the chances are very good that you’ll end up on an Express partner, and not just on United. But the “United” we see here is just for mainline.
     
    Can you add up the airlines by learning which ones operate for which big airline? No. Think about SkyWest, an airline which flies for Alaska, Delta, United, and US Airways. There’s no way to break apart what’s related to which marketing airline.
  2. The 1% – We should feel lucky that we have the opaque SkyWest data because many of the regionals don’t report at all. Airlines with more than 1 percent of domestic scheduled passenger revenue have to report, and that leaves out many of the little guys. There’s no Comair, no Pinnacle, no Trans States, etc. So it’s a very incomplete picture. Don’t go looking for Spirit or Allegiant either. (Virgin America just got big enough to start reporting.)
  3. Bags per People – People often like to look at trends with this report, but that’s hard to do when it doesn’t adjust for business changes. The most glaring is in the baggage department. Mishandled bags are measured by looking at the number of bags mishandled per 1,000 passengers. In other words, it doesn’t matter how many bags are actually checked but rather only depends upon how many people fly. This is hugely problematic as airlines have started to charge for bags over the last few years. People check fewer bags, so you’d expect there to be fewer mishandled bags. But this report wouldn’t catch that because it only looks at the number of lost bags per total passengers.
  4. The Few, The Loud – Complaints are another issue here. When people complain, most of them complain to the airlines directly in the hope that they will get something in return. The number of people who get so annoyed with the airlines that they bother complaining to the DOT is minuscule. There were 7,465 complaints recorded by the DOT against the US airlines last year. That may sound like a lot until you realize there were more than 630 million passengers enplaned. That means about 0.001% of passengers complained. With numbers so small, it’s very hard to get any sort of true trend when comparing year over year.
     
    Delta, for example, had the highest number of complaints of the airlines last year with 1,406. One angry 777 full of passengers could band together and complain to the DOT and that would show as a 20 percent increase if everything else held the same. If you’d like to read more about this, take a look at my post ripping into the Lufthansa catering union for really bad analysis.
  5. Too Broad – One other issue here (and with most ranking efforts) is the broadness of the results. Hawaiian shows up as #2 in the survey, and a lot of that is going to be thanks to its stellar on time performance. But as I’ve written before, that on time performance rides on the coattails of the interisland operation while the long haul flying lags significantly.
These are just a few reasons why I don’t like the AQR, but I don’t want to make you think that there’s another survey I like better. There isn’t. I hate all these types of forced rankings because they’re rarely indicative of what you’re going to get when you fly.
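For what it’s worth, the complaint-rate arithmetic in point 4 is easy to verify. The figures come from the post; the ~300-seat 777 capacity is my assumption for illustration.

```python
# Checking the complaint-rate figures quoted in the post.
complaints = 7_465            # DOT complaints against US airlines in a year
passengers = 630_000_000      # passengers enplaned that year

rate_pct = complaints / passengers * 100
print(f"{rate_pct:.4f}% of passengers complained to the DOT")

# Sensitivity check: Delta logged 1,406 complaints. One planeload of
# ~300 angry passengers (an assumed 777 capacity) filing together would
# move the year-over-year number by roughly a fifth.
delta_complaints = 1_406
swing_pct = 300 / delta_complaints * 100
print(f"one full 777 complaining = {swing_pct:.0f}% increase")
```

With a base this small, a single coordinated planeload really can swing an airline’s annual “trend.”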



22 Comments on "Why I Put Very Little Stock in the Airline Quality Rating"

David SF eastbay
Member
You can never go by a survey as the info will never be accurate. We have like 6 billion people on the planet, but a survey of 1,000 people is supposed to represent how 6 billion feel…. no! If company ABC123 hires a firm to conduct a survey, well golly gee, the result is going to favor that company. One company can use one scale to get results, while another company can use a different scale and get a different set of results to the same questions. Surveys are worthless and the results can give people the wrong idea… Read more »
Nick Barnard
Member

I’ll admit to not understanding the exact details, but there is proof that if you do an appropriately controlled survey of several thousand people, you can accurately estimate how a population of several million will respond.

It’d be possible to craft an airline survey that would be statistically accurate; there are just holes galore in the AQR’s underlying data.

Jason H
Guest

In addition to what Nicholas said, which is true, you have to remember that the statistics are set across a large population set. So it doesn’t necessarily hold true down to the individual level, but can give a general view for large subsets of the population.

That all being said, the history of aviation surveys is extremely questionable. The statistical results will only be as good as the underlying data, which so often just isn’t all that great.

DesertGhost
Guest

To quote Mark Twain, “There are three kinds of lies: lies, damned lies, and statistics.”

Nick Barnard
Member

There’s also “Figures never lie, but liars figure.”

Shane
Guest

So if you read the report on baggage handling, any increase in mishandled bags would be a very bad sign in terms of operational efficiency. As fees go up and fewer bags are checked, it should be easier to get the bags to the correct places. Except for American mainline flying, baggage handling seemed to get worse. Even Southwest took a hit, although they may have had more bags checked per 1,000 passengers, since people checking multiple bags may be more inclined to book with them.
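Shane’s point about the denominator can be made concrete with invented numbers: the per-passenger metric can “improve” even while handling gets worse per bag actually checked.

```python
# Illustration of the denominator problem in "mishandled bags per
# 1,000 passengers". All numbers are invented for illustration.
passengers = 1_000_000

# Before bag fees: most people check a bag.
bags_before = 900_000
mishandled_before = 4_500           # 0.5% of checked bags go astray

# After bag fees: far fewer bags checked. Handling got WORSE per bag,
# yet the per-passenger metric looks better.
bags_after = 500_000
mishandled_after = 3_000            # 0.6% of checked bags go astray

per_1000_pax_before = mishandled_before / passengers * 1000   # 4.5
per_1000_pax_after = mishandled_after / passengers * 1000     # 3.0

per_bag_before = mishandled_before / bags_before              # 0.005
per_bag_after = mishandled_after / bags_after                 # 0.006

print(per_1000_pax_before, per_1000_pax_after)   # reported metric "improves"
print(per_bag_before, per_bag_after)             # per-bag handling got worse
```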

Sanjeev M
Guest

OMG so true. Don’t even get me started on Skytrax….

Southeasterner
Guest

The biggest issue is what do all these surveys and rankings actually mean?

I buy tickets based on the lowest price and fewest connections. I can’t recall a single time in my life that I have based my ticket purchase decision on one of these “quality rankings”. I guess it may be a meaningful “performance metric” for airline execs who want to apply what they learned during their MBA studies, but I would bet it has zilch impact on the bottom line.

A
Guest

Agree completely. Who outside of industry insiders patting themselves on the back gets anything out of this? Everyone I know either buys on price or is a slave to their favorite rewards program. Yawn, this is a non-story for 99.9999% of us.

jaybru
Member
Such a waste. Get DOT out of this quality-rating business, or at least make the airlines pay for DOT to collect and publish this garbage. When the industry today is such a hodge-podge of “operating,” “operated-by,” and “code-share” entities, the data is useless. Surveys can be useful, and any one of us here (with Cranky numero uno) would make a wonderful respondent for any airline, should it really care how we think about things. Recognizing that none of us represent all travelers, we probably have a better understanding of more travelers than most others have. SURVEY: First,… Read more »
Nick Barnard
Member

So I was under the impression that most of these statistics were collected by the airlines and reported to the DOT. (A few of them, like complaints to the DOT, are DOT-generated.) CF, do you know who does the primary collection of most of this data?


Dale
Guest
In addition to all the problems already mentioned, most of which I agree with, the methodology is bad. Years back the AQR factored in meaningless stats like fleet size in trying to judge “quality”. In more recent years they watered it down to using the DOT stats, but I believe they developed their methodology by rigging it until they came out with the “right” results… LCCs are good, majors are not so good, and regional airlines are awful. Even if you believe that the DOT stats themselves are beautifully representative of the air travel experience… which, as others have pointed out, they are… Read more »
Mark Mogel
Guest

At the risk of getting deleted by Cranky for self-promotion …

Rigged? You haven’t seen “rigged” airline report cards until you read my book about Kate Hanni and the stranger-than-fiction story behind the airline passengers’ bill of rights — Diverted: High Flyers and Frequent Liars available on Amazon.com.

The prologue and first chapter can be found at http://markmogel.blogspot.com

… perhaps he will make an exception in this case.

Austria Adventures
Guest

I never really cared about AQR, and if I really needed to know something about an airline I’d just ask people who had used its services in the past. Besides, different strokes for different folks, right? Not everyone is looking for the same things in an airline.

Cedarglen
Guest

Excellent! Most ‘surveys’ and summaries are much the same. Before looking at the results, always look at the method. If the method is not described, run like hell.

Cath
Guest

I never really take much notice of reviews or ratings, purely because people’s tastes and expectations vary so much. I just prefer to make my own mind up and chalk bad experiences up as just that.

Cindy B Sarge
Guest

For the real story of Mark Mogel’s participation in the airline passengers’ rights movement, go to http://www.MarkMogel.com. Illuminating. Whether you believe in Kate Hanni’s efforts for airline passengers or not, the claims made by a few who participated for a short period of time should be debunked.

Mark Mogel
Guest

The MarkMogel.com website referred to above was purchased and operated by Kate Hanni — a website she has since been forced to take down. My official website is http://markmogel.blogspot.com.

Paul Brody
Guest

As inaccurate as these surveys are, when you’re an angry passenger, complaining to the DOT is a much more meaningful way to punish an airline for bad behavior than calling the airline.
