Why I Put Very Little Stock in the Airline Quality Rating

There’s been a lot of talk about the Airline Quality Rating (AQR) lately since the 2012 rankings just came out. I’ve generally avoided talking about this and every other survey out there, except for one missive back in 2007. It seems like this might be a good time to revisit and explain why I don’t like these things. Today, I’ll just focus on the AQR.

You’ll hear the AQR referred to as a study or a survey, but that’s being pretty generous. What the guys behind this do is take all the information the government releases in the monthly Air Travel Consumer Report (on-time performance, mishandled bags, complaints, and denied boardings) and smash it all up into a single rating. How do they smash it up? They assign a weight to each metric and then combine them.

Where did those weights come from? They were “established by surveying 65 airline industry experts regarding their opinion as to what consumers would rate as important.” A while later, they surveyed 766 frequent fliers but didn’t find much of a difference, so they kept the weights the same.
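
The mechanics of a weighted rating like this are simple enough to sketch. Here's a minimal Python version, where the weights, sign conventions, and monthly figures are all hypothetical stand-ins (the researchers publish the actual AQR weights; these values are illustrative only):

```python
# Sketch of how a weighted rating like the AQR can be computed.
# Weights, signs, and the sample figures are hypothetical, not the
# published AQR values.

def quality_score(on_time_pct, denied_per_10k, mishandled_per_1k,
                  complaints_per_100k, weights):
    """Combine the four DOT metrics into one score: on-time performance
    counts in an airline's favor; the other three count against it."""
    w_ot, w_db, w_mb, w_cc = weights
    numerator = (w_ot * on_time_pct
                 - w_db * denied_per_10k
                 - w_mb * mishandled_per_1k
                 - w_cc * complaints_per_100k)
    # Normalize by the total weight so the scale stays comparable
    # across airlines.
    return numerator / sum(weights)

# Hypothetical airline: 82% on time, 1.1 denied boardings per 10,000
# passengers, 3.5 mishandled bags per 1,000, 1.2 complaints per 100,000.
weights = (8.63, 8.03, 7.92, 7.17)  # illustrative weights only
score = quality_score(0.82, 1.1, 3.5, 1.2, weights)
print(round(score, 3))
```

Notice that the whole exercise reduces to one dot product over four government numbers, which is why the rating can never tell you anything the monthly DOT report didn't already.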

So what we have is really just a rehashing of what the government tells us each month. Is that useful? Not particularly. Let’s go over a few of the problems. Keep in mind, most of these complaints are about the underlying data as collected by the government. But since that’s what the AQR uses, no distinction is needed.

  1. Operating Airline – The biggest problem of all is that this is based on operating airline, not marketing airline. What I mean by that is this. When you buy a ticket on United, how often are you actually flying on United Express? There are more United Express flights than mainline United flights, so chances are very good you’ll end up on an Express partner rather than on United itself. But the “United” we see here covers mainline only.
    Can you add up the airlines by learning which ones operate for which big airline? No. Think about SkyWest, an airline which flies for Alaska, Delta, United, and US Airways. There’s no way to break apart what’s related to which marketing airline.
  2. The 1% – We should feel lucky that we have the opaque SkyWest data because many of the regionals don’t report at all. Airlines with more than 1 percent of domestic scheduled passenger revenue have to report, and that leaves out many of the little guys. There’s no Comair, no Pinnacle, no Trans States, etc. So it’s a very incomplete picture. Don’t go looking for Spirit or Allegiant either. (Virgin America just got big enough to start reporting.)
  4. Bags per People – People often like to look for trends in this report, but that’s hard to do when it doesn’t adjust for business changes. The most glaring example is in the baggage department. Mishandled bags are measured per 1,000 passengers. In other words, it doesn’t matter how many bags are actually checked; the rate only depends on how many people fly. That’s hugely problematic now that airlines have spent the last few years charging for bags. People check fewer bags, so you’d expect fewer mishandled bags. But this report can’t catch that, because it only looks at mishandled bags per total passengers.
  5. The Few, The Loud – Complaints are another issue here. When people complain, most complain to the airlines directly in the hope of getting something in return. The number of people who get so annoyed that they bother complaining to the DOT is minuscule. There were 7,465 complaints recorded by the DOT against US airlines last year. That may sound like a lot until you realize more than 630 million passengers were enplaned. That means only about 0.001% of passengers complained. With numbers that small, it’s very hard to find any true trend when comparing year over year.
    Delta, for example, had the highest number of complaints of the airlines last year with 1,406. One angry 777 full of passengers could band together and complain to the DOT and that would show as a 20 percent increase if everything else held the same. If you’d like to read more about this, take a look at my post ripping into the Lufthansa catering union for really bad analysis.
  5. Too Broad – One other issue here (and with most ranking efforts) is the broadness of the results. Hawaiian shows up as #2 in the survey, and a lot of that is going to be thanks to its stellar on time performance. But as I’ve written before, that on time performance rides on the coattails of the interisland operation while the long haul flying lags significantly.
These are just a few reasons why I don’t like the AQR, but don’t think there’s another survey I like better. There isn’t. I hate all of these forced rankings because they’re rarely indicative of what you’re actually going to get when you fly.
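
To see just how fragile the complaint numbers are, here's the back-of-the-envelope math in Python. The DOT totals are the ones quoted above; the 777 planeload size is my assumption (roughly 300 seats in a typical configuration):

```python
# Back-of-the-envelope math behind the complaint numbers in the post.
# The DOT figures come from the post itself; the 777 seat count is an
# assumption for illustration.

dot_complaints = 7_465           # DOT complaints against US airlines last year
passengers = 630_000_000         # passengers enplaned that year
rate = dot_complaints / passengers
print(f"{rate:.4%} of passengers complained")    # about 0.0012%

delta_complaints = 1_406         # highest airline total that year
angry_planeload = 300            # hypothetical full 777, all complaining
jump = angry_planeload / delta_complaints
print(f"one planeload adds a {jump:.0%} swing")  # about a 21% jump
```

With a base that small, a single coordinated planeload swings the year-over-year number by a fifth, which is exactly why these comparisons are so noisy.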
