As we get toward the end of the year, I thought it would be fun to dig into airline operational data and see just what kind of year everyone has had. Using the data from masFlight.com, I pulled together on-time and cancellation performance for all the big US airlines from January 1 through December 15. What did I find? It’s not a surprise, but Delta is just an absolute rock star. And, uh, you might want to avoid American Eagle and United Express. [Edit: I’ve updated to include Alaska, which I think we can safely call the Welterweight Champion]
Let’s take a look at the overall chart, and then I’ll talk about each airline in detail below.
I’m not one to fawn, but damn, Delta. That’s just impressive. As you can see, I broke out mainline and regional operations here so you could see the difference. I could have combined the two, but let’s be honest, you know if you’re booking mainline or regional so it’s worth seeing separately. Now, let’s talk about these airlines in order from awesome to suck.
Yes, Alaska’s performance is head and shoulders above the rest, but let’s be realistic here. Alaska operates most of its flights in the uncongested West. Certainly the airline deserves credit for being able to run a heck of an operation in the punishing Alaskan climate, but it also helps having more than 15 percent of your capacity in Hawai’i. I don’t want to take anything away from Alaska here, because the airline is clearly doing a fantastic job. So let’s just call them the Welterweight champion. The Heavyweight champion, however, is…
For the year, Delta has managed to get 84.2 percent of its mainline flights to their destination within 14 minutes of schedule. This is an airline that has hubs at perennially delayed LaGuardia and JFK. On top of that, only 0.8 percent of flights were canceled. That’s all well and good, but it’s really the Delta Connection numbers that jumped out at me since we know that network airlines always dump crappy operations on to their regional partners. But Delta Connection had 81.6 percent of flights arrive on time and 3.1 percent of flights were canceled. For a regional airline, that’s stellar. (Heck, that on-time performance is way better than United and American mainline performance.)
Of course, not all regionals are equal. GoJet ran a poor 73.7 percent on-time operation and canceled 5.3 percent of flights. That’s not good. At the other end, SkyWest ran a stellar 87.9 percent on-time operation with only 1.2 percent of flights canceled. But that’s because SkyWest mostly operates in the West where the weather is better and the airports are less congested. One other thing to note: while Shuttle America did have a good on-time rate of 81.6 percent, it canceled 5.7 percent of flights, more than any other Delta regional.
Virgin America: Awesome
Not to steal Virgin America’s thunder, but the airline’s impressive operational performance looks a lot less impressive after seeing Delta and its much more complex operation. Virgin cancels almost nothing (0.4 percent), and it runs on-time 82.5 percent of the time. That is pretty great. But Virgin’s performance ultimately lives and dies with SFO. (The weather has sucked there in December, and Virgin has the worst on-time performance from December 1-15 at 61.7 percent.) If SFO weather is good, then things at Virgin America are good. If not, watch out.
US Airways: Pretty Damn Good
Though US Airways doesn’t belong in the same awesome category as Delta, it’s still running an impressive operation in its own right. Mainline runs on time 81.8 percent of the time and cancels only 1.6 percent of flights. Even Express runs on time 78.8 percent of the time while canceling 4.1 percent of flights. Considering how many little props and regional jets they have buzzing around the busy northeast, that’s pretty good. It’s not Delta-good, but it’s good.
I figured Piedmont with its little Dash-8 props would be the worst of the bunch, but it’s not. It actually had a 77.4 percent on-time record with 6.1 percent of flights canceled. That cancellation rate is terrible, but it’s not the worst on-time performance. The worst? Air Wisconsin ran a weak 73.9 percent on time with 4.9 percent of flights canceled. SkyWest again takes top honors with 89.2 percent of flights arriving on time and only 1.4 percent of flights canceled.
United: Not So Great
There’s a pretty big gap between US Airways and the next best performer, United. United’s mainline operation ran on time 77.8 percent of the time with 1.4 percent of flights canceled. For mainline, that’s lacking. The regionals are where things really fell off a cliff, however. United’s regionals ran on time 72.4 percent of the time and canceled a painful 5.3 percent of flights. Ouch.
Looking at the regional partners, I’ll throw out Cape Air’s solid results because those are for a handful of flights in and out of Guam. It’s not impacting most of us and it’s a whole different (translation: easier) operating environment out there. (That’s also why I didn’t include Hawaiian in this chart.) The next best operator? Republic comes in with 76 percent of flights on time and a 5.1 percent cancellation rate. The best cancellation rate belongs to SkyWest which canceled “only” 3.8 percent of flights while running on time 73.1 percent of the time. Who sucked? I think it’s worth calling out the three worst airlines here.
In third place, we have GoJet with only 66.7 percent of flights on time and 4.6 percent of flights canceled. In second place, we have CommutAir with 66.6 percent of flights on time and a whopping 11.3 percent of flights canceled. Wow. And in first place, the worst performer is Trans States, sister carrier to GoJet, with a horrendous 65.7 percent of flights on time and 6.2 percent of flights canceled. Honorable mention certainly goes to Chautauqua which may have run on time 70.7 percent of the time but it canceled 14.9 percent of its flights. Let me repeat that. 14.9 percent of all Chautauqua flights operating as United Express canceled this year.
It’s important to remember that the mainline airline has a lot to do with the operational performance of its regionals. This reflects very poorly on United.
American: Not So Great
In the end, American ended up looking a lot like United. Its mainline operation wasn’t great but the regional operation was abysmal. We’ve seen some signs of life here since the merger, but man, there’s a long way to go. Mainline operations were on time 77.7 percent of the time with 1.4 percent of flights canceled. Meanwhile, the regionals were on time 72.2 percent of the time while canceling 5.2 percent of flights. I told you it looked a lot like United.
Looking at the regionals themselves, Chautauqua was downright scary with only 48.8 percent of flights on time and 11.8 percent of flights canceled. Geez. That is brutal. But American is also somewhat unique in that this year, three quarters of regional ops have been done by Envoy (formerly the airline named American Eagle). That’s going to change but for now, as Envoy goes, so goes American’s regional operations. And Envoy had 73.9 percent of flights on time while seeing 5.2 percent of flights canceled. Just as United’s regional operation reflects poorly on United, so does American’s. The difference is that American has a new team in there actively working on the operation.
JetBlue: Not So Great
JetBlue is a lot like Virgin America. If JFK goes down, so does JetBlue. Even though it has diversified its operations into Boston and Florida/Caribbean, the impact is still pretty severe. And that’s why JetBlue ran on time only 77.1 percent of the time this year. It canceled 2.3 percent of flights. I don’t know if that’s going to really change unless JFK magically gets better.
Frontier: Not So Great
Frontier has been changing into an ultra low cost carrier and part of that apparently involves running a less than stellar operation. For the year, Frontier had 75.3 percent of flights on time but it only canceled 0.4 percent of flights. So as is often the ultra low cost carrier mantra… you’ll get there, but we just don’t know when. I should point out that Frontier does seem to have improved a bit lately. In the first 15 days of December, Frontier is up to 79.6 percent of flights on time. Let’s see if this holds, and hopefully improves.
Allegiant: Not So Great
Allegiant’s year looks a bit different. Sure, it ran only 75.1 percent of flights on time while canceling 0.7 percent. That’s good enough to be in Frontier’s league. But if we look at the first 15 days of December, Allegiant appears to have had some problems. It ran on time only 67.3 percent of the time while canceling 2.7 percent of flights. Let’s hope that’s just a short term glitch.
Southwest: Suck

I wrote about how Southwest killed its on-time performance earlier this year. It was supposedly fixed in August, but that means we’re stuck with 7+ months of garbage to drag them down. Southwest ran 74.4 percent of flights on time and canceled 1.3 percent. That is really not good for an airline that used to be on time all the time, and it’s why Southwest falls into the “suck” category. It now wants to be operating above 80 percent on time, but a quick look at the first couple of weeks in December shows that might be tough. In the first 15 days, it has a 76.3 percent on time rate with 1.1 percent of flights canceled. Sure, weather plays a part in that, but we’ll have to watch these guys closely. Hopefully its worst days are behind it, but it’s far from the front of the pack.
Spirit: Suck

Bringing up the rear is no surprise. It’s Spirit. Spirit has said it’s going to run its airplanes hard to get as much utilization as possible, but that means it won’t be at the top when it comes to on-time performance. With only 72.8 percent of flights running on time, that certainly shows. But it only canceled 0.8 percent of flights. This is that same mantra we talked about earlier. Spirit will get you there… eventually.
And there you have it. Let’s end back on a high note with Delta. Sometimes these numbers are hard to really grasp since they’re just percentages in the ether. So let’s look at it this way. If Delta mainline ran an operation with Southwest’s on-time performance numbers, it would have seen nearly 250 additional flights delayed every day, over 80,000 for the year through December 15. If you assume, say, 175 people on those airplanes, then you have nearly 16 million more people a year who would be delayed.
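The arithmetic above is easy to reproduce. A quick sketch, with the caveat that the daily flight count below is an assumption for illustration (only the on-time rates and the 175-passenger figure come from the post):

```python
# Back-of-the-envelope sketch of the Delta vs. Southwest comparison.
# DAILY_FLIGHTS is an assumed figure, not actual masFlight data.
DELTA_A14 = 0.842       # Delta mainline on-time rate
SOUTHWEST_A14 = 0.744   # Southwest on-time rate
DAILY_FLIGHTS = 2500    # assumed Delta mainline departures per day
PAX_PER_FLIGHT = 175    # passengers per flight, per the post
DAYS = 349              # January 1 through December 15

extra_delayed_per_day = (DELTA_A14 - SOUTHWEST_A14) * DAILY_FLIGHTS
extra_delayed_total = extra_delayed_per_day * DAYS
extra_pax = extra_delayed_total * PAX_PER_FLIGHT

print(round(extra_delayed_per_day))      # 245 flights per day
print(round(extra_delayed_total))        # 85505 for the year
print(f"{extra_pax / 1e6:.1f} million")  # 15.0 million passengers
```

With roughly 2,500 daily mainline departures assumed, the figures land in the same ballpark as the post’s “nearly 250” extra delayed flights a day and the mid-teens of millions of passengers.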
Good work, Delta. You guys are killing it right now. Let’s hope others try to up their game.
What happened to Alaska?
Ok everyone. Alaska has been added at your request. Of course, it performed very well, but it has a much easier operation to run than an airline like Delta. So Alaska will be crowned the Welterweight champion.
I think most of us just thought it was odd that you initially excluded Alaska, since it’s a larger airline than Frontier, Allegiant, and Spirit.
Yeah, I know. It wasn’t on purpose but at least I fixed it quickly.
Where is Alaska Airlines? It is one of the major US carriers with excellent operational results. It is amazing to me that they have been excluded from this list.
Same question. I had always read that Alaska was close to the top on operational performance?
What’s so impressive about Delta? Yes, they were the best, but not by much! AA and UA were 77% on time and DL was 84%; AA and UA completed 84% of flights and DL completed 86%… what’s the big deal? Yes, American Eagle and United Express are a joke… what else is new?
Percent of flights completed is the secondary axis on the right. So DL is a little over 99%, UA a little under 99%, etc.
john – If Delta ran an operation using United’s numbers, it would have delayed more than 350 additional flights every single day (150+ mainline and 200+ regional). In addition, it would have canceled more than 160 additional flights every day (35 mainline and 125 regional). I’d say that’s a huge difference.
I know you’re trying to present a simple picture here, but this is really a much more complex issue. Any airline can have amazing operational performance if they lengthen their block times and turns, and set up shop at uncongested airports with bright blue skies all year (and conversely, tank them by doing the opposite). Not saying that’s what DL has done (I haven’t looked at the block/turn numbers for a couple years, and they certainly have hubs in some congested places, as you’ve pointed out) but without explicitly controlling for (at least) those factors, it’s hard to say that DL is really so much better except in the sense of “ooooh, big numbers.”
As a passenger, though, I only really care about whether the flight arrives when it says it will. So if Delta over-blocks to give themselves some breathing room (sacrificing aircraft utilization in the process), that’s fine with me. At least they’re giving me an arrival time I can count on and I will book accordingly.
True that, but the thing is, on-time arrivals are like the stock market — past performance is no guarantee of future gains. If US Air wants to be #1 in a quarter or year, they’re going to be. As a passenger, you have no idea what their operational philosophy is going to be from one quarter to the next.
I’m a math guy for a living, and if I were a betting man, the only numbers I could be confident in are the numbers for this month, which may or may not be available depending on 1) When in the month I book my ticket, and 2) Lead time on the ticket.
Yeah… I had an interview awhile back with US Air’s ops department, and they told me that A14 numbers are “purchased.” As in, you can be #1 if you want to be, by adjusting block times accordingly. But since pilot pay is a function of block time, padded schedules = costlier flights.
True… It’s more a matter of promising something, then delivering it.
I’m sure that if you tell people it is 8 hours to fly from Denver to Seattle they’ll realize you’re stupidly underpromising, and they’ll call you on it or book away to your competitor.
Andrew C – As LT_DT says, from a passenger perspective, they only care if they get there on time. And that’s the point of this post. But I wasn’t willing to just leave it at that. I don’t have the block performance numbers in front of me, but I do have D0. That’s departures on or before scheduled departure time. So that takes out any issues with block padding. You can see that chart here: http://bit.ly/13DwEhr
There are a couple things that stand out here. Look how much worse United is than American. Considering their A14 arrivals were similar, that means that United must be padding the heck out of its schedule compared to what American is doing. US Airways and Delta are much closer here when it comes to D0, so that makes you think that US Airways is padding less than Delta, but the difference isn’t nearly as striking as with United and American. And just look at Southwest. Atrocious.
But this is one area where I agree with the DOT. If you get where you’re going on time, that’s what really matters.
Interesting that American Eagle and United Express both do *better* than their mainline counterparts in the D0 statistic. So does that mean that both airlines schedule much longer turns than they need to and/or pad the schedules for the regionals less? Or just that regionals are much more likely to be put at the bottom of the priority list by the mainline master and therefore sit on the tarmac waiting for a takeoff slot after pushing back?
United’s mainline D0 below 50% is remarkable. Given that on-time arrival is over 75%, that means that more than half of UA’s late departures wind up on time. That’s some serious padding.
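The arithmetic behind that “more than half” claim can be checked quickly. The 48% D0 figure below is an assumed reading of the chart; only the A14 rate is stated in the post.

```python
# Rough check of the "more than half of late departures arrive on
# time" claim, using assumed figures: D0 (departed on or before
# schedule) of 48% and A14 (arrived within 14 minutes) of 77.8%.
d0 = 0.48
a14 = 0.778

late_departures = 1 - d0   # share of flights that left late
# At minimum, (a14 - d0) of all flights left late yet arrived on time,
# since on-time departures can account for at most d0 of the a14 total.
late_but_on_time = a14 - d0
share_recovered = late_but_on_time / late_departures

print(f"{share_recovered:.0%}")  # 57% of late departures arrived on time
```

So with those assumed inputs, well over half of the late departures are recovered in the air, which is what the comment means by serious padding.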
As a consumer, I do care about padding in that, given a choice and price being competitive, I’ll take the shorter scheduled flight if I have reason to believe that it will be on time (or no reason to believe that the more-padded schedule is more likely to be on time). But I certainly agree that arrival with respect to the schedule is the top-line number that should be reported.
Alex – It’s tough to really compare D0 and A14, because they’re very different. D0 means they have to leave at or before the scheduled time. So United could in theory just have a lot of flights that leave 5 or 10 minutes late and still arrive within 15 minutes of schedule. I guess a better comparison is D15, which frankly at United looks fairly similar to A14. So basically, a quarter of flights leave late but not too late. Another quarter of flights arrive even later. It’s just sloppy. That’s why you see a lot of airlines, including Delta and the new American (US Airways did this), really focusing on D0. Get those airplanes out on time and you don’t have to worry about delays nearly as much on the back end.
Impressive numbers for DAL; just don’t ask the pilots who retired prior to 2006 what they think of the company. As far as I know, they were the ONLY group to have their retirement terminated and turned over to the PBGC for pennies on the dollar. Now DAL is making billions. Go figure!
That’s because so many of those retiring pilots cashed out their lump sums and drained all the assets out of the pilot pension. Delta had to terminate the pilot pension to survive. It was an awful feature to have in a pension plan. It was negotiated by ALPA on behalf of the pilots. Shot themselves in the foot.
Eh, I’m not an actuary, but those lump sums are reduced to account for the fact that there’d be future growth of the money in the pension plan.
Only, this shouldn’t sink a plan.
JP, upon a little further checking, you might find that good ol’ UAL did the same thing. When I retired from UAL in 2000, the pension program was overfunded, and we all thought we had nothing to be concerned about. Suddenly, Glenn Tilton took over in Sept. 2002, declared bankruptcy in Dec. 2002, and in the intervening period before coming out of bankruptcy, the PBGC assumed many pension responsibilities from UAL. Tilton, of course, walked away with a handsome amount of money when he left, for screwing all the employees and retirees. There just ain’t no justice, it seems!
Jim, there was this little disruption in September 2001 that caused a downturn for airlines and the financial markets in general. It’s disingenuous to put “overfunded in 2000” and “bankruptcy in 2002” together without acknowledging the shock in the middle. I’m sure some of that funding was in stocks and bonds, which also took a hit at the end of 2001.
Andrew C: it’s anecdotal, but when I worked them I found that UA block times were enormously padded out of ORD, while MSP was pretty spot on for DL. UA, being dumb, would be pissed when they’d pad block half an hour and we’d then land early, because that would screw up their gate planning.
The AE and UX numbers are proof positive that if you staff threadbare, provide minimal resources, and constantly threaten every job and contract with termination unless cuts are made, you will have an inferior product.
Thanks CF. Nobody gives a better rundown of this than you do.
Of course, we never buy a ticket to OVERALL.
I generally fly UA. Amazing how many times you check its website to see on-time performance for a flight and are greeted with: “There is no recent record of delay and cancellation percentages for this flight.”
“Geez, you’ve been flying between these two points at nearly the exact same time for the last 20 years! No data?” Oh yes, UA is prone to use 30 different flight numbers for basically the same flight, one for each of the 30 days of any given month, so no one flight number ever accumulates “significant” historical data.
Not very helpful! Perhaps airlines could program their reservations systems to show data by time groups for each city-pair? (Dulles to O’Hare, departing between 7am and 8am, or whatever. Skip reference to any given flight number, as any flight number’s data are often insignificant.)
Then there is the dearth of regional on-time performance data, even though airlines are using regionals more and more, even between hubs. This data should be shown.
Then, it sort of makes me chuckle when I see UA giving me a percentage of flights “on-time,” those arriving within 15 minutes of scheduled arrival time, and the percentage “late,” those arriving more than 30 minutes after scheduled arrival time. That leaves the flights neither “on-time” nor “late,” that is, those arriving more than 15 but no more than 30 minutes after the scheduled time: a little late, but not too late. Surely DOT needs to begin a rulemaking, probably taking a year or more, to come up with a category name acceptable to all interested parties!
Umm where’s AS?
You might want to go through your data again. Chautauqua hasn’t flown for American or United since this summer.
yxq400 – What’s your point? This data is for the entire year through December 15.
Excellent work, Cranky!
Would love to see AS added. Would also be interested in seeing the numbers combined for the mainline+regional carriers, but “scaled” by number of passengers. E.g., UA has a ton of regional flights (> 50% of flights, right?), but the bigger planes of mainline mean more pax overall.
Oliver – So the breakdown of regional flights is as follows. These are the percent of total marketed flights that are operated by regionals.
American – 45.4%
Alaska – 45.9%
Delta – 49.4%
US Airways – 58.7%
United – 64.4%
Here’s the combined chart without any weighting: http://bit.ly/13DBfQy
I don’t know that weighting really does any good, but if you want to see something weighted, let me know what.
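For what it’s worth, a passenger-weighted blend is easy to sketch. The average seat counts below are assumptions for illustration; only the on-time rates and the flight shares come from the post.

```python
# Sketch of a passenger-weighted mainline + regional on-time rate.
# Seat counts are assumed averages, not actual fleet data.
def combined_a14(mainline_a14, regional_a14, regional_flight_share,
                 mainline_seats=160, regional_seats=65):
    """Weight each operation's A14 by estimated passengers carried."""
    regional_pax = regional_flight_share * regional_seats
    mainline_pax = (1 - regional_flight_share) * mainline_seats
    w_regional = regional_pax / (regional_pax + mainline_pax)
    return w_regional * regional_a14 + (1 - w_regional) * mainline_a14

# United: 77.8% mainline A14, 72.4% regional, 64.4% of flights regional.
print(f"{combined_a14(0.778, 0.724, 0.644):.1%}")  # 75.5%
```

Even with nearly two-thirds of United’s flights on regionals, the bigger mainline cabins pull the blended number back toward the mainline figure, which is Oliver’s point about scaling by passengers.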
I’m just going to take a crack at your subjective assessments. For a major airline to score “awesome”, I’d want to see A14 numbers north of 85% — >90% would really catch my eye. For completion factor, I’d want to see a cancellation rate of less than 0.5%.
Frankly, on a subjective basis, anything in the 70’s is “fine”, as is a cancellation rate less than 2%.
Quite frankly, United had a better completion factor than US Air (1.4% canceled vs 1.6%), and their A14 rate was only 4 points worse. Completion factor is king — there’s no way they should beat out US Air in that department, lose on the A14 by only a few points, and still get a “Not So Great.”
I don’t think Spirit and Southwest deserve “suck” ratings either — not with Spirit’s completion factor being where it is.
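Dan’s rubric can be written down as a simple classifier. The cutoffs are his; the labels are the post’s, and the exact boundary handling (e.g., where “the 70’s” ends) is my guess:

```python
# A sketch of the commenter's subjective rubric. Cutoffs are his;
# the boundary handling is a guess.
def rate_airline(a14: float, cancel_rate: float) -> str:
    """Map an A14 on-time rate and a cancellation rate to a label."""
    if a14 >= 0.85 and cancel_rate < 0.005:
        return "awesome"
    if a14 >= 0.70 and cancel_rate < 0.02:
        return "fine"
    return "suck"

print(rate_airline(0.842, 0.008))  # Delta mainline -> fine
print(rate_airline(0.728, 0.008))  # Spirit -> fine
```

Note that under these cutoffs Delta mainline’s 84.2 percent just misses the “awesome” bar, and Spirit’s low cancellation rate keeps it out of “suck,” which is exactly the argument Dan is making.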
Dan – I disagree on completion factor. That is more important for an airline that doesn’t do connections, but delays for heavy-connecting airlines like the legacies and Southwest are going to cause many more issues. That’s actually why I’d consider Southwest to be the worst of the bunch here.
Now regarding United, you seem to be ignoring regional performance which is a huge issue.
When I worked for UAX, fully half of our bonus plan was contingent upon our completion factor. Now, that may have been a factor of how our agreement with UA was structured (aka “we don’t get paid if we don’t fly”), and not necessarily a reflection of what was important to the passenger.
As for my ignorance, it just wasn’t clear from your post how much you were considering both regionals and mainline as separate/combined in your subjective rankings.
I will admit, though, that you simply can’t please everybody. When the parent airline controls so much of your schedule, it’s not really fair to let them make you the sacrificial lamb to improve their rating and let you take the heat in public for running a crappy airline. Which I know is what you’re doing here.
I’ve been doing this crap for almost 15 years (first for an airline, now for some government faction) and I have yet to come across an irrefutable way to analyze the combined performance of a regional and its major. Weight it based on ASM? RPM? Pick one and go with it, but somebody will shoot holes in it. And they’ll all be right. Mostly.
Yeah, it’s true that there’s no perfect metric. It works much better on a city level. We did an analysis for some people in Richmond that use Cranky Concierge, and it showed very clearly that UA cancels those RIC flights all the time. I would never book United if I lived in Richmond. Of course, it’s going to look different in some other city. So broad metrics like this are really just going to show how well an operation is running overall. The relevance might not be there for every person.
I also meant to say that completion factor is king. Cancelled flights generally mean a huge time suck for a passenger. A14? Well, you can miss that mark by a minute and still make your connection. And yes, misconnects are a time suck too. If I were ranking airlines, I’d do a weighted comparison, biasing completion factor over A14.
Cranky, can you provide a little context on what constitutes “good?” Based on how you categorized airlines, 80%+ on-time appears to be good; 75%-80% is mediocre, and under 75% is suck. That doesn’t appear to be much of a variance on its face, though I guess when aggregated over thousands of flights a few percentage points do make a good difference.
TRC – It’s all relative so it’s not a hard metric. The point is really how you compare to the industry. And a few points does make a huge difference when you’re talking about such large volume.
Brett — good article, but I’m disappointed you didn’t cite the Chicago fire as affecting United and American’s performance. You made the same mistake the FAA did for a few months. I would *not* be surprised if that actually had a several-percentage-point effect for the year.
No doubt, Delta is far superior to United/American and I’d be surprised if that changes, although it’ll be interesting to see what Parker does with AA. That said, I don’t think it’ll be as easy to work the US magic, since three of US’s four hubs are at airports with comparatively excellent on-time records (PHX, CLT, PHL).
BigDaddyJ – Hate to burst your bubble, but the impact was just about nil. That was only bad in Chicago for about a week before things started to recover. There was lingering effect but it was really not that much in the grand scheme of things. I pulled the data for January 1 – September 25 and then from November 1- December 15. That’s a pretty conservative chunk I cut out.
Doing that, United’s on-time performance “soared” from 77.8% to 77.9% while completion factor actually dropped from 98.6% to 98.5%. United Express on-time performance went from 72.4% to 72.3%, with completion factor similarly flat.
Meanwhile at American, on-time rates “jumped” from 77.7% to 77.8% while completion stayed flat at 98.6%. American Eagle saw on-time performance go from 72.2% to 72.6% with completion factor climbing from 94.8% to 94.9%.
It had no real impact.
Meanwhile, you’re suggesting that Philly is an on-time airport? That place is a mess. The difference is that the majority of operations are from US Airways, so its good operational performance makes the whole airport look better. DFW and Miami should easily be able to operate better than Philly. And even O’Hare with its new runways should be able to perform well.
I think A14 is an okay metric, but for me personally I’m more curious what A60 is. I don’t fly much, but I’m often an evening flyer, and on direct flights. So if I get there within an hour of my scheduled arrival I’m generally happy.
I’d also be curious if airlines track on-time completion for specific reservations. Back in the mid-80s or whenever they designed the current system, tracking on-time completion for specific flights seemed like enough, but it’s within the realm of possibility to see how many customers arrived within 14 minutes of their scheduled arrival time at their final destination. It’d be a big job, but you could have a computer do it offline.
If you do that, you get into CF’s comment to me earlier about connections. At A14, you’re probably making your connection. At A60, you’re much more likely to miss it.
I’m curious, too, if the airlines track delay at the reservation level. I’m also curious how they would account for getting stuck at the hub overnight. No sane CEO is ever going to consent to any sort of statistic that allows their passenger to accumulate hundreds of minutes of delay.
I’ve seen some academics tackle this, and the FAA has some simulation software that they’re using to calculate passenger value of time for some NextGen activities. So people look at it.
Dan, it is true that at A14 you’re probably making your connection, while you’re much more likely to miss it at A60.
But slightly less than half of passengers on any given flight are flying into a hub to connect to another flight. (Presume that most connecting itineraries go through exactly one hub; sure, some connections go through two or more hubs, but there are many more direct flights as well.) So A60 is important in my mind for “how late” the airline is. If you’re constantly running A240 (4 hours late), then that’s horrible! But if you’re running A30 more often, that’s something the flying public is more likely to accept.
I’m sure no sane CEO would want a statistic that shows how many minutes they’ve delayed their passengers in total, but any sane CEO should be asking for that number or proxies of that number for internal tracking and deliberations.
It’s interesting to watch the criticism of the story. I’m not sure how many commenters have ever worked for an airline, particularly in an airline ops control environment. I work for a major airline and have previously worked for large and small carriers prior to my current carrier.
In my opinion, the information presented is strictly data. There are always “ifs”, “buts”, and “so whats.” However, running any airline is a very complex endeavor. It is never as simple as one would perceive. You are at the mercy of weather, maintenance, politics, government intervention, etc. Trying to coordinate and reduce the impact to the overall customer base and airline as a whole in the face of all these factors is no easy feat.
Cranky’s data provides a snapshot of the industry and who does well and those that could do better. It gives the lay person another morsel of information on how to potentially base your flight purchase. Do you need to be at your destination on a particular flight and on time? Or are you willing to roll the dice on the low fare carrier and get “same day” service?
As an airline employee of 20+ years, I marvel at what Delta has achieved. Planes break, regional partners can be unreliable, storms erupt, governments go on strike, etc…yet they complete their flights and nearly 85% of them are on time. This coupled with 80%+ load factors does make it “awesome.”
I, and I believe many, if not most, of the posters on this blog have a pretty good appreciation of what you employees go through every day. No, we don’t know everything, but things seem to go pretty well, and then, well, everything goes haywire and we become, well, slightly irate, damn the data!
I fly UA, almost always out of IAD, not a small-time operation. I tend to forget all the great service I get, but then there is that one flight, that one day, when you say: does anyone know what is going on here? I wonder why UA can’t tell me whether or not the flight is operating and, if so, when. “Well, there’s the weather issue at Podunk; there’s the crew who got in late last night; well, that’s something related to our regional (rent-a-plane-and-crew, DBA United Express) and, well, you know, what can we do with them! But, here are our fabulous on-time statistics!”
Then, when I’m flying on someone like SunAir, the people I’m talking to, when they aren’t disposed to handling the baggage and ramp operations for the one or two airplanes they have, simply seem clueless about whether, and when, a flight is operating. I see little difference in these matters between a big and a little carrier operation.
I know, it sounds like a rant, but really, this “is the flight operating and if so, when” is more irritating than questions of fare. The on-time performance data are nice, but it comes down to something so specific that the data are basically useless.
Anyway, I appreciated your comments.
It was data with some subjective commentary (note the headers). As a numbers guy, it’s easy to poke holes in someone else’s analysis. CF posted some numbers, said that somebody did awesome, and somebody else sucked. “Awesome” and “sucky” aren’t data. So it’s not “strictly” data as you allege.
And honestly, I’m not sure what this actually tells a lay person about which airline to fly. If US’s goal is to be #1, it can “purchase” that rating. What happens when it stops doing that? Are you going to know when? At an aggregate level, this information tells a passenger nothing. You need route and time-of-day performance to make any sort of comparison.
I used to fly NW back in the day, and they had *excellent* operational performance on the route I flew — always a connection and never a delay. I was based in LAX, making mainline-mainline connections. I’m going to choose that over a mainline-RJ connection any day of the week.
As a sorta-frequent Southwest flier, the August changes have helped some, but not entirely. The culprit seems to be unrealistic gate turn times. My last experience was pretty instructive and typical. I was going DAL-PHX with a connection to SMF. The incoming flight was from BHM. It lands on time at 08:55, but is scheduled for a 30-minute turn. Of course, with a full 737-700, that doesn’t happen; we push back 14 minutes late, and arrive at PHX 16 minutes late. The plane was then scheduled to hop to LAS, but with a 45-minute turn there, most of the delay was made up and it made it to LAS only 5 minutes late, so technically on-time. Anecdotally, my recent experience has been that flights scheduled with turns north of 35 minutes tend to do OK, but shorter ones still suffer from delay creep through the day, and there are still a good number of short (30 minutes or less) turns out there.
Thanks for this… I wish the DOT baggage performance report broke out mainline and regional stats the way your comparison does.
I find it interesting that the few times I’ve flown SWA, I have seen delays due to “connecting passengers.” Yet UA and other mainline airlines don’t seem to wait for connecting passengers (ask my fellow passengers, about 25 of us, who missed our connection to MEL at LAX by 11 minutes).
Yet with all of this, SWA seems to have a higher customer satisfaction score. So I’m wondering if the on-time performance is more marketing than anything, and whether airlines are being held hostage by these numbers.
I can honestly say UA would rank a lot higher in my book had I not had to spend an extra 24 hours in LAX because they refused to wait 11 minutes for us (due to delays on UA’s inbound aircraft from ORD and MCO).
Oh, and for the record: of the 70 segments I flew on UA this year (mostly out of ORD), I was significantly delayed twice, and only two more times by more than 15 but less than 45 minutes. So, about a 94% on-time rating.
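For what it’s worth, the commenter’s math checks out under the usual DOT convention that a flight is “on time” if it arrives within 14 minutes of schedule; a quick sanity check (the variable names are mine, just for illustration):

```python
# Sanity check on the commenter's personal on-time rate:
# 70 segments flown, with 2 significant delays plus 2 more over 15 minutes.
segments = 70
delayed = 2 + 2  # delays counted against on-time (more than 15 minutes late)
on_time_rate = (segments - delayed) / segments
print(f"{on_time_rate:.1%}")  # prints 94.3%
```

So 66 of 70 segments on time rounds to the “about 94%” the commenter quotes.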
Why did SkyWest perform so much better for Delta than for United?
A lot of regional performance is based on the whims of the mainline airline. That carrier will dictate when things need to be canceled in a weather event. So with SkyWest doing more flying in Chicago for United, it has more flying at risk than it does for Delta, which doesn’t use it much at congested airports.
How would the statistics change if you weighted US air traffic system delays into an airline’s hub operational performance? Take Chicago, for example: American and United both use it as a hub. If they were given a percentage buffer for ATC delays beyond their control, what would that do to their rankings?
It’s a good question, but I’m not going to be able to put the resources into pulling that information. I would be surprised if it had any severe impact on the results.
The reason for Alaska’s great performance is not the markets it flies in but the $40,000,000 it invested in new satellite navigation technology that allows it to fly in more questionable weather. United is too cheap to do this.