DOT Finally Makes Operational Statistics Accurate

Government Regulation, Operations

It took years of fighting, regulation, and delays, but the DOT has finally started to publish a useful look at airline operational performance.  Now you can see the stats by brand (including regional airlines) and not just by operating airline.  I believe this rolled out last month, and the data has been published going back to the beginning of the year.  That means when you see an airline’s numbers now, they’ll include the massive regional operations.

There are five airlines impacted by this change. 

  • Alaska now includes regional operations by Horizon, PenAir, and SkyWest
  • American now includes all American Eagle flights operated by others
  • Delta now includes all Delta Connection flights operated by others
  • Hawaiian now includes ‘Ohana by Hawaiian flights operated by Empire
  • United now includes United Express flights operated by others

This does not include codeshares operated by airlines under their own brands.  For example, United won’t include Lufthansa-operated codeshares, because those are branded as Lufthansa when flying.  Here’s how things break down.

On-Time Performance

So far, twin out-performers Delta and Hawaiian seem to be taking the hardest hit when it comes to on-time performance while Alaska has actually benefited from the change.

Below you can see the old mainline-only performance on the left with the total mainline + regional number on the right.

Historically, airlines usually do better operationally when it comes to mainline flying than they do with regionals.  A lot of that is by design.  If an airline is facing bad weather in congested airspace, it has to make decisions on which flights to cancel or delay.  More often than not, the smaller regionals are sacrificed so that fewer people are impacted.

With that in mind, it might seem strange that Alaska actually improved its numbers when regionals were added in.  There is a reason for that: the numbers are skewed because Virgin America wasn’t added to Alaska’s totals until April.  Up until that point, Alaska mainline was the same as or better than its regionals, so it’s the old Virgin America operation that’s pulling down mainline performance.

All the other airlines saw the regionals drag down their numbers, but none was hit harder than Delta.  Delta’s regionals did not fare well in the first quarter of the year, but they picked up steam after that.

Interestingly, United and American both followed similar tracks with the mainline doing far better at the beginning of the year.  But by July, regionals were actually outperforming mainline.  


Cancellations

The cancellation numbers aren’t aggregated in the report.  I could add them all together, but July is good enough for this post.

Hawaiian and Delta both had nearly no mainline cancellations (technically 0.1 percent), but regionals were higher.  For Hawaiian, its small regional ‘Ohana had an astronomical 3.9 percent cancellation rate.  Delta’s regionals had a 1.5 percent rate.  That brought the totals up for both airlines.

United had a good 0.7 percent rate on mainline, but regionals at 2.5 percent made for a large increase overall.  And American mainline was awful at 2.3 percent, but the regionals were worse at 3.6 percent.  This made American look even worse, naturally.

Lost Baggage

It looks like lost baggage rates have not been combined with regionals.  That’s frustrating since there are some really awful regional performers.  The worst, Envoy, is easy to attribute since that airline only flies for American.  Its rate of 6.37 reports per 1,000 passengers in July is easily the worst of the pack.  But ExpressJet is next with a rate of 5.46, and we don’t know how that breaks down among partners since it flies as American Eagle, Delta Connection, and United Express.


Involuntary Denied Boardings

This metric bucked the trend in that not all airlines saw regional performance hurt them, at least not looking at second quarter 2018 numbers.  Sure, American saw its rate of 0.06 involuntary denied boardings (IDBs) per 10,000 passengers rise to 0.14 when regionals were added in, but that is fairly minor.  Meanwhile, United was at 0.01 across the board.  Delta had 0.00 (rounded down) for mainline but 0.01 for regionals, so it wasn’t going to move the needle.

The biggest impact was on Alaska.  It had a 0.37 rate for mainline, but its regionals were at 1.49.  The combined 0.61 rate was much worse than anyone except Spirit, which was at 0.63.
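That combined figure is just a passenger-weighted average of the mainline and regional rates. Here’s a minimal sketch of the arithmetic; the 78/22 passenger split is my own assumption for illustration (DOT publishes the actual enplanement counts), but it lands close to the reported 0.61:

```python
def combined_rate(mainline_rate, mainline_pax, regional_rate, regional_pax):
    """Blend two per-10,000-passenger rates, weighted by passenger counts."""
    events = mainline_rate * mainline_pax + regional_rate * regional_pax
    return events / (mainline_pax + regional_pax)

# Hypothetical split: mainline carries ~78% of Alaska's passengers.
alaska = combined_rate(0.37, 7_800_000, 1.49, 2_200_000)
print(round(alaska, 2))  # ≈ 0.62 with this assumed split
```

Whatever the true split, the blended rate always falls between the two inputs, which is why Delta’s 0.01 regional rate was never going to move its needle.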


Complaints

While DOT did start showing total complaints including regional operators, it didn’t normalize that stat by creating a rate of complaints per 100,000 passengers as it does by operating carrier.  That’s somewhat puzzling since the data has to be available, but at a quick glance, it doesn’t look like there were any particular outliers anyway.

In the end, does this make a huge difference?  Well, not for me or for readers of this blog, because I’ve long compiled operational stats from masFlight that include regional operators.  But for the rest of the US that might only see this data, it does matter.  Airlines should be responsible for their regional carriers.  Delta has already made a big effort to get regionals performing up to mainline standards, but others need to follow.  Shining a light on this data will only help make that happen.


30 comments on “DOT Finally Makes Operational Statistics Accurate”

  1. When will DOT start tracking mishandled bags per 1000 handled bags?

    That would be much more useful than a bags/pax calculation, since new baggage fee policies can change the average number of bags checked per passenger pretty significantly, and relative carrier size can make a difference.
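A toy calculation makes the commenter’s distortion concrete (all numbers here are invented): hold the airline’s per-bag handling quality fixed and let a new fee halve the bags checked per passenger, and the reported per-passenger rate halves too, with no change in actual performance.

```python
def report_rates(passengers, bags_per_pax, mishandled_per_bag):
    """Return (mishandled per 1,000 bags, mishandled per 1,000 passengers)."""
    bags = passengers * bags_per_pax
    mishandled = bags * mishandled_per_bag
    return mishandled / bags * 1000, mishandled / passengers * 1000

# Before a bag fee: one checked bag per passenger on average.
print(report_rates(1_000_000, 1.0, 0.004))  # per-bag ≈ 4.0, per-pax ≈ 4.0
# After the fee halves checked bags: same handling quality, per-pax rate halves.
print(report_rates(1_000_000, 0.5, 0.004))  # per-bag ≈ 4.0, per-pax ≈ 2.0
```

The per-bag number is the one that tracks handling quality; the per-passenger number conflates it with checked-bag behavior.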

    1. While we’re at it, in an ideal world I’d like to see metrics on the number of carryon bags that were checked at the gate, excluding things like strollers and carseats that were planned to be gate-checked.

      I’ll be the first to admit that this may well be too much of a paperwork burden for the airlines to compute, but it’s a key pain point for many customers, in addition to slowing down airplane turns, and one that I hope airlines have some internal metrics on already.

      I know it’s a low bar, but I’m almost surprised that a Congressman hasn’t raised a stink over this after being forced to gate check their carryon bag.

      1. Are you only looking for forced (invol) gate checked bags, or for every bag? I often hear announcements at the gate encouraging passengers to gate check their unwanted bags ahead of boarding when the flight is full. Would you count those?

        1. Regarding “voluntarily” gate-checked bags vs “involuntarily” gate-checked bags, I’d be open to either. Maybe it’s just me, but the line often gets a little blurred, as I don’t see a ton of difference between gate agents “strongly encouraging” pax to check their bags at the gate, without any other incentive to do so, and pax having to fight the current and drag bags out of the plane to the jetbridge when the overhead bins fill up at the last minute.

          As mentioned, I know it wouldn’t be a perfect metric, and probably wouldn’t be worth the hassle to collect the data, but I’d like to think that someone has collected somewhat similar data in small-scale studies and tried to come up with (or test) ways to reduce the delays that overhead bin congestion contributes to boarding process and airplane turns. Maybe they aren’t a significant contributor to delays, or perhaps the solution truly does lie in extreme Basic Economy fares, to try to induce more pax to either not bring a large carryon or to pay for the privilege.

          I don’t see a way that it could easily be implemented in larger planes, except for perhaps a small portion of the pax, but I’ve always been a fan of the way many RJs and turboprops require pax to drop off and pick up their carryons at the bottom of the jetbridge.

  2. This is incredible news, not just for performance but for safety. It puts a brake on regionals under-bidding each other to win contracts because the majors will finally be held accountable for their performance. Hallelujah! Majors have long used low performing companies to put cost pressure on the better performing ones (it’s why GoJet exists and Mesa survived), this might swing the balance back towards competent performance.

    It’ll be interesting to see how this plays out long term. Majors have had differing philosophies about handling hub delay triage. Some would be quick to cancel or delay regional flights, keeping mainline statistics high and moving the most people, but then stranding them at hubs with their connections canceled. Others delayed the whole hub operation, canceling entire banks but ensuring that most journeys could be completed. This policy might favor the latter, though we’ll have to look at stats from individual weather events to see how it goes. A project for a future article, perhaps?

    1. How does this change safety at all? There’s no evidence that the most punctual airline is the safest (regional or not); safety is a different topic that doesn’t correspond to any of these metrics.

      1. It strongly incentivizes majors to award regional contracts to airlines with good performance, and it is exceedingly hard to have good performance if cost-cutting has eliminated all the redundancy from an operation. A really good safety culture is expensive. Good operational performance is expensive. Both get diminished when the lowest-cost bidder wins a contract.

        There’s a million ways big and little that cost cutting impacts safety and performance. The most obvious is labor. With bare bones staff and low pay you end up with perpetually novice labor who are overwhelmed and overworked, unfamiliar with and unable to handle irregular operations. With better staffing and higher pay you end up with more voices of experience, better mentors for the fewer novices, and the experience and bodies needed to handle irrops.

  3. “Branded codeshare partner airline,” this seems like quite a mouthful.

    As a consumer, if I have a choice of 10 flights from DCA to ORD over some time period, just tell me, upfront, for each flight, who the operator is and whom it’s flying for (or whether it’s flying for itself), along with the past on-time record for the flight I’m looking at. Beyond that, do I care? Should anyone?

  4. I don’t understand why international flights are excluded from on-time performance statistics. As a frequent international traveler, I’ve noticed certain flights and airlines are chronically late, causing me and hundreds of other customers to miss their connecting flights. Often these flights are the last ones of the night, thus causing an overnight stay.

    1. A lot of the data reported is actually submitted by the carrier. It’s why small carriers are often excluded from the data — the assumption is that the paperwork requirements are onerous and not worth it.

      Likewise, I assume the thinking is that it would be difficult to enforce reporting requirements on foreign-flagged carriers. The infrastructure requirements wouldn’t be trivial, and I’m sure part of the calculus is that the information isn’t worth the overhead.

      1. Infrastructure? We’re talking about a few bytes of data. We don’t need the entire foreign airline’s network, just the flights that touch US soil, which could easily be collected with existing 21st century tech. Hell, I bet ACARS could do it automatically.

      2. Imposing it on foreign flagged carriers for flights that touch the US would seem to be possible. Other FAA rules are forced upon them as well (and vice versa – see EC 261/2004 for foreign airlines flying from the EU).

  5. Putting together regional and mainline aircraft brings a different distortion: A delayed or canceled CRJ that impacts 50 people (max) has the same contribution as a delayed 777 with 300 people. Reporting delayed/canceled passengers rather than aircraft would be a meaningful statistic.

    1. There are 50 million (approximately) ways to slice and dice data. At some point, you have to pick something and go with it. With these kinds of statistics, the simpler they are and the easier they are to explain, the better. As a practical matter, if I’m flying from LAX to ORD, I want to know what the odds of my flight arriving on time are, regardless of the number of people on the plane. If UA is running a 777 and AA is running an A320, UA would show worse statistics than AA.

      But the whole issue with separating out/lumping in the regional carriers with mainline is the exact kind of thing you were getting at. The regional carriers often have worse performance statistics than the majors, but a lot of that has to do with how FAA traffic flow management decisions impact short haul flights vs long haul flights. Stats reporting and gaming the system get interesting, because with separate statistics, the mainline carriers can shift a disproportionate amount of delay to their regional carriers and then advertise a higher on time performance. This is especially true when the majors are using a sub carrier that historically has been too small to report on its own. By treating all flights equally, the majors will have to change a bit on how they distribute delay.

  6. Alaska’s regionals have an advantage in that they really don’t have many flights to/from the really clogged up east coast airports (and will have none by the end of this month) compared to AS having its planes get stuck at JFK, EWR, PHL, ORD, or one of the DC airports. All three legacies have a hub in at least one of those clogged up places and thus lots of issues with regional flights being delayed or canceled.

      1. SEA is maxed out in terms of gate space, but it’s really not that delay prone. I don’t think I’ve been delayed more than 20 minutes due to weather this year.

        1. With the Virgin America merger, Alaska now has a big presence at SFO, which is very delay prone due to the fog and the cadence of approaches.

        2. There are threads on FlyerTalk about AS proactively cancelling flights to/from SEA due to capacity constraints and construction at the airport.

    1. You are absolutely right… and a lot of people don’t realize how much lower AS’ on-time stats are in their big east coast stations…. EWR and DCA are both below 60%, JFK is at 65% YET SEA is at 85%.
      AS’ NE stations take the delays and there is no ability to reduce them when those airports are in delay mode.

      It is also noteworthy that the legacies all have much higher on-time percentages in the NE – EWR, DCA and JFK while DL has nearly identical on-time at SEA as AS; DL mainline is actually higher than AS but AS’ regional is slightly higher than DL’s regionals.

      Overall the report is a very helpful step forward in reporting – but it also destroys some of the myths esp. by B6 that legacy carriers get by with higher on-time because the legacies delay their regional flights.

    2. When my commute to work is on delay-prone highways, you know what I do? SCHEDULE MORE TIME for the known delay.

      1. It’s like a delay prone highway, you know you won’t be able to go 15 miles in 15 minutes, but it could take anywhere from 20 to 75 minutes. It’s difficult to predict and block the time correctly.

  7. Whew, DOT should’ve reported like this since day 1. Having them split by operating carrier means jack to most people because, from a pax’s perspective, I’m not “getting delayed by Republic Air” or “complaining about Envoy” – it’s DL CNNX, AA EGL, and UA EXP.

    The report definitely shines a light on the urban myth that DL’s operations are miraculously better than everyone else’s, rather than mostly being a weighted function of their airport presence. Let’s look at some busy airports in the 8 largest metros (using the “PSA” definition from wiki) where DL isn’t the one calling the shots (pages 9-12 of the PDF) –

    LGA: AA 66.7 DL 69.2 UA 67.6
    EWR: AA 54.5 DL 58.3 UA 62.1
    JFK: AA 64.4 DL 71.1

    The congested NE
    BOS: AA 66.8 DL 73.8 UA 67.5
    PHL: AA 69.6 DL 72.8 UA 68.6
    DCA: AA 73.1 DL 74.9 UA 71.5 (51% of DL’s DC presence is here)
    BWI: AA 62.1 DL 75.7 UA 71.1
    IAD: AA 65.9 DL 82.1 UA 74.5 (DL’s presence at IAD is roughly the same as at MDW)

    The rest of Top 8 metros
    LAX: AA 82.2 DL 85.4 UA 81.7
    SFO: AA 70.8 DL 84.9 UA 78.9
    ORD: AA 76.8 DL 74.3 UA 77.6 (DL also has a token presence at MDW clocking in 81.6)
    DFW: AA 74.6 DL 74.7 UA 71.4

    Adjusting for volume, the only true shining spot for DL in this data would be SFO at 84.9%. Everyone is equally great at LAX, equally stinky at LGA+EWR+PHL, and equally mediocre at DCA+ORD+DFW.

    But then again, there are also trolls who *claim* they can noticeably tell the difference between 81% and 83% even though they’re only flying like 3-4 flights a month. Go figure.

    1. Henry,
      all your data confirms is that DL does better than its peers in the most delay-prone airports. Top metros have nothing – absolutely nothing – to do with on-time performance. Airports by name do.

      And the reason why DL’s system is so much better than the rest of its peers is because their hubs – where the vast majority of flights operate – do so much better than their peers.

      How about you include the largest hubs for each airline? You will see that DL’s on-time percentages, side by side with other airlines’, are 10 points better than its peers’. When you outperform your peers in each carrier’s largest hubs side by side and then outperform in the most congested airports even by a couple percentage points, customers notice and so does the DOT.

      1. haahhahahh top metros don’t matter ?? You think all those business travelers actually are heading to Syracuse instead of Dallas ? omg i can’t stop laughing hahahaha …. who do you think would be more pissed? those 3-day close in business folks getting the short end of the stick on things like DL LGA-DFW cuz the regional E75 got down-prioritized when NYC airspace is jammed up, or a bunch of family-of-4-with-screaming-infants en route to annual Orlando pilgrimage?

        but it’s okay – i get you. when presented with actual facts (all actually quoted from CF’s linked PDF), the standard response is claiming how none of that matters, and only those cherry-picked metrics that are thoroughly useless to others matter. you’re probably the same type who thinks SEA-KIX is a guaranteed gold mine, despite multiple DL/NW failures, and hardly anyone else figured out how to crack the nut hahhah


        And that’s before highlighting the obvious – many of DL’s so-called BFFs only seem to benefit DL unidirectionally instead of being a true two-way street:

        • Aeromexico (AM) – Good chunk of new route chops, including high-profile links like MEX-BOS and MEX-IAD.

        • WestJet (WS) – Now having a mid-life crisis and can’t decide what they wanna do strategically. Trying to be ULCC and super premium at the same time. There’s a reason the 2 previous large attempts at YYC didn’t work out – Canadian Air and Canada 3000. Its only function seems to be keeping AirCanada relatively-more-honest instead of charting truly new waters.

        • GOL (G3) – The stagnation of GOL (and that weird attempt of routing via the Caribbean to reach the US) has given Avianca Brasil a major window of opportunity to backfill connectivity when TAM defected. It’s getting squeezed on the low-end by Azul and on the high-end by AvB. And now DL/GOL/Skyteam-combined is the only one who has no nonstop presence on the all-important MIA-GRU.

        • Virgin Australia (VA) – The perpetual Oceania basket case, always placing 3rd in the 3-way race with QF and NZ. EY’s 21% share is a lameduck because they’re figuring out survival themselves. ANZ dumped their shares to HNA, then HNA is a mess and now needs to find someone to bail them out. QF+NZ’s newfound friendship is gonna be eating VA’s lunch across the Tasman – that should be crystal clear to anyone without rose tinted glasses.

        And there’s that lack of connectivity expansion. Between NA and AUS/NZ, exc Hawaii, DL+VA is still LAX-SYD/MEL/BNE, where they have been for years (on and off). Meanwhile, Star flies nonstop to 4 airports in A/NZ (SYD MEL BNE AKL) and 5 airports in NA (LAX SFO YVR IAH ORD).

        • AirFranceKLM (AF)(KL) – The respectable KL is perpetually dragged down by AF and god-knows-what-Joon is. They’ve been #3 in a 3-way European legacy race for the longest time. And the warring factions between CDG and AMS aren’t helping things when both IAG and LH Group, for all practical purposes, have truly unified strategic leaderships out of LHR and FRA.

        • Jet Airways (9W) – Don’t wanna sound like fear-mongering, but Jet’s situation is heading into the same quagmire where Kingfisher once was.

        And this is a core component of DL’s India strategy after 9W moved their scissor hub to AMS, with the only fallback being DL’s own BOM route which is still up in the air – nothing concrete since May 2018 non-announcement. If it becomes true, welcome to the 1-nonstop-route-to-India-club. Meanwhile, between NA and India, combining all seasonal service, Star flies nonstop DEL-JFK/EWR/IAD/ORD/SFO/YYZ/YVR and BOM-EWR/YYZ.

        If Alitalia and AirBerlin fiascos told us anything, it’s that Etihad is more than willing to walk away, now that James Hogan is no longer CEO with his “EY Alliance” fantasy dream.

  8. The Alaska Airlines description is a bit inaccurate: “now includes regional operations by Horizon, PenAir, and SkyWest”

    Horizon makes sense, since currently every Horizon flight is for Alaska. (Though they used to fly for F9.)
    From what I can see, PenAir operates one route which rolls up to Alaska’s reporting (ANC-DUT). The other flights are sold under a codesharing arrangement. (Alaska’s flights sold under AA’s code don’t count against AS, or vice versa… right?)
    SkyWest flies for more majors than just Alaska…

    Some of this is an outgrowth of Alaska’s decision to not have a regional nameplate, but to have the flights flown under AlaskaHorizon and AlaskaSkyWest. (Which looks a whole lot better on the side of the plane than in text.)

    Perhaps it should’ve been phrased, “Alaska now includes AlaskaHorizon flights operated by Horizon Airlines, AlaskaSkywest flights operated by Skywest airlines, and a single PenAir route between Anchorage and Dutch Harbor.”

    1. Nick,
      see page 5 of the Sept. DOT Air Travel Consumer Report. It includes the list of regional carriers that roll up as part of the branded codeshare partner for each mainline carrier.
      SkyWest is listed for all four mainline carriers other than HA. There are other regional carriers that fly for multiple mainline carriers, and the reporting shows that.

      The DOT includes only those flights that operate for each mainline carrier and that is what consumers care about. There are differences for the same regional carrier operating for two different mainline carriers at the same airport – because the mainline carrier in most cases schedules the flight and controls enough of the reimbursement to impact the operation.

      The DOT and consumers don’t really care what the specific name is that mainline carriers assign to their regional carrier subcontracted operations.

      1. Tim, my point wasn’t about the branded name assigned to the regional operator; it was that the phrasing made it seem like all SkyWest flights (including those sold as American Eagle, Delta Connection, and United Express) rolled up to the Alaska numbers. Likewise, the phrasing made it seem like all PenAir flights rolled up to Alaska, when only one route does.
