Delta CEO Ed Bastian’s wide-ranging keynote last week rolled out all sorts of technological goodies, but only one really caught my eye. PARALLEL REALITY is a technology that can project unique, personalized messages to a number of different people from the exact same screen. This sounds like the future, and I couldn’t understand how it could possibly work. So, I hopped on a call with Albert Ng, CEO of Misapplied Sciences, to learn more. It’s just as cool as I’d hoped.
Here’s how Delta’s promo video makes it look:
Yeah, that is pretty slick. But how on earth is that happening? Well, it’s all about the pixels. A normal screen is made up of tiny pixels, and each pixel can show only one color of light at a time. When you look at a screen, it broadcasts that light out, and any time the image needs to change, each pixel does its job and switches to a different color. It’s basically a high-tech version of those old-timey card stunts at a football game.
I know, I know, two animated GIFs in a single post is a lot, but it helps to illustrate what’s going on.
In PARALLEL REALITY, the difference is that each of those pixels can simultaneously put out a whole bunch of different colors, and it can direct each of those colors to show in only a very specific area. It’s so precise that Albert told me they can direct one message to one of your eyes and a different one to the other.
That’s why when you look at the messages in the first GIF above, you see different colors for each one. The display is shooting out those different messages using different colors with pinpoint accuracy.
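To make that idea concrete, here’s a toy model of a single multi-view pixel. This is purely my own sketch to illustrate the concept; the names, geometry, and beam width are invented, and the real Misapplied Sciences hardware is proprietary. The idea is just that a pixel assigns each registered viewer a color and emits that color only along the narrow angle pointing at that viewer:

```python
import math

# Hypothetical toy model of one multi-view pixel (2-D, top-down view).
# Each viewer gets a personal color; the pixel emits that color only
# along a narrow angular "beamlet" aimed at that viewer's position.

PIXEL_POS = (0.0, 0.0)  # pixel location in meters (assumed coordinates)

def angle_to(viewer_pos):
    """Angle in degrees from the pixel to a viewer's position."""
    dx = viewer_pos[0] - PIXEL_POS[0]
    dy = viewer_pos[1] - PIXEL_POS[1]
    return math.degrees(math.atan2(dy, dx))

def emitted_color(viewers, look_angle, beam_width=1.0):
    """Color seen when looking at the pixel from `look_angle`: the color
    of whichever viewer's beamlet covers that angle, else black."""
    for pos, color in viewers:
        if abs(angle_to(pos) - look_angle) <= beam_width / 2:
            return color
    return "black"  # no personalized light sent in this direction

viewers = [((3.0, 4.0), "red"), ((-2.0, 5.0), "blue")]
print(emitted_color(viewers, angle_to((3.0, 4.0))))   # viewer 1 sees red
print(emitted_color(viewers, angle_to((-2.0, 5.0))))  # viewer 2 sees blue
print(emitted_color(viewers, 0.0))                    # a bystander sees black
```

Two people standing a few feet apart look at the same pixel and see two different colors, which is exactly why the messages in the GIF each appear in their own color.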
The first test of this will be in Detroit, where a screen can show up to 100 unique messages at a time. That limit is just how this installation was designed; a large enough display with that many people around it could show thousands.
Your next question is probably the same as mine was. How does the system know where to aim those light beams?
The initial test requires that you scan your boarding pass to opt in. When you do that, there’s a camera above that scans your outline and starts to follow you in the immediate area. It’s that camera that then tells the system where to direct the beams.
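The opt-in-and-track flow described above can be sketched in a few lines. To be clear, this is my own guess at the structure, not Misapplied Sciences’ actual system; the class, method names, and the 100-message cap from the Detroit test are the only details taken from what they described:

```python
# Hypothetical sketch of the PARALLEL REALITY opt-in and tracking flow.
# All names and structure are assumptions for illustration only.

class ParallelRealityDisplay:
    def __init__(self, max_messages=100):  # Detroit test: up to 100 at once
        self.max_messages = max_messages
        self.tracked = {}  # traveler id -> current (x, y) position

    def opt_in(self, boarding_pass_id, scanner_position):
        """Scanning a boarding pass opts the traveler in; the overhead
        camera locks onto their outline at the scanner's location."""
        if len(self.tracked) >= self.max_messages:
            raise RuntimeError("display is at capacity")
        self.tracked[boarding_pass_id] = scanner_position

    def update_position(self, boarding_pass_id, new_position):
        """The camera reports where the traveler has moved."""
        self.tracked[boarding_pass_id] = new_position

    def render_targets(self, messages):
        """Pair each tracked traveler's current position with their
        personal message so the display can aim the light beams."""
        return [(pos, messages[pid])
                for pid, pos in self.tracked.items() if pid in messages]

display = ParallelRealityDisplay()
display.opt_in("DL123", (0.0, 2.0))          # scan at the kiosk
display.update_position("DL123", (1.5, 3.0))  # camera follows the traveler
print(display.render_targets({"DL123": "Gate A32: turn left"}))
```

The key point is the division of labor: the boarding-pass scan establishes consent and identity, the camera supplies continuously updated positions, and the display only ever needs a position-to-message mapping to aim its beams.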
This is just the test, but they have many ideas about how this could work better in the future. One that I assumed would be in the works is using the Fly Delta app. If you have the app, you could in theory flip on a permission, and then the cameras could track you once they detect your phone in the area. That is one option, but there are others, and it may even be possible to do this without cameras at all. They’re still exploring all options.
The appeal of this is that it can direct personalized messages in any language. In a disorienting global hub, this is hugely helpful. But it doesn’t just have to replace flight information displays. They could embed screens in the ground to provide arrows taking you straight to your gate. They could put a screen anywhere, really. Your own phone may be able to do some of this, but still, having wayfinding built into the airport environment means you don’t need to rely on your own tech.
I asked how much this all costs, but of course, they won’t say. They do say, however, that they see a path to a future where this costs about as much as a regular LED screen. If they can get that to happen, this technology has real potential.