Mar 25, 2014 by in Kinect, Microsoft Kinect, News, Projection, Technology, Travel

As passengers at San Francisco International Airport (SFO) walk into the newly renovated Terminal 3E, they are greeted by the Flight Deck: a 16-foot-wide glowing data visualization that hovers above six multi-touch screens. The Flight Deck highlights SFO’s global reach, services, amenities, and museum exhibits. We want to share the backstory of how its interactive data visualization component was developed.


The Projection

Our goal was to create a visual beacon that stood out from the airport’s other digital displays and could be seen from both sides. We envisioned a uniquely shaped, bezel-free floating image in constant motion, capturing the attention of both arriving and departing passengers. To meet this challenge, we experimented with several concepts but decided to attempt a 24/7 rear projection in a bright airport terminal. Rear projections in backlit environments are difficult because projectors can’t project black. Initial tests (images below) showed that a projection in the space would be possible given the proper projector, the proper dark grey rear-projection film, and the right window treatment. We worked with the architects to specify a frit for the windows immediately behind the projection. We used a 20,000-lumen projector and the brightest rear-projection film we could find, and optimized the placement of the glass and projector for the visitor approach. Visitors approach the Flight Deck parallel to the projector, so they experience the brightest portion of the projection while walking toward it.


The Experience

We wanted to create an ever-changing experience in which a Kinect camera uses people’s presence to alter the background, and real-time flight-path data highlights SFO’s global reach from moment to moment. Rewards are also synchronized with the touch screens, so visitors can take over the entire projection if they collect hidden rewards within the touch-screen experience. The software was developed in Cinder and uses OSC for network communication. An early decision was to heavily thread our architecture so we could maintain a minimum of 60 fps for low interaction latency and smooth animation. Unfortunately, several components in Cinder, such as the Console and the Kinect class, were not written for heavy multithreading, so we added locks and properly threaded components as needed. Early interaction prototypes tested the limits of our frame rate and latency to determine areas for optimization.
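The production software was written in C++ with Cinder, but the core threading pattern is easy to illustrate. A common way to keep a render loop at 60 fps is to push slow work (network I/O, data parsing) onto worker threads that publish their results into a lock-protected holder, which the render thread reads without ever blocking on the producer. A minimal Python sketch of that pattern (all names are ours, not the production code):

```python
import threading
import time

class LatestValue:
    """Thread-safe holder for the most recent result from a worker.

    The render loop never blocks on the worker: it just reads whatever
    value was published last, so a slow producer can't stall the frame rate.
    """
    def __init__(self, initial=None):
        self._lock = threading.Lock()
        self._value = initial

    def publish(self, value):
        with self._lock:
            self._value = value

    def read(self):
        with self._lock:
            return self._value

def slow_worker(out, stop):
    """Simulates an expensive task (e.g. fetching flight data) off the render thread."""
    n = 0
    while not stop.is_set():
        time.sleep(0.05)   # pretend this takes a while
        n += 1
        out.publish(n)

latest = LatestValue(initial=0)
stop = threading.Event()
worker = threading.Thread(target=slow_worker, args=(latest, stop), daemon=True)
worker.start()

# "Render loop": runs at its own cadence, never waits on the worker.
for frame in range(5):
    data = latest.read()   # non-blocking read of the freshest result
    time.sleep(0.016)      # ~60 fps frame budget

stop.set()
worker.join()
```

The key design choice is that the consumer reads the latest value rather than draining a queue, so the frame time stays bounded no matter how bursty the producer is.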


Kinect Interaction

As curious visitors approach the Flight Deck, a Kinect camera captures their form and creates triangles and shapes that play with it using an attraction-based physics system, as seen in the image above. As visitors move left to right, the triangles leave trails that orbit the space, creating constantly changing visuals. Visitors can also use their arms to attract and control the shapes around them. A rear-mounted Kinect camera captures approaching visitors, and computer vision algorithms convert the depth image to blobs. This approach tracks any number of people, versus the two supported by skeleton tracking. In addition, we use depth data from a range that is normally unreliable for skeleton tracking: we modified the Kinect API for Cinder to use an extended depth range, capturing people up to 16 feet away across a 15-foot-wide area.
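The production pipeline does this in C++ against the live Kinect depth stream, but the core idea, thresholding a depth image into foreground pixels and grouping connected pixels into blobs with a flood fill, can be sketched in a few lines of Python (all names here are hypothetical, not the production code):

```python
from collections import deque

def depth_to_blobs(depth, near, far, min_size=2):
    """Threshold a depth image into foreground and group pixels into blobs.

    depth: 2D list of depth readings (e.g. millimeters), 0 meaning no reading.
    Pixels whose depth falls inside [near, far] are foreground; 4-connected
    foreground regions become blobs. Returns a list of blobs, each a list of
    (row, col) pixels, ignoring blobs smaller than min_size (sensor noise).
    """
    rows, cols = len(depth), len(depth[0])
    fg = [[near <= depth[r][c] <= far for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if fg[r][c] and not seen[r][c]:
                # Flood fill (BFS) to collect one connected component.
                blob, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and fg[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(blob) >= min_size:
                    blobs.append(blob)
    return blobs
```

Because blobs are just connected regions of valid depth, this works in the extended depth range where skeleton tracking gives up, which is what lets the installation track an arbitrary number of people.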

Data Visualization

Live flight data is visualized on screen, along with SFO’s inbound and outbound activity over the last 24 hours. Positional data from flights arrives in a polar coordinate system and is converted to a Cartesian coordinate system on the GPU. Our data visualization uses this positional data so that flight paths can be drawn with their real bends and waviness rather than as perfectly smoothed arcs. The challenge with positional data is that live readings are not available at every point along a flight, so some interpolation had to be mixed with the actual data. We used SLERP to interpolate positions and combined real data with interpolated data using a gated smoothing algorithm. This allowed us to properly visualize flights, for example flights to Asia, where positions are occasionally lost and then picked up again at various locations.
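The production conversion runs on the GPU, but the underlying math is compact. A latitude/longitude pair maps to a unit vector on the sphere, and SLERP (spherical linear interpolation) then fills gaps between two sampled positions along the great circle connecting them, which is why it suits globe positions better than straight linear interpolation. A Python sketch of both steps (function names are ours, for illustration only):

```python
import math

def to_cartesian(lat_deg, lon_deg):
    """Convert a (latitude, longitude) pair in degrees to a unit 3D vector."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def slerp(p, q, t):
    """Spherical linear interpolation between unit vectors p and q, 0 <= t <= 1.

    Follows the great circle through p and q. Antipodal points (where the
    great circle is ambiguous) are not handled in this sketch.
    """
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q))))
    theta = math.acos(dot)
    if theta < 1e-9:               # points coincide; nothing to interpolate
        return p
    s = math.sin(theta)
    a = math.sin((1 - t) * theta) / s
    b = math.sin(t * theta) / s
    return tuple(a * x + b * y for x, y in zip(p, q))
```

For example, interpolating halfway between two points on the equator at longitudes 0° and 90° lands at longitude 45°, still on the unit sphere, whereas plain linear interpolation of the vectors would cut through the globe.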

Looking Forward

The interactive data visualization represents only one aspect of the Flight Deck, which also consists of a multi-touch experience and a mobile experience, and was a collaboration of creative and technical talent. Since the launch, we’ve heard a lot of feedback comparing the Flight Deck to people’s favorite sci-fi shows and movies. We hope we can continue to create forward-thinking experiences for the public.


Can’t Make it to SFO? Experience Flight Deck Right Here

Mar 25, 2014 by in 5D, Experience Design, Kinect, Multi-touch, Portfolio, Projection, Retail, Technology, Touchscreen, Travel


While Flight Deck is best experienced in person, you can get a quick fly-through of the experience right now. What is Flight Deck? Read all about it here.


RZRShop Highlights from NRF

Jan 27, 2014 by in 5D, Augmented Reality, Experience Design, Kinect, Mobile, Multi-touch, News, Portfolio, Projection, Retail, Technology, Touchscreen

From a surfboard designer that comes to life right before your eyes, to a one-of-a-kind, dual screen Kinect beach soccer game, RZRShop was a huge hit at this year’s National Retail Federation Annual Convention and Expo. Check out some of the highlights.


Experience The Future of Travel with San Francisco International Airport’s Flight Deck

Jan 24, 2014 by in Experience Design, Kinect, Multi-touch, News, Portfolio, Projection, Retail, Technology, Touchscreen, Travel

The assignment was, well, not so simple: design and build a fully immersive, best-in-class, innovative future travel experience for passengers bustling through one of the busiest and fastest-growing airports in the country, San Francisco International. The Flight Deck will serve as the gateway experience to the newly renovated Terminal 3, Boarding Area E (or Terminal 3E, as it’s known).

When the Emerging Experiences team was contracted by Hensel-Phelps, one of the largest general contractors and construction managers in the United States, to create this experience, we could not have been more excited. For a multi-disciplinary team like ours, made up of experience designers, creatives, techies, and passionate people who love to collaborate and deliver unexpected, delightful experiences, this project screamed open canvas. Our team is built for these challenging yet experimental and fun engagements. And the fruits of the tight collaboration between tech and creative shine bright. Literally: through a 20,000-lumen projector onto oversized panels of glass displaying real-time flight data.


So just what is Flight Deck? The SFO Flight Deck is composed of three distinct but connected digital experiences: an interactive, real-time, large-scale projection (you can’t miss it), six multi-touch kiosks rich with beautiful content, and a mobile takeaway component for those on the go. The entire experience resides in Terminal 3, but the projected visualization serves as a beacon calling all SFO guests to contribute to the global SFO travel story. Content in all three experiences encompasses the entire airport and extends its reach into the city of San Francisco and global destinations.

In order to serve the wide gamut of travel guests, the team operated under the following experience pillars at all times. If an idea did not fit under two or more of these pillars, it hit the floor.

1. Quick and meaningful interactions should maintain quality and depth.
2. Playful collaborations should be maintained throughout the experience.
3. Your personal narrative is part of the SFO story.

Why? As mentioned earlier, the assignment was not so simple. Today’s travelers are in a rush. How and why do we make busy passengers take the time to stop and engage? Inspired by the golden age of travel, when flying was considered magical and an experience in its own right, we wanted to bring back the joy of the travel experience.

SFO is just the airport to foster that joy. It features an on-site museum with collections that could rival those of the best metropolitan museums we’ve ever visited, and we were allowed to dig into archives of hundreds of photos from the museum’s aviation collection. We bring many of these golden moments to guests in Flight Deck, hoping they take the time to smile and remember the transformative and magical experiences that travel can bring.

Grand Opening

Terminal 3E is scheduled to open for passengers on January 28, but folks were able to get a sneak peek at the T3E Community Day on Saturday, January 25th. The Flight Deck was a huge hit with folks of all ages, some of whom said they plan on arriving a bit early for their next flight out of SFO just so they can explore it more. Check out our updates from the event on Twitter and Facebook, and be sure to vote for your favorite feature in Terminal 3 here (guess what we voted for?).





The Biggest Thing in Retail is Actually Pretty Small: Welcome to #RZRShop

Jan 13, 2014 by in 5D, Experience Design, Kinect, Microsoft Surface, Mobile, Multi-touch, Near Field, News, Portfolio, Retail, Technology, Touchscreen


For this year’s National Retail Federation Annual Convention and Expo, we’ve built a highly interactive, surf-and-sand-inspired shopping experience. Built inside a uniquely portable container, RZRShop creates big moments inside a smaller, more nimble retail footprint. Digital meets physical with a Perceptive Pixel-powered surfboard designer that comes to life right before your eyes. And battle lines are drawn with a one-of-a-kind, dual screen Kinect beach soccer game and smart vending experience. Powered by the Razorfish 5D Retail Platform, RZRShop seamlessly connects a variety of devices to attract customers, drive product engagement and arm store associates with more contextualized tools. The end result is a fun and personal experience—the way shopping should be.

Visit for more information, and visit us in Microsoft booth #2703 to experience RZRShop for yourself.

We’re Building the Next Big Thing

Jan 07, 2014 by in 5D, Experience Design, Kinect, Microsoft Kinect, Multi-touch, Retail, Technology, Touchscreen

Stay tuned to find out what it is.

Kinect for Windows v2 First Look

Dec 04, 2013 by in Kinect, Lab, Technology

I’ve had a little less than a week to play with the new Kinect for Windows v2 so far, thanks to the developer preview program and the Kinect MVP program. So far it is everything Kinect developers and designers have been hoping for: full HD through the color camera, a much improved depth camera, and USB 3.0 data throughput.

Additionally, much of the processing is now occurring on the GPU rather than the onboard chip or your computer’s CPU. While amazing things were possible with the first Kinect for Windows sensor, most developers found themselves pushing the performance envelope at times and wishing they could get just a little more resolution or just a little more data speed.  Now they will have both.

At this point the programming model has changed a bit between Kinect for Windows v1 and Kinect for Windows v2. While knowing the original SDK will definitely give you a leg up, a bit of work will still need to be done to port Kinect v1 apps to the new Kinect v2 SDK when it is eventually released.

What’s different between the new Kinect for Xbox One and the Kinect for Windows v2? It turns out, not a lot. The Kinect for Xbox One has a special USB 3.0 adapter that draws both power and data from the Xbox One. Because it is a non-standard connector, it can’t be plugged straight into a PC (unlike the original Kinect, which had a standard USB 2.0 plug).

Making the new Kinect work with a PC therefore requires a special breakout board. This board serves as an adapter with three ports: one for the Kinect, one for a power source, and one for a standard USB 3.0 cable.

We can also probably expect the firmware on the two versions of the new sensor to diverge over time, as occurred with the original Kinect.

Skeleton detection is greatly improved with the new Kinect. Not only are more joints detected, but many of the jitters developers became used to working around are now gone. The new SDK recognizes up to six skeletons rather than just two. Finally, thanks to the improved time-of-flight depth camera, which replaces the PrimeSense technology used in the previous hardware, skeleton detection is much more accurate and includes excellent hand detection. Grip recognition as well as lasso recognition (two fingers used to draw) are available out of the box, even in this early alpha version of the SDK.

I won’t hesitate to say – even this early in the game – that the new hardware is amazing and is leaps and bounds better than the original sensor. The big question, though, is whether it will take off the way the original hardware did.

If you recall, when Microsoft released the first Kinect sensor they didn’t have immediate plans to use it for anything other than a game controller – no SDK, no motor controller, not a single luxury. Instead, creative developers, artists, researchers and hackers figured out ways to read the raw USB data and started manipulating it to create amazingly original applications that took advantage of the depth sensor – and they posted them to the Internet.

Will this happen the second time around?  Microsoft is endeavoring to do better this time by getting an SDK out much earlier. As I mentioned above, the alpha SDK for Kinect v2 is already available to people in the developer preview program. The trick will be in attracting the types of creative people that were drawn to the Kinect two years ago – the kind of creative technologists Microsoft has always had trouble attracting toward other products like Windows Phone and Windows tablets.

My colleagues and I at Razorfish Emerging Experiences are currently working on combining the new Kinect with other technologies such as Oculus Rift, Google Glass, Unity 3D, Cinder, Leap Motion and 4K video. Like a modern-day scrying device (or simply a mad scientist’s experiment), we’re hoping that by mixing all these gadgets together we’ll get a glimpse of what the future looks like and, perhaps, even help to create that future.

KinectiChord: Touch Technology Like Never Before

Jun 18, 2013 by in Experience Design, Kinect, Microsoft Kinect, Technology

This week at the Cannes Lions International Festival of Creativity, we debuted our latest creation—KinectiChord: a multiuser, multisensory experience that blends physical and digital in an unexpected and delightful way. On display in the Microsoft Advertising Beach Club, this experience allows multiple users to see, hear and feel technology like never before.

Bringing 5D to Life at NRF

Jan 14, 2013 by in 5D, Augmented Reality, Experience Design, Kinect, Microsoft Kinect, Microsoft Surface, Multi-touch, Near Field, News, Portfolio, Retail, Technology, Touchscreen

Get your hands on the 5D experience by embarking on a unique shopping journey that utilizes a variety of platforms and technologies, including a first-of-its-kind, seamlessly synchronized transparent interactive display wall. It’s located in the Microsoft booth (1005) on Level 3. And to see more of 5D in action, head on over to


We’re building the future of retail (and it’s kind of messy)

Jan 11, 2013 by in 5D, Augmented Reality, Experience Design, Kinect, Lab, Multi-touch, Near Field, News, Portfolio, Retail, Technology, Touchscreen

Our lab is buzzing with activity as the team prepares for the National Retail Federation’s 102nd Annual Convention & EXPO in New York. On display will be the latest iteration of Razorfish 5D, the world’s first cross-device, cross-OS connected retail platform. Since launching at last year’s NRF convention, 5D has been deployed in several markets and was used to create Audi City London, a one-of-a-kind immersive virtual showroom. This year we’re showing how our platform can power customized, personalized and seamlessly synchronized shopping experiences. We threw in some augmented reality and a bunch of transparent displays as well.

Our team will be demonstrating the 5D experience in booth #1005 on Level 3 of the Jacob K. Javits Convention Center. If you can’t make the show, be sure to follow us on Twitter to get the latest updates.