From a surfboard designer that comes to life right before your eyes, to a one-of-a-kind, dual-screen Kinect beach soccer game, RZRShop was a huge hit at this year’s National Retail Federation Annual Convention and Expo. Check out some of the highlights.
The assignment was, well, not so simple. Design and build a fully immersive, best-in-class, innovative future travel experience for passengers bustling through one of the busiest and fastest-growing airports in the country — San Francisco International. The Flight Deck will serve as the gateway experience to the newly renovated Terminal 3, Boarding Area E (or Terminal 3E, as it’s known).
When the Emerging Experiences team was contracted by Hensel-Phelps, one of the largest general contractors and construction managers in the United States, to create this experience, we could not have been more excited. For a multidisciplinary team like ours, made up of experience designers, creatives, techies, and passionate people who love to collaborate and deliver unexpected, delightful experiences, this project screamed open canvas. Our team is built for these challenging, yet experimental and fun engagements. And the fruits of the tight collaboration between tech and creative shine bright. Literally. Through a 20,000-lumen projector onto oversized panels of glass displaying real-time flight data.
So just what is Flight Deck? The SFO Flight Deck is composed of three distinct but connected digital experiences — an interactive, real-time, large-scale projection (you can’t miss it), six multi-touch kiosks rich with beautiful content, and a mobile takeaway component for those on the go. The entire experience resides in Terminal 3, but the projected visualization serves as a beacon calling all SFO guests to contribute to the global SFO travel story. Content in all three experiences encompasses the entire airport and extends its reach into the city of San Francisco and global destinations.
In order to serve the wide gamut of travel guests, the team operated under the following experience pillars at all times. If an idea did not fit under two or more of these pillars, it hit the floor.
1. Quick and meaningful interactions should maintain quality and depth.
2. Playful collaborations should be maintained throughout the experience.
3. Your personal narrative is part of the SFO story.
Why? As mentioned earlier, the assignment was not so simple. Today’s travelers are in a rush. How and why do we make busy passengers take the time to stop and engage? Inspired by the golden age of travel, when flying was considered magical and an experience in itself, we wanted to bring back the joy of the travel experience.
SFO is just the airport to foster that joy. It features an on-site museum with collections that could rival those of the best metropolitan museums we’ve ever visited, and we were allowed to dig into the archives of hundreds of photos from the museum’s aviation collection. We bring a lot of these golden moments to guests in Flight Deck, hoping they take the time to smile and remember the transformative and magical experiences that travel can bring.
Terminal 3E is scheduled to open for passengers on January 28, but folks were able to get a sneak peek at the T3E Community Day on Saturday, January 25. The Flight Deck was a huge hit with folks of all ages, some of whom said they plan on coming a bit early on their next flight out of SFO just so they can explore it more. Check out our updates from the event on Twitter and Facebook, and be sure to vote for your favorite feature in Terminal 3 here (guess what we voted for?).
For this year’s National Retail Federation Annual Convention and Expo, we’ve built a highly interactive, surf-and-sand-inspired shopping experience. Built inside a uniquely portable container, RZRShop creates big moments inside a smaller, more nimble retail footprint. Digital meets physical with a Perceptive Pixel-powered surfboard designer that comes to life right before your eyes. And battle lines are drawn with a one-of-a-kind, dual-screen Kinect beach soccer game and smart vending experience. Powered by the Razorfish 5D Retail Platform, RZRShop seamlessly connects a variety of devices to attract customers, drive product engagement and arm store associates with more contextualized tools. The end result is a fun and personal experience—the way shopping should be.
Visit emergingexperiences.com/rzrshop for more information, and visit us in Microsoft booth #2703 to experience RZRShop for yourself.
Stay tuned to find out what it is.
I’ve had a little less than a week to play with the new Kinect for Windows v2 so far, thanks to the developer preview program and the Kinect MVP program. It is everything Kinect developers and designers have been hoping for – full HD through the color camera, a much improved depth camera, and USB 3.0 data throughput.
Additionally, much of the processing is now occurring on the GPU rather than the onboard chip or your computer’s CPU. While amazing things were possible with the first Kinect for Windows sensor, most developers found themselves pushing the performance envelope at times and wishing they could get just a little more resolution or just a little more data speed. Now they will have both.
At this point the programming model has changed a bit between Kinect for Windows v1 and Kinect for Windows v2. While knowing the original SDK will definitely give you a leg up, a bit of work will still need to be done to port Kinect v1 apps to the new Kinect v2 SDK when it is eventually released.
What’s different between the new Kinect for Xbox One and the Kinect for Windows v2? It turns out not a lot. The Kinect for Xbox One has a special USB 3.0 connector that carries both power and data from the Xbox One. Because it is a non-standard connector, it can’t be plugged straight into a PC (unlike the original Kinect, which had a standard USB 2.0 plug).
Making the new Kinect work with a PC, then, requires a special breakout board. This board serves as an adapter with three ports – one for the Kinect, one for a power source, and one for a standard USB 3.0 cable.
We can probably expect the firmware on the two versions of the new Kinect sensor to diverge over time, as occurred with the original Kinect.
Skeleton detection is greatly improved with the new Kinect. Not only are more joints now detected, but many of the jitters developers became used to working around are now gone. The new SDK recognizes up to six skeletons rather than just two. Finally, because of the improved time-of-flight depth camera, which replaces the PrimeSense technology used in the previous hardware, skeleton detection is much more accurate and includes excellent hand detection. Grip recognition as well as lasso recognition (two fingers used to draw) are now available out of the box – even in this early alpha version of the SDK.
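For readers unfamiliar with how a time-of-flight camera differs from the original Kinect’s structured-light approach, the core idea is simple: the sensor times how long emitted light takes to bounce off a surface and return, and converts that into distance. This is only an illustrative sketch of the principle – the actual sensor measures phase shifts of modulated light per pixel, and none of this is Kinect SDK code:

```python
# Illustrative sketch of the time-of-flight principle (not Kinect SDK code).
# A ToF camera times how long emitted light takes to bounce back;
# distance = (speed of light * round-trip time) / 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458


def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance in meters for a measured round-trip time of light."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2


# Light returning after ~30 nanoseconds implies an object roughly
# 4.5 meters away (about the far edge of Kinect's tracking range):
print(round(distance_from_round_trip(30e-9), 2))
```

The nanosecond scale of those round trips is exactly why per-pixel depth at this accuracy is hard, and why so much of the processing has moved to the GPU.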
I won’t hesitate to say – even this early in the game – that the new hardware is amazing and is leaps and bounds better than the original sensor. The big question, though, is whether it will take off the way the original hardware did.
If you recall, when Microsoft released the first Kinect sensor they didn’t have immediate plans to use it for anything other than a game controller – no SDK, no motor controller, not a single luxury. Instead, creative developers, artists, researchers and hackers figured out ways to read the raw USB data and started manipulating it to create amazingly original applications that took advantage of the depth sensor – and they posted them to the Internet.
Will this happen the second time around? Microsoft is endeavoring to do better this time by getting an SDK out much earlier. As I mentioned above, the alpha SDK for Kinect v2 is already available to people in the developer preview program. The trick will be in attracting the types of creative people that were drawn to the Kinect two years ago – the kind of creative technologists Microsoft has always had trouble attracting toward other products like Windows Phone and Windows tablets.
My colleagues and I at Razorfish Emerging Experiences are currently working on combining the new Kinect with other technologies such as Oculus Rift, Google Glass, Unity 3D, Cinder, Leap Motion and 4K video. Like a modern-day scrying device (or simply a mad scientist’s experiment), we’re hoping that by mixing all these gadgets together we’ll get a glimpse of what the future looks like and, perhaps, even help to create that future.
This week at the Cannes Lions International Festival of Creativity, we debuted our latest creation—KinectiChord: a multiuser, multisensory experience that blends physical and digital in an unexpected and delightful way. On display in the Microsoft Advertising Beach Club, this experience allows multiple users to see, hear and feel technology like never before.
Get your hands on the 5D experience by embarking on a unique shopping journey that utilizes a variety of platforms and technologies, including a first-of-its-kind, seamlessly synchronized transparent interactive display wall. It’s located in the Microsoft booth (1005) on Level 3. And to see more of 5D in action, head on over to emergingexperiences.com/5D.
Our lab is buzzing with activity as the team prepares for the National Retail Federation’s 102nd Annual Convention & EXPO in New York. On display will be the latest iteration of Razorfish 5D — the world’s first cross-device, cross-OS, connected retail platform. Since debuting at last year’s NRF convention, 5D has already launched in several markets and was used to create Audi City London, a one-of-a-kind immersive virtual showroom. This year we’re showing how our platform can power customized, personalized and seamlessly synchronized shopping experiences. We threw in some augmented reality and a bunch of transparent displays as well.
Our team will be demonstrating the 5D experience in booth #1005 on Level 3 of the Jacob K. Javits Convention Center. If you can’t make the show, be sure to follow us on Twitter to get the latest updates.
Most Contagious, that is.
We’re excited to announce that Audi City London has claimed Contagious Magazine’s Most Contagious Retail Award at a ceremony today in London. This experience was a year-long collaborative effort between Audi and a wide range of partners, and was launched near Piccadilly Circus just ahead of the summer Olympics. It is delivered by one of the most technologically advanced retail environments ever created and features a variety of multi-touch displays for configuring your Audi from millions of possible combinations. Once you’ve created your personalized Audi at this groundbreaking dealership, you can toss it onto one of the floor-to-ceiling digital “powerwalls” to visualize and explore your configuration at a true 1:1 scale. Audi City London is a true dealership of the future and an effort we were proud to be part of.
Photo: Gaurav Singh
When we’re playing in our Lab, we’re always looking for creative ways to push the limits of technology. Some of our projects are just for fun, and others, like London’s Audi City, completely reinvent the way people shop. We were even thinking about digital wallets before they were cool. So when we set out to create the Razorfish 5D platform, our goal was to design a powerful and highly immersive way for brands to connect with consumers—before, during and after the shopping experience. In our latest video, we show how our 5D platform seamlessly connects a variety of digital devices to better attract consumers into the store, drive product engagement and arm store associates with more contextualized digital tools. The end result is a fun and personal experience, the way shopping should be.