• Innovation in Retail: Audi City Launched
  Delivering a groundbreaking dealership experience in the heart of London.

• Personal Shopping Re-invented
  Our first-ever Retail Experience Platform, 5D, connects shoppers and their devices with retailers like never before.

• Beginning Kinect Programming Emerges
  The Razorfish Emerging Experiences group authors the industry's first book on developing applications for the Kinect for Windows SDK.

• Touch the Future of Travel
  A unique inspiration, easier access to what you want when you want it, and sharing travel ideas with friends... Razorfish and Delta are making it happen.

• KinectShop – The Next Generation of Shopping
  Kinect has many uses beyond games and entertainment. We created KinectShop to demonstrate the Kinect platform in a retail or at-home augmented reality shopping experience.

• The Science of the Perfect Fit
  By partnering with Bodymetrics and their 3D body-mapping technology, we're helping online shoppers find the perfect fit every time, right from their living room.

• Empowering Small Business Needs
  AT&T came to us asking for a way to showcase their business solutions in-store. We delivered an immersive, touch-driven solution that put the customer behind the wheel and encouraged exploration.


Future of Retail: Indoor Positioning with iBeacons

Mar 25, 2014, in iBeacon, Lab, Mobile, Projection, Retail, Technology

In the Razorfish Emerging Experiences labs, we've been prototyping next-generation retail experiences that can identify and track consumers with the goal of offering timely and relevant information. Using Bluetooth Low Energy (BLE) indoor location tracking, we can follow the movements and identities of smartphones throughout a retail environment. Our systems know how many people are in a space (image below), who they are, when they entered, where they went, where they lingered, and when they left. Based on a consumer's location, we can serve contextually relevant content to their smartphone or to displays in their proximity.

[Image: overhead view of the tracking system following two people in a space]
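
For a rough sense of how BLE positioning like this can work: a common technique (an assumption here, not necessarily what our production system does) estimates the distance to each beacon from received signal strength using a log-distance path-loss model, then combines several such distances to place the device on the floor plan. A minimal C++ sketch, with illustrative calibration constants:

```cpp
#include <cmath>
#include <iostream>

// Estimate distance (in meters) to a BLE beacon from its RSSI using a
// log-distance path-loss model. txPower is the calibrated RSSI at 1 m
// (a value iBeacons advertise); the path-loss exponent depends on the
// environment (~2.0 in free space, higher indoors). Both values here
// are illustrative assumptions.
double estimateDistance(double rssi, double txPower, double pathLossExponent = 2.5) {
    return std::pow(10.0, (txPower - rssi) / (10.0 * pathLossExponent));
}

int main() {
    // Hypothetical reading: beacon calibrated to -59 dBm at 1 m, measured at -75 dBm.
    std::cout << estimateDistance(-75.0, -59.0) << " m\n";   // prints roughly 4.4 m
}
```

In practice RSSI is noisy, so readings are usually smoothed (for example with a moving average or Kalman filter) before distances from three or more beacons are trilaterated into a position.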

Mobile Integration with Retail

BLE indoor position tracking can integrate mobile experiences into a physical retail space. With indoor location tracking, mobile wayfinding can guide consumers to items on their mobile wish lists or even to less-crowded checkout lines. Consumers could use their mobile devices to signal for help, and associates could use location tracking (image below) to find them. Displays near a consumer could recognize them, show messages relevant to their shopping history, and accept content pushed from their mobile devices.

[Image: a server tracking a smartphone's path through the store]
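
As a toy illustration of the "displays near a consumer" idea, the sketch below maps the beacon zone a shopper's phone was last seen in onto content a nearby display could show. The zone names and messages are entirely hypothetical, not from the actual platform:

```cpp
#include <iostream>
#include <map>
#include <string>

// Hypothetical sketch: choose content for a nearby display based on the
// beacon zone a shopper's phone is currently closest to.
std::string contentForZone(const std::string& zone) {
    static const std::map<std::string, std::string> campaigns = {
        {"entrance", "Welcome back! Today's featured arrivals."},
        {"footwear", "A wish-list item of yours is in aisle 4."},
        {"checkout", "Lane 3 currently has the shortest wait."},
    };
    auto it = campaigns.find(zone);
    return it != campaigns.end() ? it->second : "Store map and daily deals.";
}

int main() {
    std::cout << contentForZone("footwear") << "\n";
}
```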

Retail Metrics

BLE indoor tracking allows for in-store metrics tied to the identity of a customer. Data can be collected across a customer's entire shopping visit, instead of only at the checkout line. In the image above, a server tracks the movement of a smartphone from the moment it enters and can collect metrics on dwell times, traffic hot spots, and indoor location history. When retail locations connect this data to POS metrics, they could track how long each purchase decision took, as well as which other products a customer considered, by measuring linger times in front of products that were not purchased. Retail locations could also track the effectiveness of in-store messaging, end cap linger times, and incentives pushed to smartphones in-store.

BLE indoor tracking can also help associates better serve customers. By accessing real-time indoor tracking data, associates can optimize where they need to be to offer assistance and guide sales. When engaged with a customer, associates could also see which in-store items that customer was most likely interested in.
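
As a sketch of the kind of metric described above (a simplification, not the production pipeline), dwell time per zone can be accumulated from a device's timestamped position trail:

```cpp
#include <iostream>
#include <map>
#include <string>
#include <vector>

// One timestamped sample for a tracked device; zone is whichever named
// region of the floor plan the position fell in (names are illustrative).
struct Sample {
    double timestampSec;   // seconds since the device entered the store
    std::string zone;      // e.g. "end cap 7", "denim wall"
};

// Accumulate dwell time per zone from an ordered trail of samples.
std::map<std::string, double> dwellTimes(const std::vector<Sample>& trail) {
    std::map<std::string, double> dwell;
    for (std::size_t i = 1; i < trail.size(); ++i) {
        // Attribute each interval to the zone the device occupied at its start.
        dwell[trail[i - 1].zone] += trail[i].timestampSec - trail[i - 1].timestampSec;
    }
    return dwell;
}

int main() {
    std::vector<Sample> trail = {
        {0, "entrance"}, {30, "denim wall"}, {210, "end cap 7"}, {260, "checkout"}};
    for (const auto& entry : dwellTimes(trail))
        std::cout << entry.first << ": " << entry.second << " s\n";
}
```

Joining trails like this against POS records is what would let a retailer connect a long linger at an end cap to a purchase, or to the absence of one.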

We're excited to work with these new technologies and, along the way, to find ways for retailers to connect with consumers more effectively.


Making of SFO's Flight Deck Data Viz

Mar 25, 2014, in Kinect, Microsoft Kinect, News, Projection, Technology, Travel

As passengers walk into San Francisco International Airport's (SFO) newly renovated Terminal 3E, they are greeted by the Flight Deck: a 16-foot-wide glowing data visualization that hovers above six multi-touch screens. The Flight Deck highlights SFO's global reach, services, amenities, and museum exhibits. We want to share the backstory of developing its interactive data visualization component.


The Projection

Our goal was to create a visual beacon that stood out from the airport's other digital displays and could be seen from both sides. We envisioned a uniquely shaped, bezel-free floating image in constant motion, capturing the attention of passengers whether they were arriving or departing. To meet this challenge, we experimented with several concepts but settled on attempting a 24/7 rear projection in a bright airport terminal. Rear projections in backlit environments are difficult since projectors don't project black. Initial tests (images below) showed that a projection in the space would be possible given the right projector, a dark grey rear-projection film, and the right window treatment. We worked with the architects to specify a frit for the windows immediately behind the projection. We used a 20,000-lumen projector and the brightest rear-projection film we could find, and optimized the placement of the glass and projector for the visitor approach. Visitors approach the Flight Deck parallel to the projector, so they experience the brightest portion of the projection when walking toward it.

[Images: initial rear-projection brightness tests]

The Experience

We wanted to create an ever-changing experience in which a Kinect camera uses people's presence to alter the background, and real-time flight-path data highlights the global reach of SFO from moment to moment. Key rewards were also synchronized with the touch screens, so visitors can take over the entire projection if they collect hidden rewards within the touch screen experience. The software was developed in Cinder and used OSC for network communication. An early decision was to heavily thread our architecture so that we could maintain a minimum of 60 fps for low interaction latency and smooth animation. Unfortunately, several Cinder components, such as the Console and the Kinect class, were not written for heavy multithreading, so we added locks and properly threaded components as needed. Early interaction prototypes tested the limits of our frame rate and latency to determine areas for optimization.
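
The threading pattern we leaned on looks roughly like the following sketch (illustrative, not the Flight Deck source): worker threads ingest data and hand the render loop a snapshot through a short, mutex-guarded critical section, so drawing never blocks on network or sensor I/O.

```cpp
#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>
#include <vector>

// Shared state between a data-producing worker and the render loop.
struct SharedState {
    std::mutex mutex;
    std::vector<float> latest;   // most recent data snapshot
};

// Worker: parse incoming data (e.g. OSC messages or Kinect frames) and
// publish it under a brief lock.
void producer(SharedState& state, std::atomic<bool>& running) {
    while (running) {
        std::vector<float> data = {1.f, 2.f, 3.f};   // stand-in for parsed input
        {
            std::lock_guard<std::mutex> lock(state.mutex);
            state.latest = std::move(data);          // short critical section
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(33));
    }
}

int main() {
    SharedState state;
    std::atomic<bool> running{true};
    std::thread worker(producer, std::ref(state), std::ref(running));

    // Render loop: copy the latest snapshot under the lock, then draw
    // without holding it, keeping per-frame work small enough for 60 fps.
    for (int frame = 0; frame < 300; ++frame) {
        std::vector<float> snapshot;
        {
            std::lock_guard<std::mutex> lock(state.mutex);
            snapshot = state.latest;
        }
        // draw(snapshot) would happen here.
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    running = false;
    worker.join();
}
```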

[Image: triangles and shapes reacting to a visitor's form on the projection]

Kinect Interaction

As curious visitors approach the Flight Deck, a Kinect camera captures their form and creates triangles and shapes that play around it using an attraction-based physics system, as seen in the image above. As visitors move left to right, the triangles leave trails on the screen that orbit the space, creating constantly changing visuals. Visitors can also use their arms to attract and control the shapes around them. A rear-mounted Kinect camera captures approaching visitors and uses computer vision algorithms to convert the depth image into blobs. The computer vision approach allows tracking of any number of people, versus the two people supported by skeleton tracking. In addition, we use depth data from a range that is normally unreliable for skeleton tracking: we modified the Kinect API for Cinder to use an extended depth range, capturing people up to 16 feet away within a 15-foot-wide area.
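
The depth-to-blob step can be sketched as follows. We're showing it with OpenCV for clarity, which is an assumption here since the post doesn't name the exact computer vision library used in the install:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Threshold a 16-bit depth image (values in millimeters) to the band
// where people can appear, clean up sensor speckle, and extract one
// contour per person-sized blob. The range values are illustrative.
std::vector<std::vector<cv::Point>> peopleBlobs(const cv::Mat& depth16U) {
    cv::Mat mask;
    cv::inRange(depth16U, 800, 4900, mask);   // keep ~0.8 m to ~4.9 m (~16 ft)

    // Morphological open removes small speckle noise typical of depth data.
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    // Keep only blobs large enough to plausibly be a person.
    std::vector<std::vector<cv::Point>> people;
    for (auto& c : contours)
        if (cv::contourArea(c) > 2000.0) people.push_back(std::move(c));
    return people;   // any number of people, unlike two-person skeleton tracking
}
```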

Data Visualization

Live flight data is visualized on screen, along with SFO's inbound and outbound activity from the last 24 hours. Positional data from flights arrives in a polar coordinate system and is converted to a Cartesian coordinate system on the GPU. Our data visualization uses the positional data directly, so flight paths are drawn with their actual bends and waviness rather than as perfectly smoothed curves. The challenge was that live data is not available at every position along a flight, so some interpolation had to be mixed with the actual data. SLERP was used to interpolate positions, and real data was combined with interpolated data using a gated smoothing algorithm. This approach let us properly visualize flights, for example flights to Asia, whose positions are occasionally lost and then picked up again at various locations.
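
To make SLERP concrete: treating each position as a point on a unit sphere, the interpolation follows the great circle between two known fixes, which is what you want when a flight's track disappears over the ocean. A self-contained sketch with illustrative coordinates, not our production code:

```cpp
#include <algorithm>
#include <cmath>
#include <iostream>

constexpr double kPi = 3.14159265358979323846;

struct Vec3 { double x, y, z; };

// Convert latitude/longitude (degrees) to a unit vector on the sphere.
Vec3 toUnit(double latDeg, double lonDeg) {
    double lat = latDeg * kPi / 180.0, lon = lonDeg * kPi / 180.0;
    return {std::cos(lat) * std::cos(lon), std::cos(lat) * std::sin(lon), std::sin(lat)};
}

// Spherical linear interpolation between two unit vectors, t in [0, 1].
// The result stays on the sphere and traces the great circle between the
// endpoints, unlike straight-line interpolation.
Vec3 slerp(const Vec3& a, const Vec3& b, double t) {
    double dot = std::clamp(a.x * b.x + a.y * b.y + a.z * b.z, -1.0, 1.0);
    double omega = std::acos(dot);
    if (omega < 1e-9) return a;   // endpoints coincide
    double sa = std::sin((1.0 - t) * omega) / std::sin(omega);
    double sb = std::sin(t * omega) / std::sin(omega);
    return {sa * a.x + sb * b.x, sa * a.y + sb * b.y, sa * a.z + sb * b.z};
}

int main() {
    // Hypothetical gap in an SFO -> Tokyo track: estimate the halfway point.
    Vec3 mid = slerp(toUnit(37.62, -122.38), toUnit(35.55, 139.78), 0.5);
    std::cout << "lat " << std::asin(mid.z) * 180.0 / kPi
              << ", lon " << std::atan2(mid.y, mid.x) * 180.0 / kPi << "\n";
}
```

Interpolated points like these are then blended with real fixes; the gating decides, fix by fix, how much weight the real data gets so a noisy or stale fix doesn't yank the rendered path around.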

Looking Forward

The interactive data visualization represents only one aspect of the Flight Deck, which also includes a multi-touch experience and a mobile experience, and was a collaboration of creative and technical talent. Since the launch, we've heard from many people comparing the Flight Deck to their favorite sci-fi shows and movies. We hope we can continue to create forward-thinking experiences for the public.


Can’t Make it to SFO? Experience Flight Deck Right Here

Mar 25, 2014, in 5D, Experience Design, Kinect, Multi-touch, Portfolio, Projection, Retail, Technology, Touchscreen, Travel


While Flight Deck is best experienced in person, you can get a quick fly-through of the experience right now. What is Flight Deck? Read all about it here.


Audi City Berlin Launched

Feb 05, 2014, in 5D, Experience Design, Microsoft Kinect, Mobile, Multi-touch, News, Portfolio, Retail, Technology, Touchscreen

First London, then Beijing, and now Berlin: Audi City has transformed the dealership experience in ways never before seen. It's one of the most technologically advanced retail environments ever created, featuring a variety of multi-touch displays for configuring your Audi from millions of possible combinations. Personalize your Audi, then toss it onto one of the floor-to-ceiling digital “powerwalls” to visualize and explore your configuration at true 1:1 scale. Audi City is a true dealership of the future and an effort we are proud to be part of.



RZRShop Highlights from NRF

Jan 27, 2014, in 5D, Augmented Reality, Experience Design, Kinect, Mobile, Multi-touch, News, Portfolio, Projection, Retail, Technology, Touchscreen

From a surfboard designer that comes to life right before your eyes to a one-of-a-kind, dual-screen Kinect beach soccer game, RZRShop was a huge hit at this year's National Retail Federation Annual Convention and Expo. Check out some of the highlights.


Experience The Future of Travel with San Francisco International Airport’s Flight Deck

Jan 24, 2014, in Experience Design, Kinect, Multi-touch, News, Portfolio, Projection, Retail, Technology, Touchscreen, Travel


The assignment was, well, not so simple: design and build a fully immersive, best-in-class, innovative future travel experience for passengers bustling through one of the busiest and fastest-growing airports in the country, San Francisco International. The Flight Deck will serve as the gateway experience to the newly renovated Terminal 3, Boarding Area E (or Terminal 3E, as it's known).

When the Emerging Experiences team was contracted by Hensel-Phelps, one of the largest general contractors and construction managers in the United States, to create this experience, we could not have been more excited. For a multi-disciplinary team like ours, made up of experience designers, creatives, techies, and passionate people who love to collaborate and deliver unexpected, delightful experiences, this project screamed open canvas. Our team is built for these challenging yet experimental and fun engagements. And the fruits of the tight collaboration between tech and creative shine bright. Literally: through a 20,000-lumen projector onto oversized panels of glass displaying real-time flight data.


So just what is Flight Deck? The SFO Flight Deck is composed of three distinct but connected digital experiences: an interactive, real-time, large-scale projection (you can't miss it), six multi-touch kiosks rich with beautiful content, and a mobile takeaway component for those on the go. The entire experience resides in Terminal 3, but the projected visualization serves as a beacon calling all SFO guests to contribute to the global SFO travel story. Content in all three experiences encompasses the entire airport and extends its reach into the city of San Francisco and global destinations.

In order to serve the wide gamut of travel guests, the team operated under the following experience pillars at all times. If an idea did not fit under two or more of these pillars, it hit the floor.

1. Quick and meaningful interactions should maintain quality and depth.
2. Playful collaborations should be maintained throughout the experience.
3. Your personal narrative is part of the SFO story.

Why? As mentioned earlier, the assignment was not so simple. Today's travelers are in a rush. How and why do we make busy passengers take the time to stop and engage? Inspired by the golden age of travel, when flying was considered magical and an experience in itself, we wanted to bring back the joy of the travel experience.

SFO is just the airport to foster that joy. It features an on-site museum with collections that could rival the best metropolitan museums we've ever visited, and we were allowed to dig into the archives of hundreds of photos from the museum's aviation collection. We bring many of these golden moments to guests in Flight Deck, hoping they take the time to smile and remember the transformative and magical experiences that travel can bring.

Grand Opening

Terminal 3E is scheduled to open for passengers on January 28, but folks were able to get a sneak peek at the T3E Community Day on Saturday, January 25th. The Flight Deck was a huge hit with folks of all ages, some of whom said they plan on arriving a bit early for their next flight out of SFO just so they can explore it more. Check out our updates from the event on Twitter and Facebook, and be sure to vote for your favorite feature in Terminal 3 here (guess what we voted for?).



The Biggest Thing in Retail is Actually Pretty Small: Welcome to #RZRShop

Jan 13, 2014, in 5D, Experience Design, Kinect, Microsoft Surface, Mobile, Multi-touch, Near Field, News, Portfolio, Retail, Technology, Touchscreen


For this year's National Retail Federation Annual Convention and Expo, we've built a highly interactive, surf-and-sand-inspired shopping experience. Built inside a uniquely portable container, RZRShop creates big moments inside a smaller, more nimble retail footprint. Digital meets physical with a Perceptive Pixel-powered surfboard designer that comes to life right before your eyes, and battle lines are drawn with a one-of-a-kind, dual-screen Kinect beach soccer game and smart vending experience. Powered by the Razorfish 5D Retail Platform, RZRShop seamlessly connects a variety of devices to attract customers, drive product engagement, and arm store associates with more contextualized tools. The end result is a fun and personal experience, the way shopping should be.

Visit emergingexperiences.com/rzrshop for more information, and visit us in Microsoft booth #2703 to experience RZRShop for yourself.


Big Ideas Come in Small Packages

Jan 08, 2014, in 5D, Experience Design, Microsoft Kinect, News, Retail, Technology, Touchscreen

[Image: the container at New York's Javits Center]

Our container arrived at New York’s Javits Center this morning where our team is prepping for next week’s National Retail Federation annual convention and expo. What’s in the container? Stay tuned.


We’re Building the Next Big Thing

Jan 07, 2014, in 5D, Experience Design, Kinect, Microsoft Kinect, Multi-touch, Retail, Technology, Touchscreen

Stay tuned to find out what it is.


Kinect for Windows v2 First Look

Dec 04, 2013, in Kinect, Lab, Technology

I've had a little less than a week to play with the new Kinect for Windows v2, thanks to the developer preview program and the Kinect MVP program. So far it is everything Kinect developers and designers have been hoping for: full HD through the color camera, a much improved depth camera, and USB 3.0 data throughput.

Additionally, much of the processing now occurs on the GPU rather than on the sensor's onboard chip or your computer's CPU. While amazing things were possible with the first Kinect for Windows sensor, most developers found themselves pushing the performance envelope at times, wishing they could get just a little more resolution or a little more data speed. Now they will have both.

At this point the programming model has changed a bit between Kinect for Windows v1 and Kinect for Windows v2. While knowing the original SDK will definitely give you a leg up, a bit of work will still need to be done to port Kinect v1 apps to the new Kinect v2 SDK when it is eventually released.

What's different between the new Kinect for Xbox One and the Kinect for Windows v2? It turns out not a lot. The Kinect for Xbox One has a special USB 3.0 adapter that draws both power and data from the Xbox One. Because it is a non-standard connector, it can't be plugged straight into a PC (unlike the original Kinect, which had a standard USB 2.0 plug).

Making the new Kinect work with a PC therefore requires a special breakout board. This board serves as an adapter with three ports: one for the Kinect, one for a power source, and one for a standard USB 3.0 cable.

We can also probably expect the firmware on the two versions of the new Kinect sensor to diverge over time, as occurred with the original Kinect.

Skeleton detection is greatly improved with the new Kinect. Not only are more joints now detected, but many of the jitters developers became used to working around are now gone. The new SDK recognizes up to six skeletons rather than just two. Finally, because of the improved time-of-flight depth camera, which replaces the PrimeSense technology used in the previous hardware, skeleton detection is much more accurate and includes excellent hand detection. Grip recognition as well as Lasso recognition (two fingers used to draw) are available out of the box, even in this early alpha version of the SDK.
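
For a flavor of the new API, here is a rough C++ sketch of polling bodies and reading a hand state. It reflects the Kinect for Windows SDK 2.0 C++ interfaces as they eventually shipped; names in the preview-era SDK described in this post may differ:

```cpp
#include <Kinect.h>   // Kinect for Windows v2 SDK (C++); link against Kinect20.lib

// Poll the latest body frame and inspect each tracked body's right hand.
// HandState_Closed corresponds to grip; HandState_Lasso to the two-finger
// drawing pose mentioned above.
void pollBodies(IBodyFrameReader* reader) {
    IBodyFrame* frame = nullptr;
    if (FAILED(reader->AcquireLatestFrame(&frame))) return;   // no new frame yet

    IBody* bodies[BODY_COUNT] = {nullptr};                    // BODY_COUNT == 6
    if (SUCCEEDED(frame->GetAndRefreshBodyData(_countof(bodies), bodies))) {
        for (IBody* body : bodies) {
            BOOLEAN tracked = FALSE;
            if (body && SUCCEEDED(body->get_IsTracked(&tracked)) && tracked) {
                HandState right = HandState_Unknown;
                body->get_HandRightState(&right);
                if (right == HandState_Closed) { /* grip detected */ }
                if (right == HandState_Lasso)  { /* lasso (two-finger draw) */ }
            }
        }
        for (IBody* body : bodies)
            if (body) body->Release();
    }
    frame->Release();
}
```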

I won’t hesitate to say – even this early in the game – that the new hardware is amazing and is leaps and bounds better than the original sensor. The big question, though, is whether it will take off the way the original hardware did.

If you recall, when Microsoft released the first Kinect sensor they didn’t have immediate plans to use it for anything other than a game controller – no SDK, no motor controller, not a single luxury. Instead, creative developers, artists, researchers and hackers figured out ways to read the raw USB data and started manipulating it to create amazingly original applications that took advantage of the depth sensor – and they posted them to the Internet.

Will this happen the second time around? Microsoft is endeavoring to do better this time by getting an SDK out much earlier. As I mentioned above, the alpha SDK for Kinect v2 is already available to people in the developer preview program. The trick will be in attracting the types of creative people who were drawn to the Kinect two years ago – the kind of creative technologists Microsoft has always had trouble attracting to other products like Windows Phone and Windows tablets.

My colleagues and I at Razorfish Emerging Experiences are currently working on combining the new Kinect with other technologies such as Oculus Rift, Google Glass, Unity 3D, Cinder, Leap Motion, and 4K video. Like a modern-day scrying device (or simply a mad scientist's experiment), we're hoping that by mixing all these gadgets together we'll get a glimpse of what the future looks like and, perhaps, even help to create that future.