Future of Retail: Indoor Positioning with iBeacons

Mar 25, 2014 by in iBeacon, Lab, Mobile, Projection, Retail, Technology

In the Razorfish Emerging Experiences labs, we’ve been prototyping next-generation retail experiences that can identify and track consumers with the goal of offering timely and relevant information. Using Bluetooth Low Energy (BLE) indoor location-based tracking, we can see the movements and identities of smartphones throughout retail environments. Our systems know how many people are in a space (image below), who they are, when they entered, where they went, where they lingered, and when they left. Based on a consumer’s location, we can serve contextually relevant content to their smartphone or to displays in their proximity.
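Under the hood, BLE positioning typically starts with signal strength. As a rough illustration (not our production system), an iBeacon broadcasts a calibrated “measured power” at one meter, and a listener can turn received signal strength (RSSI) into an approximate distance with a log-distance path-loss model:

```python
# Illustrative sketch of RSSI-to-distance conversion for a BLE beacon.
# The calibration values below are examples, not measured constants.

def estimate_distance(rssi, tx_power=-59.0, path_loss_n=2.0):
    """Rough distance in meters from a received signal strength (dBm).

    tx_power is the calibrated RSSI at 1 m that iBeacons advertise;
    path_loss_n models the environment (2.0 = free space, higher indoors).
    """
    return 10 ** ((tx_power - rssi) / (10 * path_loss_n))

# A reading equal to the 1 m calibration value implies roughly 1 m:
d_near = estimate_distance(-59)   # ~1.0 m
d_far = estimate_distance(-79)    # ~10.0 m under the same model
```

In practice RSSI is noisy, so readings are smoothed over time and the path-loss exponent is calibrated per environment; combining estimates from several beacons then yields an indoor position.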


Mobile Integration with Retail

BLE indoor position tracking can integrate mobile experiences into a physical retail space. With indoor location tracking, mobile wayfinding can guide consumers to items on their mobile wish list or even to less-crowded checkout lines. Consumers could use their mobile devices to signal for help, and associates could use location tracking (image below) to find them. Displays near a consumer could recognize them, show messages relevant to their shopping history, and accept content pushed from their mobile devices.
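Serving proximity-based content can be as simple as picking the zone whose beacon the phone hears most strongly. A minimal sketch, with invented zone names and an invented content mapping:

```python
# Hypothetical sketch: route content based on the strongest beacon signal.
# Zone names and the content table are illustrative assumptions.

def nearest_zone(readings):
    """readings maps zone id -> RSSI in dBm (values closer to 0 are stronger)."""
    return max(readings, key=readings.get)

CONTENT_FOR_ZONE = {
    "electronics": "4K TV promo",
    "checkout": "loyalty signup",
}

zone = nearest_zone({"electronics": -62.0, "checkout": -81.0})
message = CONTENT_FOR_ZONE[zone]  # content a nearby display could show
```

A real deployment would debounce zone changes (to avoid flicker at boundaries) and consult the shopper’s history before choosing what to display.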


Retail Metrics

BLE indoor tracking allows for in-store metrics tied to the identity of a customer. Data can be collected across a customer’s entire shopping visit, instead of only at the checkout line. In the image above, a server tracks the movement of a smartphone from the moment it enters and can collect metrics on dwell times, traffic hot spots, and indoor location history. When retail locations connect this data to POS metrics, they could track how long each purchase decision took, as well as which other products a customer considered during their visit, by measuring linger times in front of products that were not purchased. Retail locations could also track the effectiveness of in-store messaging, end cap linger times, and incentives pushed to smartphones in-store.

BLE indoor tracking can also help associates better serve customers. By accessing real-time indoor tracking data, associates can optimize where they need to be to offer assistance and guide sales. When engaged with a customer, associates could also see which in-store items that customer was most likely interested in.
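Dwell-time metrics like these can be derived from a time-ordered stream of zone observations. A simplified sketch (the ping format and zone names are assumptions, not our server’s actual schema):

```python
# Sketch: summarize seconds spent per zone from (timestamp, zone) pings.
from collections import defaultdict

def dwell_times(pings):
    """pings: time-ordered list of (seconds_since_entry, zone).

    Attributes each interval to the zone reported at its start and
    returns total seconds per zone.
    """
    totals = defaultdict(float)
    for (t0, zone), (t1, _) in zip(pings, pings[1:]):
        totals[zone] += t1 - t0
    return dict(totals)

visit = [(0, "entrance"), (30, "endcap"), (150, "endcap"), (180, "checkout")]
summary = dwell_times(visit)  # e.g. 120+30 seconds attributed to "endcap"
```

Joining such summaries with POS data is what would let a retailer compare linger times between purchased and non-purchased products.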

We’re excited to work with these new technologies and along the way, find better ways for retailers to more effectively connect with consumers.


Kinect for Windows v2 First Look

Dec 04, 2013 by in Kinect, Lab, Technology

I’ve had a little less than a week to play with the new Kinect for Windows v2, thanks to the developer preview program and the Kinect MVP program. So far it is everything Kinect developers and designers have been hoping for: full HD through the color camera, a much-improved depth camera, and USB 3.0 data throughput.

Additionally, much of the processing now occurs on the GPU rather than on the onboard chip or your computer’s CPU. While amazing things were possible with the first Kinect for Windows sensor, most developers found themselves pushing the performance envelope and wishing they could get just a little more resolution or a little more data speed. Now they will have both.

The programming model has changed a bit between Kinect for Windows v1 and Kinect for Windows v2. While knowing the original SDK will definitely give you a leg up, some work will still be needed to port Kinect v1 apps to the new Kinect v2 SDK when it is eventually released.

What’s different between the new Kinect for Xbox One and the Kinect for Windows v2? It turns out not a lot. The Kinect for Xbox One has a special connector that carries both power and USB 3.0 data from the console. Because it is non-standard, it can’t be plugged straight into a PC (unlike the original Kinect, which had a standard USB 2.0 plug).

To make the new Kinect work with a PC, then, requires a special breakout board. This board serves as an adapter with three ports: one for the Kinect, one for a power source, and one for a standard USB 3.0 cable.

We can also expect the firmware on the two versions of the new sensor to diverge over time, as happened with the original Kinect.

Skeleton detection is greatly improved with the new Kinect. Not only are more joints detected, but many of the jitters developers became used to working around are now gone. The new SDK recognizes up to six skeletons rather than just two. Finally, because the improved time-of-flight depth camera replaces the PrimeSense technology used in the previous hardware, skeleton detection is much more accurate and includes excellent hand detection. Grip recognition as well as lasso recognition (two fingers used to draw) are available out of the box – even in this early alpha version of the SDK.
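For context, the jitter workarounds v1 developers relied on often amounted to filtering raw joint positions. A minimal sketch of one common approach, exponential smoothing – illustrative only, not part of the Kinect SDK:

```python
# Illustrative joint-position filter of the kind v1 developers hand-rolled.
# Coordinates and alpha are example values, not SDK constants.

def smooth_joint(prev, current, alpha=0.5):
    """Blend the previous filtered (x, y, z) with a new raw reading.

    Lower alpha means smoother output but more lag behind real motion.
    """
    return tuple(alpha * c + (1 - alpha) * p for p, c in zip(prev, current))

# A jittery jump from the origin is halved rather than passed through:
filtered = smooth_joint((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), alpha=0.5)
```

With the v2 sensor’s cleaner depth data, this kind of filtering matters far less than it did.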

I won’t hesitate to say – even this early in the game – that the new hardware is amazing and is leaps and bounds better than the original sensor. The big question, though, is whether it will take off the way the original hardware did.

If you recall, when Microsoft released the first Kinect sensor they didn’t have immediate plans to use it for anything other than a game controller – no SDK, no motor controller, not a single luxury. Instead, creative developers, artists, researchers and hackers figured out ways to read the raw USB data and started manipulating it to create amazingly original applications that took advantage of the depth sensor – and they posted them to the Internet.

Will this happen the second time around? Microsoft is endeavoring to do better this time by getting an SDK out much earlier. As I mentioned above, the alpha SDK for Kinect v2 is already available to people in the developer preview program. The trick will be attracting the types of creative people who were drawn to the Kinect two years ago – the kind of creative technologists Microsoft has always had trouble attracting to other products like Windows Phone and Windows tablets.

My colleagues and I at Razorfish Emerging Experiences are currently working on combining the new Kinect with other technologies such as Oculus Rift, Google Glass, Unity 3D, Cinder, Leap Motion and 4K video. Like a modern-day scrying device (or simply a mad scientist’s experiment), we’re hoping that by mixing all these gadgets together we’ll get a glimpse of what the future looks like and, perhaps, even help to create that future.

Leap Motion Unboxing

Jul 29, 2013 by in Lab, Technology

No editorializing; just showing our initial experience with the gestural device that fits in the palm of your hand.

Adweek Feature Story: Emerging Experiences goes coast-to-coast with a new Lab in San Francisco.

Jun 04, 2013 by in Lab, News


Razorfish Emerging Experiences has opened a new lab in our San Francisco Razorfish office across from Pier 39 and the renowned Fisherman’s Wharf. Equal parts workspace and client demonstration area, the lab is invaluable for our team to design, build and test some of the most engaging and transformational experiences in the marketplace. Leveraging the success of our Atlanta Lab and its evolution over the past 5 years, the San Francisco Lab is our newest digital sandbox.

In two related articles referencing the work in our new Lab, Christopher Heine writes in Adweek that “employing the latest technology at point of sale is nothing new—for years businesses from car rental companies to Nordstrom department stores have unhooked from the wires. But the trend has gone from merely ringing up sales via mobile devices to a deeply immersive in-store experience—fully digitized but crucially featuring that face-to-face element…”

The Lab showcases 360-degree video content across multiple displays and projection surfaces, and features emerging technologies such as transparent displays and multi-touch and gesture-based sensors, all powered by our proprietary Razorfish 5D Platform. Watch an Audi be configured in precise detail through the application that powers Audi City, or sit back and watch as data is visualized through one of our latest projects.

Physical meets digital and the customer’s journey will never be the same. Innovating Tomorrow, Today. For appointments, please contact Wade Forst, our Director of Emerging Experiences in San Francisco.

Emerging Experiences Lab at Converge: Razorfish Client Summit 2013

May 01, 2013 by in Advertising, Experience Design, Lab, News, Technology

Take a look behind the scenes of the Lab at the ARIA in Las Vegas. A true manifestation of what we do in the Emerging Experiences group, the Lab set-up brings to life the ideas behind this year’s Client Summit theme, Convergence. To learn more about the ideas that drive our passions, read what Razorfish’s Global CEO Bob Lord and Global CTO Ray Velez have to say in their new book.


The NFC Gumball Machine goes on tour

Feb 19, 2013 by in Lab, Near Field, News, Technology

During a two-day prototyping session at Razorfish’s Frankfurt office last summer, our team built an NFC-enabled gumball machine filled with apps, music, games and other fun content. To help celebrate the new Galaxy SIII Mini, it was recently showcased at Samsung’s flagship store here in Frankfurt. Now, after a short pit stop to make a few tweaks and improvements, we’re taking it to Mobile World Congress next week. If you happen to be in Barcelona, be sure to stop by the NFC & Mobile Money Pavilion (Hall 7) to check it out.

We’re building the future of retail (and it’s kind of messy)

Jan 11, 2013 by in 5D, Augmented Reality, Experience Design, Kinect, Lab, Multi-touch, Near Field, News, Portfolio, Retail, Technology, Touchscreen

Our lab is buzzing with activity as the team prepares for the National Retail Federation’s 102nd Annual Convention & EXPO in New York. On display will be the latest iteration of Razorfish 5D—the world’s first cross-device, cross-OS, connected retail platform. Introduced at last year’s NRF convention, 5D has since launched in several markets and was used to create Audi City London, a one-of-a-kind immersive virtual showroom. This year we’re showing how our platform can power customized, personalized and seamlessly synchronized shopping experiences. We threw in some augmented reality and a bunch of transparent displays as well.

Our team will be demonstrating the 5D experience in booth #1005 on Level 3 of the Jacob K. Javits Convention Center. If you can’t make the show, be sure to follow us on Twitter to get the latest updates.

Enhanced Consumer Connections, Powered by Razorfish 5D

Nov 15, 2012 by in 5D, Experience Design, Kinect, Lab, Microsoft Surface, Mobile, Multi-touch, Near Field, Retail, Technology, Touchscreen

When we’re playing in our Lab, we’re always looking for creative ways to push the limits of technology. Some of our projects are just for fun, and others, like London’s Audi City, completely reinvent the way people shop. We were even thinking about digital wallets before they were cool. So when we set out to create the Razorfish 5D platform, our goal was to design a powerful and highly immersive way for brands to connect with consumers—before, during and after the shopping experience. In our latest video, we show how our 5D platform seamlessly connects a variety of digital devices to better attract consumers into the store, drive product engagement and arm store associates with more contextualized digital tools. The end result is a fun and personal experience, the way shopping should be.

Leading the Future of Retail: AdWeek Features Atlanta’s Emerging Experiences Lab

Oct 16, 2012 by in Lab, News, Retail, Technology

Today Christopher Heine of AdWeek published “Razorfish’s Atlanta Lab Focuses on In-Store Digital” highlighting the Emerging Experiences Lab as a multi-faceted innovative space equipped to continue tackling the changing retail landscape.

Regarding a recent report, Heine concludes:

Bottom line, retailers need to do more than simply slap digital elements into their locations… they need to create seriously-planned interactive customer experiences.

Razorfish’s Emerging Experiences lab is a mind-blowing candy store stocked with seamlessly connected technologies that facilitate the creation of magic moments for guests. It provides an immersive physical space that clients can leverage to strategize, implement, prototype, and employ these interactive experiences for their customers.

From concept to completion, the Emerging Experiences practice is a one-stop shop for clients looking to collaborate with a team of committed, enthusiastic specialists to create custom solutions grounded in the reality of business. The Lab is a unifying space not only for emerging technologies, but also for designers, developers, strategists, and stakeholders.

In the Lab, all of the walls come down. Traditional barriers between agency and client as well as client and customer are removed. Technology recedes in and out of view through the cycle of creation as it integrates with thoughtful experience touch points.

The results of this one-of-a-kind mix? Solutions that are sustainable and occur as a natural result of discoveries during the envisioning process.

It’s always so exciting when a client visits the Lab for the first time. By experiencing the possibilities in a physical space, the client is inspired by this type of thinking and how it relates to their business, and ideas for authentic consumer experiences begin to take shape almost subconsciously.

The sensory nature of the Lab helps foster the most compelling and innovative ideas possible – something that cannot be achieved by observing a focus group or relying on data alone.

It’s brainstorming at its finest. And prototyping at its fastest.

Clients can experience their customers’ point of view in a way that was once never possible.

Razorfish is committed. Our team members are committed. All of the chips are in and the Lab is situated as a crucial space to help our clients realize and understand the needs of today’s customers.

NFC Gumball Machine

Jun 15, 2012 by in Lab, Mobile, Near Field

Near field communication (NFC) technology has been around for a couple of years now, but will it finally have its breakthrough later this year when the new iPhone comes out? Reason enough for us to take another, closer look at the technology.

Introducing Digital-Gum-Goods.

This is an NFC-enabled gum machine we built at Razorfish, packed with all sorts of digital goodies: apps, movies, songs, ebooks, and other exclusive and location-based content that can be pushed to a phone. Simply insert a coin and turn the lever, then follow the animation and tap your smartphone next to the release chute.


The project was realized in a two-day prototyping session at Razorfish’s Frankfurt office. In terms of hardware, we used a Samsung Galaxy Tab, an NFC shield, a simple reed switch and two Arduino microcontrollers – all nicely fitted into an original gum machine metal base.
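The dispense flow can be sketched as a small state machine: the reed switch reports a lever turn, the controller reserves a random piece of content, and the NFC tap releases it. The sketch below is illustrative Python with invented content names; the actual build ran on the two Arduinos and the Galaxy Tab:

```python
# Hypothetical model of the gumball machine's dispense logic.
import random

class GumballMachine:
    def __init__(self, content):
        self.hopper = list(content)  # digital "gumballs" still available
        self.pending = None          # content reserved, awaiting an NFC tap

    def lever_turned(self):
        """Called when the reed switch closes (lever turn detected)."""
        if self.hopper and self.pending is None:
            self.pending = self.hopper.pop(random.randrange(len(self.hopper)))

    def phone_tapped(self):
        """Called when the NFC shield detects a phone; releases the content."""
        item, self.pending = self.pending, None
        return item

machine = GumballMachine(["app:puzzle-game", "song:summer-hit"])
machine.lever_turned()
prize = machine.phone_tapped()  # one of the loaded items, or None if empty
```

On the real hardware, the reed switch and NFC events arrive over the Arduinos’ serial links rather than as direct method calls.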

This is an example of how NFC Technology can provide a missing link between the physical and the digital by bringing the best of both worlds together.

Want to stay updated? We’d love to hear your thoughts and ideas about the blurring line between the physical and digital worlds. Join the conversation on Facebook.

Download Presskit | High-resolution images for print available on request.