From a surfboard designer that comes to life right before your eyes, to a one-of-a-kind, dual screen Kinect beach soccer game, RZRShop was a huge hit at this year’s National Retail Federation Annual Convention and Expo. Check out some of the highlights.
Get your hands on the 5D experience by embarking on a unique shopping journey that utilizes a variety of platforms and technologies, including a first-of-its-kind, seamlessly synchronized transparent interactive display wall. It’s located in the Microsoft booth (1005) on Level 3. And to see more of 5D in action, head on over to emergingexperiences.com/5D.
Our lab is buzzing with activity as the team prepares for the National Retail Federation’s 102nd Annual Convention & EXPO in New York. On display will be the latest iteration of Razorfish 5D – the world’s first cross-device, cross-OS, connected retail platform. Since debuting at last year’s NRF convention, 5D has been deployed in several markets and was used to create Audi City London, a one-of-a-kind immersive virtual showroom. This year we’re showing how our platform can power customized, personalized and seamlessly synchronized shopping experiences. We threw in some augmented reality and a bunch of transparent displays as well.
Our team will be demonstrating the 5D experience in booth #1005 on Level 3 of the Jacob K. Javits Convention Center. If you can’t make the show, be sure to follow us on Twitter to get the latest updates.
Fresh out of R&D from the Razorfish Emerging Experiences team is a product code-named “5D”. 5D started out as an idea to re-invent personal shopping. Our goal was to create a retail experience platform for both consumers and sales associates that enables multi-channel sales through immersive and connected digital devices in retail environments. And the only way to do it is to seamlessly integrate five key components – devices, content, experiences, analytics and CRM – with a touch of digital magic!
The team announced 5D at the 2012 NRF Convention & Expo in New York City in partnership with NEC and Microsoft. Leveraging Windows Embedded, Microsoft Surface, MS Tag, Windows Phone and Kinect for Windows, we created a prototype around a fictitious brand, “Razorfashion”, that demonstrates how various touch points along the customer journey can attract consumers into the store, drive product engagement and arm store associates with more contextualized digital tools.
You can read the full press release here.
In 2010, Microsoft released Kinect – a controller-free gaming and entertainment experience for the Xbox 360. Your body is the controller – joysticks and buttons are replaced with the user’s movements and gestures. It turns out Kinect has many uses beyond games and entertainment. Razorfish’s Emerging Experiences team created KinectShop to demonstrate the use of the Kinect platform as a retail or at-home augmented reality shopping experience.
KinectShop allows shoppers to cycle through an assortment of products, in this case purses, and visualize the products as part of their outfit, thereby better informing the purchase decision. The natural interaction offered by the Kinect platform allows shoppers to quickly develop a 1-to-1 connection with the product through the use of augmented reality. In augmented reality, shelf space is infinite, so while this concept experience is limited to purses, it could host entire catalogs of products, such as clothing, hats, sunglasses, shoes, jewelry, makeup and more.
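The production code behind KinectShop isn’t public, but the core “try it on” step is straightforward to sketch: given a body position tracked from the Kinect skeletal data (reduced here to an (x, y) pixel coordinate, purely an assumption for illustration), the product image is alpha-blended onto the live camera feed. A minimal NumPy sketch, not the actual implementation:

```python
import numpy as np

def overlay_product(frame, product_rgba, x, y):
    """Composite a product image (with alpha channel) onto the camera
    frame at a tracked position (e.g. near a wrist joint)."""
    h, w = product_rgba.shape[:2]
    fh, fw = frame.shape[:2]
    # Clip the region of interest to the frame bounds.
    x0, y0 = max(x, 0), max(y, 0)
    x1, y1 = min(x + w, fw), min(y + h, fh)
    if x0 >= x1 or y0 >= y1:
        return frame  # product is entirely off-screen
    roi = frame[y0:y1, x0:x1].astype(float)
    patch = product_rgba[y0 - y:y1 - y, x0 - x:x1 - x]
    alpha = patch[..., 3:4].astype(float) / 255.0
    # Standard alpha blend: out = alpha * product + (1 - alpha) * background
    blended = alpha * patch[..., :3] + (1.0 - alpha) * roi
    frame[y0:y1, x0:x1] = blended.astype(np.uint8)
    return frame
```

In the real experience this runs per frame, so the purse appears to follow the shopper as they move.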
As an in-store experience, the retailer can bring catalog and online inventory into the store without actually having the inventory on hand. Further, it allows a shopper to still try on inventory not available in the store or out of stock, capturing a sale that might otherwise be lost. Ultimately, the shopper can decide that they like the product and add it to their shopping cart or wishlist.
Because the experience is virtual, it presents possibilities to become portable and even extend beyond the store. With an experience like KinectShop, a shopper can easily scan a QR code or swipe their NFC smartphone to take their experience with them and use wayfinding tools to locate the product in-store. Additionally, shoppers could later retrieve their wishlist at home using the company’s web site, tablet, mobile experience or even an Xbox or PC version of the experience in their living room.
Shopping is inherently a social activity and the experience could not only support multiple users simultaneously but it also has the natural ability to leverage social tools. For instance, pictures taken with the virtual products can be shared through Facebook and Twitter to help solicit feedback from friends.
We plan on leveraging the Kinect platform to enhance the experience further in future versions. For example, user recognition could help record and save preferences intuitively to your profile. Microphones can be used to employ voice commands – for example, saying “I love it” automatically adds the item to the wishlist. The experience will one day even offer recommendations of coordinating items based on the colors in the clothes that shoppers are wearing.
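The voice-command idea above boils down to routing recognized phrases to shopping actions. A hypothetical sketch of that routing layer (all names invented; in a real build the phrases would come from the Kinect microphone array via a speech-recognition engine, which is stubbed out here):

```python
# Hypothetical sketch: routing recognized speech phrases to shopping actions.

def make_command_router(wishlist):
    """Return a handler that maps recognized phrases to wishlist actions."""
    def handle(phrase, current_item):
        normalized = phrase.strip().lower()
        if normalized == "i love it":
            wishlist.append(current_item)   # save the item for later
            return "added"
        if normalized == "next":
            return "cycle"                  # advance to the next product
        return "ignored"                    # unrecognized phrases are dropped
    return handle

wishlist = []
handle = make_command_router(wishlist)
handle("I love it", "red purse")
# wishlist now contains "red purse"
```

The interesting work in practice is the recognition itself; the dispatch shown here is deliberately trivial.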
As you can see, devices like Kinect clearly have uses beyond the gaming console. We are just scratching the surface of the types of experiences this technology will enable in the future.
Read more about it over at Fast Company.
A wide range of Oakley products are designed for athletes and outdoor enthusiasts who depend on their equipment when pursuing their passions and living their dreams. Choosing the right sunglass lens makes a significant difference when sports and outdoor activities are taken seriously. To guide the consumer through this decision, we implemented an iPhone and iPad app that simulates realistic scenarios using engaging 3D panorama landscapes wrapped in an intuitive touch- and accelerometer-based interface.
The overall experience features more than 18 lens tints in spectacular environments and various weather conditions. Once the perfect lens is selected, detailed product information is just one touch away.
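The app’s rendering pipeline isn’t described here, but the core tint effect can be approximated by blending the scene toward the lens color. A rough NumPy sketch of that idea (an assumption for illustration, not the app’s actual shader):

```python
import numpy as np

def apply_lens_tint(scene_rgb, tint_rgb, strength=0.35):
    """Approximate a sunglass lens tint by blending the scene toward the
    tint color. scene_rgb: HxWx3 uint8, tint_rgb: (r, g, b), strength in [0, 1]."""
    scene = scene_rgb.astype(float)
    tint = np.array(tint_rgb, dtype=float)
    # Multiplicative tint preserves scene detail better than a flat overlay.
    tinted = scene * (1.0 - strength) + (scene * tint / 255.0) * strength
    return np.clip(tinted, 0, 255).astype(np.uint8)
```

Swapping `tint_rgb` and `strength` per lens model is what lets one panorama photo stand in for 18+ tints.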
Since we were running the experience in the Microsoft booth, we decided to add some new characters – the most popular of which was Steve Ballmer:
We used the experience as a way to engage with conference attendees and demonstrate an innovative use of Microsoft Tag technology. As conference attendees had their RockstAR snapshot taken, we’d ask them to download the tag reader application to their mobile device. Afterwards, they could take a snapshot of the Microsoft Tag and retrieve their photo. We took over 300 photos at the event.
The RockstAR experience is another example of how you can use tag technology to extend an interactive in-store experience to a customer’s mobile device. Wishlists, shopping carts, mobile content delivery, product ratings & reviews and wayfinding are some of the examples of how tag technology can be used to change the way people shop.
Check out our pictures from the event.
We recently had the opportunity to debut the RockstAR experience at SXSW – check out video of the experience in action. We like to think of it as the classic photo booth taken to the next level with augmented reality, multi-touch and social integration. Let’s go behind-the-scenes and take a look at both the software and hardware that brings this experience to life.
First, let’s talk software. The application was built on the recently announced Razorfish Vision Framework. The framework provides a platform to power augmented reality, gestural and other vision-based experiences. For the RockstAR experience, we analyze each frame coming from an infrared camera to determine if faces are found in the crowd. Once a face is detected, it is assigned a unique ID and tracked. Once we have a lock on the face, we can pass position and size information to the experience, where we augment animations and graphics on top of the color camera feed. This technology has a variety of uses. For instance, face tracking can be used to track impressions on static or interactive digital experiences in the retail environment. Here is a screenshot taken from the debug mode of the experience, showing the face tracking engine at work using the infrared camera.
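The framework’s internals aren’t published, but the ID-assignment step described above can be illustrated with a minimal centroid tracker: each detection is matched to the nearest previously tracked face, and anything too far away is treated as a new face. A Python sketch under those assumptions (class and parameter names are hypothetical):

```python
import math
from itertools import count

class FaceTracker:
    """Minimal centroid tracker: assigns a stable ID to each detected face
    and matches detections across frames by nearest centroid."""

    def __init__(self, max_distance=50):
        self.max_distance = max_distance
        self.tracks = {}            # id -> (x, y) last known centroid
        self._ids = count()

    def update(self, detections):
        """detections: list of (x, y, w, h) face boxes for the current frame.
        Returns {track_id: (x, y, w, h)} for this frame."""
        assigned = {}
        unmatched = dict(self.tracks)
        for (x, y, w, h) in detections:
            cx, cy = x + w / 2, y + h / 2
            # Match to the closest previously tracked face, if near enough.
            best = min(unmatched.items(),
                       key=lambda kv: math.dist(kv[1], (cx, cy)),
                       default=None)
            if best and math.dist(best[1], (cx, cy)) <= self.max_distance:
                tid = best[0]
                del unmatched[tid]
            else:
                tid = next(self._ids)   # new face entering the frame
            self.tracks[tid] = (cx, cy)
            assigned[tid] = (x, y, w, h)
        # Faces that disappeared this frame are dropped.
        for tid in unmatched:
            del self.tracks[tid]
        return assigned
```

A production tracker would also tolerate faces dropping out for a few frames before retiring an ID; this sketch keeps only the matching logic.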
In addition to the vision-based technology, the experience was fully multi-touch enabled – users can gesture on a virtual joystick to swap out bands and snap pictures.
Because the classic photo booth experience is a social activity, we took it to the next level with Twitter and Flickr integration. As pictures were snapped, we’d immediately make them available online. A QR code was rendered with each picture to quickly allow users to navigate to the RockstAR photo on their mobile device. Once the experience is extended to mobile, users can email the pictures to their friends, set them as wallpaper, re-tweet them to their Twitter followers, etc.
Let’s move on to hardware. Unfortunately, you can’t purchase infrared AR-ready cameras at your local Walmart… at least not until Project Natal comes out later this year. Therefore, we needed to build a dual-camera system that would support the face tracking in infrared and the color video feed for display on the screen. We decided to go with two commercial-grade Firefly MV cameras with custom lenses.
We modified one of the cameras to see only infrared light by replacing the IR-blocking filter with an IR band-pass filter. This allows only a narrow range of infrared light to reach the camera’s CCD.
We also purchased and tested a variety of infrared illuminators. These are used to illuminate the environment with invisible infrared light, allowing the infrared camera to accurately track faces in low-light conditions.
Sparks were flying as we fused the color and infrared cameras together — just another day at the office.
We created a portable rig for the camera and infrared illuminators. Adjustable camera mounts and industrial strength velcro provide flexibility and portability across a variety of installations.
We used a presentation remote clicker as an alternative way to drive the experience. We primarily used it as a remote camera trigger which allowed us to quickly snap pictures of unsuspecting people from a distance.
The experience was powered by a 55″ multi-touch screen and a CPU provided by DFI Technologies. We’ve been working with DFI to build PCs that will power the next-generation of interactive experiences. These PCs are small form factor and can be mounted behind the multi-touch screen.
Last but not least, we bring you the pink rug. We can’t reveal too much information about this technology… we need to keep some things secret. Just know that it is critical to the overall experience.
We recently created an experience named RockstAR which features augmented reality and multi-touch technology. It is the classic photo booth experience taken to the next level with interactive technology, social integration (currently the experience posts to Twitter, TwitPic and Flickr), good ole fashioned Rock n Roll and a little 80s video game nostalgia. We also can’t leave out the pink rug – one of the most important parts of any experience.
The application is the first demonstration of the Razorfish Vision Framework (RVF) and it is integrated with our Razorfish Touch Framework (RTF). The experience was featured at several SXSW Interactive Conference 2010 events including the Razorfish and Microsoft parties.
Stay tuned as we’ll be posting a behind-the-scenes tech walkthrough in the next week.
We’ve been very busy since the Windows Phone 7 Series experience launch in Spain, and there’s plenty more excitement over the next couple of weeks. We will be announcing the Razorfish Vision Framework at this year’s SxSW Interactive Conference. We have created a prototype application that is a mixture of Augmented Reality and Multi-touch called RockstAR and will be showcasing it at the Razorfish hosted cocktail party on Saturday the 13th at the Paradise bar. Come by and augment your reality!
Below is a quick list of the conferences we’re at in March. Please let us know if you are also attending and we’ll make sure you get a chance to check out our experiences.
- GDC 2010 (March 9-13, San Francisco) – Lesley just returned from San Francisco, where we helped Microsoft showcase the WP7S experience in their Game Developers Conference booth.
- SxSW 2010 Interactive (March 12-15, Austin) – We’re hosting a panel on the 12th, please come interact with us as we discuss “Touch + The Holy Grail of Delight” at 2pm http://my.sxsw.com/events/event/6124. We are also launching a fun new AR experience for the Razorfish cocktail party. Finally we will be at the Microsoft Speakeasy event showcasing the WP7S experience.
- MIX10 (March 15-17, Vegas) – @stevedawson and @hulljon will be attending the conference. The WP7S experience will also be making an appearance … with updated content!
- CTIA (March 22-25, Vegas) – WP7S experience will be out in force at this international wireless conference.
A quick thanks to all the peeps that helped out with RockstAR: