Tag: rockstAR

RockstAR on Tour: Web 2.0 Expo San Francisco

May 09, 2010 in Augmented Reality, Mobile, Multi-touch, Technology, Touchscreen

We took the show on the road for the Web 2.0 Expo in San Francisco. We worked with the Microsoft Tag team to bring the RockstAR augmented reality experience to the event.

web20-1

Since we were running the experience in the Microsoft booth, we decided to add some new characters – the most popular of which was Steve Ballmer:

ballmer_shot2

We used the experience as a way to engage with conference attendees and demonstrate an innovative use of Microsoft Tag technology. As attendees had their RockstAR snapshot taken, we’d ask them to download the Tag reader application to their mobile device. Afterwards, they could scan the Microsoft Tag to retrieve their photo. We took over 300 photos at the event.

web20-2

The RockstAR experience is another example of how tag technology can extend an interactive in-store experience to a customer’s mobile device. Wishlists, shopping carts, mobile content delivery, product ratings & reviews and wayfinding are just a few examples of how tag technology can change the way people shop in retail.

Check out our pictures from the event.


The Technology Behind RockstAR

Apr 13, 2010 in Augmented Reality, Lab, Multi-touch, Technology

We recently had the opportunity to debut the RockstAR experience at SXSW – check out the video of the experience in action. We like to think of it as the classic photo booth taken to the next level with augmented reality, multi-touch and social integration. Let’s go behind the scenes and take a look at both the software and hardware that bring this experience to life.

RockstAR

First, let’s talk software. The application was built on the recently announced Razorfish Vision Framework. The framework provides a platform to power augmented reality, gestural and other vision-based experiences. For the RockstAR experience, we analyze each frame coming from an infrared camera to determine whether any faces are present in the crowd. Once a face is detected, it is assigned a unique ID and tracked. With a lock on the face, we can pass position and size information to the experience layer, which augments animations and graphics on top of the color camera feed. This technology has a variety of uses; for instance, face tracking can be used to measure impressions on static or interactive digital experiences in the retail environment. Here is a screenshot taken from the debug mode of the experience, which shows the face tracking engine at work using the infrared camera.

face tracking
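
The Razorfish Vision Framework itself isn’t something we can paste here, but the detect-and-track loop at its heart is easy to sketch. Below is a minimal, hypothetical Python version using OpenCV: the Haar cascade detector, the device index and the distance threshold for re-associating faces between frames are all illustrative assumptions, not what RVF actually uses.

import cv2

# Stock frontal-face Haar cascade that ships with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

next_id = 0
tracked = {}  # face_id -> (x, y, w, h) from the previous frame

def assign_id(rect, previous, max_dist=60):
    """Reuse an existing ID if a face from the last frame is close enough."""
    global next_id
    x, y, w, h = rect
    cx, cy = x + w // 2, y + h // 2
    for face_id, (px, py, pw, ph) in previous.items():
        if abs(cx - (px + pw // 2)) < max_dist and abs(cy - (py + ph // 2)) < max_dist:
            return face_id
    next_id += 1
    return next_id

cap = cv2.VideoCapture(0)  # assumed device index of the infrared camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    current = {}
    for (x, y, w, h) in faces:
        face_id = assign_id((x, y, w, h), tracked)
        current[face_id] = (x, y, w, h)
        # Position and size are what the experience layer needs in order to
        # composite hair, guitars and other props over the color feed.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, "id %d" % face_id, (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    tracked = current

    cv2.imshow("debug view", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()

A production tracker would re-associate faces more robustly (optical flow, a Kalman filter, and so on), but the hand-off is the same: an ID plus a position and size for each face, which the experience layer uses to draw graphics onto the color feed.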

In addition to the vision-based technology, the experience is fully multi-touch enabled – users can gesture on a virtual joystick to swap out bands and snap pictures.

joystick
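
The joystick logic amounts to mapping a touch point’s offset from the stick’s center into a discrete action. A hypothetical Python helper, with a made-up center position, dead zone and angle ranges, might look like this:

import math

JOYSTICK_CENTER = (960, 900)  # assumed on-screen position of the stick
DEADZONE = 30                 # pixels the finger must travel before acting

def joystick_action(touch_x, touch_y):
    """Translate a touch point into a band-swap or photo-snap action."""
    dx = touch_x - JOYSTICK_CENTER[0]
    dy = touch_y - JOYSTICK_CENTER[1]
    if math.hypot(dx, dy) < DEADZONE:
        return None
    angle = math.degrees(math.atan2(-dy, dx)) % 360  # 0 = right, 90 = up
    if 135 <= angle <= 225:
        return "previous_band"
    if angle <= 45 or angle >= 315:
        return "next_band"
    if 225 < angle < 315:  # stick pulled toward the user
        return "snap_photo"
    return None

In the installation the input comes from multi-touch contact events rather than a single point; the sketch only shows the shape of the mapping.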

Because the classic photo booth experience is a social activity, we took it to the next level with Twitter and Flickr integration. As pictures were snapped, we’d immediately make them available online. A QR code was rendered with each picture to let users quickly navigate to their RockstAR photo on a mobile device. Once the experience is extended to mobile, users can email the pictures to their friends, set them as wallpaper, re-tweet them to their Twitter followers, and so on.

RockstAR twitter and flickr
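
Conceptually the sharing step is just “upload the photo, then render a QR code that points at its public URL.” The sketch below is illustrative only: the qrcode library and the upload_to_photo_host stub stand in for the real Twitter and Flickr plumbing.

import qrcode

def upload_to_photo_host(photo_path):
    """Hypothetical stub: in the installation this is where the
    Twitter/Flickr uploads happen."""
    pass

def publish_snapshot(photo_path, photo_url):
    """Push a freshly snapped photo online and render its QR code."""
    upload_to_photo_host(photo_path)

    # Encode the photo's public URL so a phone can jump straight to it.
    qr_image = qrcode.make(photo_url)
    qr_path = photo_path.rsplit(".", 1)[0] + "_qr.png"
    qr_image.save(qr_path)
    return qr_path

# Example: publish_snapshot("rockstar_042.jpg", "http://example.com/rockstar/042")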

Let’s move on to hardware. Unfortunately, you can’t purchase infrared AR-ready cameras at your local Walmart… at least not until Project Natal comes out later this year. Therefore, we needed to build a dual-camera system that would support face tracking in infrared alongside the color video feed for display on the screen. We decided to go with two commercial-grade Firefly MV cameras with custom lenses.

camera
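
To show how the two feeds split the work, here is a rough outline. The device indices are guesses, and in practice the Firefly MV cameras would typically be driven through a machine-vision SDK rather than generic OpenCV capture; treat this as a sketch only.

import cv2

ir_cam = cv2.VideoCapture(0)     # camera with the IR band-pass filter: tracking only
color_cam = cv2.VideoCapture(1)  # unmodified color camera: what the audience sees

while True:
    ok_ir, ir_frame = ir_cam.read()
    ok_color, color_frame = color_cam.read()
    if not (ok_ir and ok_color):
        break

    # Detect and track faces on the IR frame (see the earlier sketch), then map
    # the resulting rectangles onto the color frame before drawing augmentations.
    # Because the two lenses are physically offset, some calibration between the
    # frames would be needed.
    cv2.imshow("output", color_frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

ir_cam.release()
color_cam.release()
cv2.destroyAllWindows()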

We modified one of the cameras to see only infrared light by replacing its IR-blocking filter with an IR band-pass filter. This allows only a narrow range of infrared light to reach the camera’s CCD.

infrared filter

We also purchased and tested a variety of infrared illuminators. These flood the environment with invisible infrared light, allowing the infrared camera to accurately track faces in low-light conditions.

infrared illuminator

Sparks were flying as we fused the color and infrared cameras together — just another day at the office.

We created a portable rig for the cameras and infrared illuminators. Adjustable camera mounts and industrial-strength Velcro provide flexibility and portability across a variety of installations.

rig2

We used a presentation remote clicker as an alternative way to drive the experience. We primarily used it as a remote camera trigger, which allowed us to quickly snap pictures of unsuspecting people from a distance.

clicker
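
Most presentation clickers simply register as a USB keyboard sending slide-navigation keys, so turning one into a remote shutter takes little more than a key listener. A hypothetical Python sketch using the pynput library (the key choice and the snap_photo callback are assumptions):

from pynput import keyboard

def snap_photo():
    """Hypothetical callback: grab and save the current color frame."""
    print("snap!")

def on_press(key):
    # Many clickers emit Page Down for "next slide"; treat it as the shutter.
    if key == keyboard.Key.page_down:
        snap_photo()

listener = keyboard.Listener(on_press=on_press)
listener.start()
listener.join()  # keep listening until the process is stopped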

The experience was powered by a 55″ multi-touch screen and a PC provided by DFI Technologies. We’ve been working with DFI to build PCs that will power the next generation of interactive experiences. These PCs have a small form factor and can be mounted behind the multi-touch screen.

dfi

Last but not least, we bring you the pink rug. We can’t reveal too much information about this technology… we need to keep some things secret. Just know that it is critical to the overall experience.

rug


Augment Your Reality with RockstAR

Apr 07, 2010 in Augmented Reality, Experience Design, Multi-touch, Portfolio, Touchscreen

We recently created an experience named RockstAR, which features augmented reality and multi-touch technology. It is the classic photo booth experience taken to the next level with interactive technology, social integration (currently the experience posts to Twitter, TwitPic and Flickr), good ol’ fashioned rock ’n’ roll and a little ’80s video game nostalgia. We also can’t leave out the pink rug – one of the most important parts of any experience.

The application is the first demonstration of the Razorfish Vision Framework (RVF), and it is integrated with our Razorfish Touch Framework (RTF). The experience was featured at several SXSW Interactive 2010 events, including the Razorfish and Microsoft parties.

Stay tuned as we’ll be posting a behind-the-scenes tech walkthrough in the next week.


SxSW Interactive, MIX10, Game Developers Conference, CTIA … and that’s just March

Mar 10, 2010 in Augmented Reality, Experience Design, Lab, Multi-touch, Technology, Touchscreen

RockstAR SxSW Announcement

We’ve been very busy since the Windows Phone 7 Series experience launch in Spain, and there’s plenty more excitement over the next couple of weeks. We will be announcing the Razorfish Vision Framework at this year’s SxSW Interactive Conference. We have created a prototype application called RockstAR that mixes augmented reality and multi-touch, and we will be showcasing it at the Razorfish-hosted cocktail party on Saturday the 13th at the Paradise bar. Come by and augment your reality!

Also …

Below is a quick list of the conferences we’re attending in March. Please let us know if you’ll be there too, and we’ll make sure you get a chance to check out our experiences.

msft_speakeasy

  • GDC 2010 (March 9-13, San Francisco) – Lesley just returned from San Francisco, where we helped Microsoft showcase the WP7S experience in its Game Developers Conference booth.
  • SxSW 2010 Interactive (March 12-15, Austin) – We’re hosting a panel on the 12th; please come interact with us as we discuss “Touch + The Holy Grail of Delight” at 2pm (http://my.sxsw.com/events/event/6124). We are also launching a fun new AR experience at the Razorfish cocktail party. Finally, we will be at the Microsoft Speakeasy event showcasing the WP7S experience.
  • MIX10 (March 15-17, Vegas) – @stevedawson and @hulljon will be attending the conference. The WP7S experience will also be making an appearance … with updated content!
  • CTIA (March 22-25, Vegas) – WP7S experience will be out in force at this international wireless conference.

A quick thanks to all the peeps who helped out with RockstAR: