When it comes to travel, people care more about where they’re going than how they’re getting there. Delta Airlines understands this and asked Razorfish’s Emerging Experiences team to create an engaging experience for the WIRED Holiday Store in NYC. We wanted to tap into users’ imagination and sense of playfulness so that they walk away from the experience thinking about what destinations they want to visit next – and, of course, about Delta.
In four weeks we concepted, designed, developed and launched The Untravel Idea – a new, personal way for leisure travelers to encounter destinations. We wanted users to touch the future of travel.
The experience is open-ended and creative, putting the user in control. First, users choose the type of mood they are looking for on their next getaway. From there they select from a wide variety of relevant words that match their mood and, when put together, reveal a range of destination possibilities. The result is a beautiful montage of photographic imagery that transports the user’s imagination.
To extend the experience beyond the store, users are prompted to use their mobile device to snap a pic of a QR tag associated with each destination that allows them to explore additional destination info, video and travel packages.
The event was a complete success. People couldn’t wait to see where Delta would take them next. The takeaway: Delta is not just an airline – it’s giving people new ways to discover travel destinations.
This is the future of travel … and it’s just the beginning.
Our team was asked to help launch the Microsoft Windows Phone 7 project at Mobile World Congress 2010. The project was a whirlwind experience – starting with 5 weeks of design/development and 11 days of deployment and support that spanned 2 continents and countless late nights. It was all worth it when Steve Ballmer made the introduction and we were all a part of history as the next generation of mobile experiences was announced to the 50k MWC attendees and a larger worldwide audience. The people lucky enough to be in attendance couldn’t wait to get their hands on the experience we built.
We set up 10 touchscreens in 2 locations, and the experiences were in constant use. Microsoft has since taken the touchscreens to countless other events including MIX10, SXSW 2010, CES, CTIA and many, many more.
In addition to the touchscreen experiences, we also worked with our Seattle team to produce a microsite experience that would allow those not in attendance to get a taste of the phone.
Since we were running the experience in the Microsoft booth, we decided to add some new characters – the most popular of which being Steve Ballmer:
We used the experience as a way to engage with conference attendees and demonstrate an innovative use of Microsoft Tag technology. As conference attendees had their RockstAR snapshot taken, we’d ask them to download the tag reader application to their mobile device. Afterwards, they could take a snapshot of the Microsoft Tag and retrieve their photo. We took over 300 photos at the event.
The RockstAR experience is another example of how tag technology can extend an interactive in-store experience to a customer’s mobile device. Wishlists, shopping carts, mobile content delivery, product ratings & reviews and wayfinding are just a few of the ways tag technology can change the way people shop in retail.
We recently had the opportunity to debut the RockstAR experience at SXSW – check out video of the experience in action. We like to think of it as the classic photo booth taken to the next level with augmented reality, multi-touch and social integration. Let’s go behind-the-scenes and take a look at both the software and hardware that brings this experience to life.
First, let’s talk software. The application was built on the recently announced Razorfish Vision Framework. The framework provides a platform to power augmented reality, gestural and other vision-based experiences. For the RockstAR experience, we analyze each frame coming from an infrared camera to determine if faces are found in the crowd. When a face is detected, it is assigned a unique ID and tracked. Once we have a lock on the face, we can pass position and size information to the experience, where we augment animations and graphics on top of the color camera feed. This technology has a variety of uses; for instance, face tracking can be used to track impressions on static or interactive digital experiences in the retail environment. Here is a screenshot taken from the debug mode of the experience, which shows the face tracking engine at work using the infrared camera.
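The ID-assignment step can be sketched with a simple centroid tracker. This is an illustrative, pure-Python stand-in (the class name and distance threshold are ours, not the Razorfish Vision Framework’s API): each frame’s detections are matched to existing tracks by nearest centroid, and anything unmatched becomes a new face.

```python
import math

class FaceTracker:
    """Toy centroid tracker: gives each detected face a stable ID and
    matches detections across frames by nearest centroid. Illustrative
    only -- the real pipeline detects faces in an infrared camera feed."""

    def __init__(self, max_dist=50.0):
        self.next_id = 0
        self.tracks = {}          # track id -> (x, y, w, h) box
        self.max_dist = max_dist  # max centroid movement between frames

    @staticmethod
    def _center(box):
        x, y, w, h = box
        return (x + w / 2.0, y + h / 2.0)

    def update(self, detections):
        """detections: list of (x, y, w, h) boxes for the current frame.
        Returns {track_id: box} so the renderer can overlay graphics
        at each face's position and size."""
        new_tracks = {}
        unmatched = dict(self.tracks)
        for box in detections:
            cx, cy = self._center(box)
            best_id, best_d = None, self.max_dist
            for tid, old in unmatched.items():
                ox, oy = self._center(old)
                d = math.hypot(cx - ox, cy - oy)
                if d < best_d:
                    best_id, best_d = tid, d
            if best_id is None:          # new face entered the frame
                best_id = self.next_id
                self.next_id += 1
            else:
                unmatched.pop(best_id)   # matched an existing track
            new_tracks[best_id] = box
        self.tracks = new_tracks         # unmatched tracks are dropped
        return new_tracks
```

In use, a face that moves a little between frames keeps its ID, which is what lets the overlay graphics stick to the same person as they move through the crowd.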
In addition to the vision-based technology, the experience was fully multi-touch enabled – users can gesture on a virtual joystick to swap out bands and snap pictures.
Because the classic photo booth is a social activity, we took it to the next level with Twitter and Flickr integration. As pictures were snapped, we’d immediately make them available online. A QR code was rendered with each picture to let users quickly navigate to their RockstAR photo on a mobile device. Once the experience is extended to mobile, users can email the pictures to friends, set them as wallpaper, retweet them to their Twitter followers, etc.
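The mobile hand-off boils down to encoding a per-photo URL into the on-screen QR code and keeping the accompanying status update inside Twitter’s character budget. A minimal sketch with a placeholder base URL and hypothetical helper names (not the production code):

```python
# Assumed scheme: each snapshot gets a numeric id, and the QR code
# simply encodes a URL built from it. BASE_URL is a placeholder.

BASE_URL = "http://example.com/rockstar"

def photo_url(photo_id):
    """The URL encoded into the QR code for this snapshot."""
    return "%s/%d" % (BASE_URL, photo_id)

def tweet_text(photo_id, band):
    """Compose the status update, trimmed to Twitter's 140-char limit."""
    text = "Just rocked out with %s at the RockstAR booth! %s" % (
        band, photo_url(photo_id))
    return text[:140]
```

The QR rendering itself would be handled by a tag/barcode library; the point here is just that the same photo id drives the tweet, the Flickr upload and the code on screen.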
Let’s move on to hardware. Unfortunately, you can’t purchase infrared AR-ready cameras at your local Walmart… at least not until Project Natal comes out later this year. We therefore needed to build a dual-camera system that would support face tracking in infrared alongside the color video feed for display on the screen. We decided to go with 2 commercial-grade Firefly MV cameras with custom lenses.
We modified one of the cameras to see only infrared light by replacing its IR-blocking filter with an IR band-pass filter. This allows only a narrow range of infrared light to reach the camera’s CCD.
We also purchased and tested a variety of infrared illuminators. These are used to illuminate the environment with invisible infrared light allowing the infrared camera to accurately track faces in low-light conditions.
Sparks were flying as we fused the color and infrared cameras together — just another day at the office.
We created a portable rig for the camera and infrared illuminators. Adjustable camera mounts and industrial strength velcro provide flexibility and portability across a variety of installations.
We used a presentation remote clicker as an alternative way to drive the experience. We primarily used it as a remote camera trigger which allowed us to quickly snap pictures of unsuspecting people from a distance.
The experience was powered by a 55″ multi-touch screen and a PC provided by DFI Technologies. We’ve been working with DFI to build PCs that will power the next generation of interactive experiences. These PCs have a small form factor and can be mounted behind the multi-touch screen.
Last but not least, we bring you the pink rug. We can’t reveal too much information about this technology… we need to keep some things secret. Just know that it is critical to the overall experience.
We recently created an experience named RockstAR that features augmented reality and multi-touch technology. It is the classic photo booth experience taken to the next level with interactive technology, social integration (currently the experience posts to Twitter, TwitPic and Flickr), good ole fashioned Rock n Roll and a little 80s video game nostalgia. We also can’t leave out the pink rug – one of the most important parts of any experience.
The application is the first demonstration of the Razorfish Vision Framework (RVF) and it is integrated with our Razorfish Touch Framework (RTF). The experience was featured at several SXSW Interactive Conference 2010 events including the Razorfish and Microsoft parties.
Stay tuned as we’ll be posting a behind-the-scenes tech walkthrough in the next week.
The Razorfish team in Germany partnered with Realtime Technology AG to build configuration experiences for the Audi A1 world premiere at the international motor show in Geneva. They are designed to attract and engage young people and to demonstrate the wide range of customization possibilities of the new Audi.
The first experience is located on the main stage, featuring a 24″ multi-touch display that lets users interact with the car configuration, plus an additional 65″ display with synchronized high-definition 3D rendering in real time to garner even more attention. The complex configuration scenario is wrapped in a simple, easy-to-use interface. The application is based on Windows 7 and the Razorfish Touch Framework.
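Keeping the large display in lockstep with the touch configurator is at heart a publish/subscribe problem: the touch UI owns the configuration state and broadcasts every change to any render clients listening. A minimal sketch with illustrative names (the production setup presumably used RTT’s real-time rendering pipeline, not this):

```python
# Hypothetical sketch: the touch configurator publishes option changes,
# and the 65" render display subscribes to redraw the car.

class ConfiguratorState:
    def __init__(self):
        self.options = {}     # e.g. {"color": "misano red", "roof": "contrast"}
        self.listeners = []   # callbacks for synchronized displays

    def subscribe(self, callback):
        """Register a render client; called with (key, value) on change."""
        self.listeners.append(callback)

    def set_option(self, key, value):
        """Apply a change from the touch UI and fan it out to listeners."""
        self.options[key] = value
        for cb in self.listeners:
            cb(key, value)    # e.g. push to the big-screen renderer
```

The design choice is that the touch display never talks to the renderer directly; both just agree on the shared state, so more screens (or a Surface table) can join without changing the UI code.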
The second configurator runs on Microsoft Surface and is based on the Audi A4 configurator. The multi-user environment allows individuals to place physical tokens on the table and configure their favorite A1 in a collaborative way. The extravagant competition kit adds exciting new possibilities to spice up the user’s virtual car.
Both configurators can be seen live at Geneva Motor Show until March 14, 2010.
Before we left for the evening, we recorded a quick walkthrough of the Windows Phone booth and EMC (Executive Meeting Center) locations where we have touch experiences deployed to support the Windows Phone 7 Series launch event.
Members of the press and blogging community have been recording video of the experience throughout the conference. These videos have begun appearing online – here are a couple of the videos we’ve found:
After a long night of celebrating the successful launch of Windows Phone 7 Series in Barcelona, we are back at the Windows Phone booth at Mobile World Congress. The crowds are still huge and the experiences are running great. Each experience is collecting touch and interaction information in the background – we are going to begin processing this information to determine how many sessions we are seeing, average session time, the most popular areas of the experience, etc. We will use this information as a guide to optimize the experience for the next event.
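Turning raw touch logs into session counts and durations can be as simple as splitting the event stream on idle gaps. A hedged sketch: the 30-second gap and the event shape are assumptions for illustration, not the instrumentation we actually shipped.

```python
from collections import Counter

SESSION_GAP = 30.0  # seconds of inactivity that ends a session (assumed)

def summarize(events):
    """events: time-sorted list of (timestamp_sec, screen_area) touches.
    Returns (session_count, avg_session_seconds, most_popular_area).
    A new session starts whenever the gap since the previous touch
    exceeds SESSION_GAP -- a simple heuristic for walk-up kiosks."""
    sessions = []
    areas = Counter()
    start = last = None
    for ts, area in events:
        areas[area] += 1
        if last is None or ts - last > SESSION_GAP:
            if start is not None:
                sessions.append(last - start)  # close previous session
            start = ts
        last = ts
    if start is not None:
        sessions.append(last - start)          # close the final session
    count = len(sessions)
    avg = sum(sessions) / count if count else 0.0
    popular = areas.most_common(1)[0][0] if areas else None
    return count, avg, popular
```

Run over a day of logs per screen, this yields exactly the figures mentioned above: number of sessions, average session time and the most-touched areas of the experience.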
The Windows Phone team is showing live projected demonstrations of the device in the theatre area – these demonstrations are attracting huge crowds.
1:13pm (GMT+1) – Day 2, Mobile World Congress. Video of Albert Shum, Director of Mobile Experience Design at Microsoft, using the multi-touch experience built by Razorfish as a tool to explain some of the thinking that Microsoft put into the user interface. Razorfish partnered with the Microsoft team to deliver multi-touch experiences which emulate a Windows Phone 7 Series device. The experience is deployed on multiple 40″ multi-touch displays at Mobile World Congress in Barcelona.
Members of the press camped out at the Windows Phone press lounge located across the plaza from Mobile World Congress. Because of the huge turnout for the announcement, much of the press watched the launch event live from the downstairs press lounge. After the show, we launched 6 experiences at this location, allowing members of the press to touch and interact with Windows Phone 7 Series for the first time.
Members of the press who weren’t able to watch the event in the theatre or the press lounge huddled around screens outside in the reception area. We went live with 2 experiences at this location.
Conference attendees watching the event live at the Windows Phone booth at Mobile World Congress. We had an additional 2 experiences running at this location.
Cameras were out as the interface was unveiled for the first time. The phone interface design was kept secret up until launch day, and preventing pictures and other leaks of information from reaching the press turned out to be a huge undertaking. The Windows Phone team went to great lengths to prevent leaks – in fact, many of the Microsoft employees working on the team never saw the interface until launch day. We based our experience on some hands-on time in Redmond and videos of the device, and our team was able to reverse-engineer the design, animation and interaction of the user interface. Accuracy was extremely important: we had to ensure the design and motion in our experience were a perfect re-creation of the experience on the actual device. We built the experience on top of the Razorfish Touch Framework, which allowed us to rapidly develop the application from scratch in under 4 weeks.
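Re-creating motion from video usually means picking a parametric easing curve and tuning it against frame-by-frame footage until the timing matches. The curve below is a standard exponential ease-out, shown purely as an illustration of the approach; we can’t know the exact curves the actual phone UI used.

```python
# Illustrative only: a common exponential ease-out curve of the kind
# used to recreate fast-start, slow-settle panning animations.

def ease_out_expo(t):
    """Map normalized time t in [0, 1] to eased progress in [0, 1]."""
    if t >= 1.0:
        return 1.0
    return 1.0 - 2.0 ** (-10.0 * t)

def animate_x(start, end, t):
    """Interpolate a pan from start to end at normalized time t."""
    return start + (end - start) * ease_out_expo(t)
```

Tuning then becomes a matter of sampling the element’s position in a handful of video frames and adjusting the curve (or its duration) until the interpolated positions line up.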
The product launch was a huge success and the Windows Phone team has been celebrating in Barcelona. The reaction from the press and blog community has been overwhelmingly positive. The conference is far from over but so far we are off to a great start!