Lowe's

Revolutionizing the retail experience through app-based augmented reality.
The Challenge

Tackle ever-evolving AR technology to build an easy-to-use 3D space planning app for Lowe’s Innovation Labs.

The Solution

We crafted two sleek, powerful applications for devices enabled with Google's Tango platform that guarantee ease of use and functionality for both customers and Lowe’s alike.

The Result

We helped launch two first-to-market products for Lowe’s, forever changing how customers can shop for, design, and improve their homes – defining the future of the retail experience.

Lowe’s Vision: Crafting a Unique Partnership

Lowe’s Innovation Labs (LIL) had begun working on a prototype of an app that would enable customers to use augmented reality as a home improvement tool. Finger Food brought this to life.

Lowe’s aim for the app was to let their customers measure any room quickly and easily, map out the space, and then style it with Lowe's products, using a blend of the real and the digital. When we came on board, LIL had formed a partnership with Google and Lenovo to power this app with Tango (Google’s augmented reality computing platform) and Lenovo’s Phab 2 Pro smartphone – both of which were still in development at the time.

Stepping into the project, our goal was to ramp up development on the prototype to meet aggressive timelines that were dependent on two major events: demos at Google I/O in May 2016 and Lenovo Tech World in June 2016.

Our ability to work resourcefully and inventively was recognized by Lowe’s right away. As the project unfolded, we worked collaboratively with LIL to break new ground in the world of retail.

"Finger Food was an integral part of bringing our Lowe's Vision platform from concept to reality. They proved that they were invested in moving the needle for VR and AR software. Finger Food was not afraid to dive deep and overcome hard problems to make the technology shine."

Mason Sheffield
Director of Lab Technology, Lowe's Innovation Labs
Responding to Ever-Evolving Platforms and Devices

LIL has stepped into a world of emerging technologies. We built the Vision app to embrace these.

A big challenge we had to tackle was that, as we developed the Vision app and new ideas emerged, we were also working with technology that was still in development and with devices that were not yet production ready. Google's Tango platform, for example, had not yet been released to the public, and the Phab 2 Pro was still being developed.


Because of how the app was being built and how new the underlying platforms were, we had to create new ways for the various elements within the app to communicate and interact with each other, without compromising functionality or ease of use. As the demand for new features grew and the platform and hardware changed, we demonstrated our adaptability and ingenuity, quickly growing our team to meet these needs.

More than any other mobile platform, Tango technology from Google integrates directly into the environment around it, and we had to keep this in mind at every stage of design, development, and testing to ensure smooth, accurate 3D representation.

We also had to dig into the specific ways the Lowe’s Vision app would be used and the environments we expected the device to be in to drive our process of technological experimentation and iteration forward.

Ensuring an Unmatched User Experience

Lowe’s has created a new way for customers to shop. Finger Food helped launch this first-to-market product.

The Lowe’s Vision app allows customers to use a Tango-enabled smartphone like the Lenovo Phab 2 Pro to map and measure their space, visualize how different Lowe’s products would look and fit within the space, and get ideas for how to improve their home. It also changes how customers can shop.

In keeping with LIL’s vision for the app, we wanted to ensure that what we built would allow this augmented reality technology to be used in an accessible and unintimidating way. We were able to empower Lowe’s customers and the LIL team at the same time by providing an insightful user experience, a highly crafted user interface, and innovative engineering.

Lowe’s Vision: In-Store Navigation - Rethinking the In-Store Shopping Experience

What do you do when your client has stores that are over 100,000 square feet in size? You build an app that uses AR to help customers find the products they need.

Building on what Lowe’s Innovation Lab and Finger Food achieved with Lowe’s Vision, the Lowe’s Vision: In-Store Navigation app allows customers to use a Tango-enabled device to efficiently navigate through Lowe’s stores and find the products they need.


Simplifying the home improvement project process, the app allows users to create a shopping list of what they need and includes access to product reviews. This makes it easy for customers to choose the right products and then keep track of each item while shopping. Acting like a visual GPS, the app gives customers digital directions that literally appear in front of them, leading them step by step to each Lowe’s product on their shopping list via the most efficient route.
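The "visual GPS" idea can be sketched conceptually. The following Python snippet is not the shipped implementation (which was built in Unity on the Android SDK); it is a minimal illustration, with the store floor modeled as a hypothetical grid of walkable cells, of how a shopping list might be turned into an efficient step-by-step route.

```python
from collections import deque

def shortest_path(grid, start, goal):
    """BFS over a set of walkable floor cells; returns the cell-by-cell path."""
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in grid and nxt not in came_from:
                came_from[nxt] = cur
                queue.append(nxt)
    return None

def route_shopping_list(grid, start, items):
    """Greedy ordering: repeatedly walk to the closest remaining list item."""
    route, pos, remaining = [], start, list(items)
    while remaining:
        paths = [(shortest_path(grid, pos, item), item) for item in remaining]
        path, item = min((p for p in paths if p[0]), key=lambda p: len(p[0]))
        route.extend(path[1:] if route else path)  # avoid duplicating the junction cell
        remaining.remove(item)
        pos = item
    return route
```

A production system would also weigh aisle layout and real shelf positions, but the core of a most-efficient-route feature is exactly this: shortest paths between the shopper and each item, stitched into one tour.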

Lowe’s Vision: In-Store Navigation app not only makes the shopping experience more efficient and more fun for customers, it also helps Lowe’s workflow. The app can notify stores of low stock, which leads to quicker reordering and restocking, ensuring staff can spend more time advising customers on their projects.

The app uses a combination of visual mapping, depth perception, and area learning to provide a simplified and more enjoyable retail experience. Our team was able to deliver the experience on time, with all the committed features fully implemented.

Technology: Lowe’s Vision

Custom features + Unity gaming engine + Android SDK + Google's Tango Platform = Lowe’s Vision App.

Although it increased the technical complexity of the project, our team made a key decision early on that would improve the user experience of the app. We wanted the app to feel like a native Android app, while still being able to use the content pipeline and tooling that came with the Unity game engine. To accomplish this, we built a hybrid app that uses the Android SDK for the UI and the Unity engine for the 3D environment and communication with the Google Tango platform.


Unity and Android use completely different programming environments and run separately from each other on the device. To achieve the level of integration we needed, we built our own communication layer that allows them to talk to each other.
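As a rough, hypothetical illustration of that kind of layer (not the actual bridge we built): because calls across the Unity/Android boundary typically carry only flat strings, structured data gets serialized, passed across, and decoded by registered handlers on the other side. The sketch below models that pattern in Python.

```python
import json

class MessageBridge:
    """Illustrative stand-in for a Unity<->Android communication layer:
    both sides exchange flat string messages, so structured payloads
    are serialized to JSON at the boundary."""

    def __init__(self):
        self._handlers = {}

    def register(self, topic, handler):
        # The receiving side (e.g. the Unity scene, or the Android UI)
        # subscribes to the message topics it cares about.
        self._handlers.setdefault(topic, []).append(handler)

    def send(self, topic, payload):
        # In a real hybrid app this string would cross the process/runtime
        # boundary; here we just hand it straight to the dispatcher.
        message = json.dumps({"topic": topic, "payload": payload})
        self._dispatch(message)

    def _dispatch(self, message):
        decoded = json.loads(message)
        for handler in self._handlers.get(decoded["topic"], []):
            handler(decoded["payload"])
```

For example, an Android UI tap on a product card could `send("product_selected", {"sku": ...})`, and the Unity side's registered handler would place the corresponding 3D model into the scene.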

In the app, interacting with Google's Tango platform takes two forms. First is the device’s “pose” – its position and orientation in space. This is enabled by a Tango technology called “Motion Tracking.” Once you start running a Tango-enabled app, the device is able to tell exactly how it is being moved and rotated. This all manifests itself within the Unity world as camera movement; moving and rotating the device has a corresponding effect on the Unity camera.
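The pose-to-camera mapping can be sketched in simplified 2D form. This is a conceptual illustration only (the real integration used Tango's pose callbacks and Unity's camera transform, with full 3D position and orientation): each pose update repositions a virtual camera, so world-space content stays anchored while the view moves.

```python
import math

class VirtualCamera:
    """Mirrors a device pose (position + heading) onto a virtual camera,
    analogous to how Motion Tracking drives the Unity camera.
    Simplified to 2D with a single yaw angle for illustration."""

    def __init__(self):
        self.position = (0.0, 0.0)
        self.yaw = 0.0  # heading in radians

    def apply_pose(self, position, yaw):
        # Called on every pose update from the tracking system.
        self.position = position
        self.yaw = yaw

    def world_to_camera(self, point):
        # Express a fixed world-space point in camera space:
        # translate by the camera position, then rotate by the
        # inverse of the camera's heading.
        dx = point[0] - self.position[0]
        dy = point[1] - self.position[1]
        c, s = math.cos(-self.yaw), math.sin(-self.yaw)
        return (c * dx - s * dy, s * dx + c * dy)
```

Because virtual objects live in world space and only the camera moves, a rendered sofa stays put in the room as the user walks around it.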

The second is using depth perception. Tango devices have sensors that continuously scan the environment in front of them; those scans are made available in the form of a point cloud. Because of the underlying sensor technology and how often the device performs a scan, the point cloud can appear to shimmer from moment to moment. Different materials and lighting conditions can also cause distortions that appear as geometry that isn’t there. The combination of these factors meant that we had to develop our own understanding of the environment around the device and build an abstraction of that environment, tailored to our app.
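One common way to build that kind of abstraction (a hypothetical sketch, not our actual pipeline) is temporal filtering: bucket raw points from several scans into voxels, keep only voxels observed across enough frames, and average their positions. That damps frame-to-frame shimmer and rejects one-off phantom geometry.

```python
from collections import defaultdict

def stabilize_point_cloud(frames, cell=0.05, min_frames=3):
    """Illustrative temporal filter over several point-cloud scans.
    Points are bucketed into `cell`-sized voxels; a voxel must appear
    in at least `min_frames` scans to survive, and surviving voxels
    are reported as the average of their member points."""
    buckets = defaultdict(list)   # voxel key -> raw points
    hits = defaultdict(int)       # voxel key -> number of frames seen in
    for frame in frames:
        seen = set()
        for x, y, z in frame:
            key = (round(x / cell), round(y / cell), round(z / cell))
            buckets[key].append((x, y, z))
            seen.add(key)
        for key in seen:
            hits[key] += 1
    # Average each stable voxel; transient noise never reaches min_frames.
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for key, pts in buckets.items() if hits[key] >= min_frames]
```

A real pipeline would go further (plane fitting, meshing, hole filling), but the principle is the same: the app reasons about a cleaned-up model of the room, not the raw sensor stream.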

Another challenge we faced was that the Lenovo Phab 2 Pro was still being actively developed while we worked on the app. We had to write the majority of the app on a different device, the Tango Tablet Development Kit, which works differently, while ensuring compatibility with the Phab 2 Pro.

At the same time, in the lead up to public release, the Tango SDK was also under very active development and was changing month to month, sometimes week to week. Often the functionality we depended on was altered or removed altogether, and we had to adapt continuously to be able to keep the Lowe’s Vision app working as we pushed to completion.

"The industry partnership between Finger Food, Lowe's Innovation Labs, and Google enabled us to work on bleeding-edge technology before it was released to the public. We were given the opportunity to build experiences that no one had ever seen, inventing new ways of interacting with the world around us."
Daryl Hawkins
Technical Director, Finger Food Studios

Technology: Lowe’s Vision: In-Store Navigation

Along with a tight two-month deadline for production, the biggest challenge we faced while developing Lowe’s Vision: In-Store Navigation app was testability.

The app only functions in physical store locations outside of Canada, so testing any of the work we had done meant travelling to the US. The specific physical locations we could test in weren’t something we could control, so we decided to build a system that would let us emulate those locations anywhere we wanted. In effect, we hid entire Lowe’s stores in our facility using augmented reality.

To accomplish this, we used a Google Tango technology called Area Learning. This allows a Tango device to capture the visual features of a location, store them and then later recognize that same location. The features that are captured are so detailed that the device can later recognize exactly where it is in the original scanned space.


In combination with pose tracking, the device can extrapolate its position out from the original scan with high accuracy, while still being anchored to the original location. The technology we built let us create Area Definition Files of our office and make the device believe it was actually in a Lowe’s store. This unblocked us during early development and let us quickly iterate on the UI and user interaction models.
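Conceptually, localization against an Area Description File works like fingerprint matching: many scans of a location are collapsed into a set of distinctive visual features, and the device later picks the stored area whose fingerprint best overlaps what it currently sees. The sketch below is a toy model of that idea (the feature sets and scoring are invented for illustration; Tango's actual feature descriptors and matching are internal to the SDK).

```python
def build_area_description(scans):
    """Illustrative stand-in for an Area Description File: collapse many
    scans of one location into a single fingerprint, modeled here as a
    set of hashed visual features."""
    fingerprint = set()
    for features in scans:
        fingerprint.update(features)
    return fingerprint

def localize(observed, area_files):
    """Pick the stored area whose fingerprint best overlaps the features
    currently in view, mimicking how a device re-recognizes a
    previously learned space."""
    best, best_score = None, 0.0
    for name, fingerprint in area_files.items():
        overlap = len(observed & fingerprint)
        score = overlap / max(len(observed), 1)
        if score > best_score:
            best, best_score = name, score
    return best, best_score
```

This is also why scanning our own office (and later the Holodeck) as an "area" worked as a test harness: once a space is captured as a fingerprint, the app treats recognizing it exactly like recognizing a real store.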

However, this was only a stopgap solution; our office is much smaller than a real Lowe’s store, so our testing was only partially valid because we weren’t using the app at the same scale as end users. Our creative workaround? We reached out to Google and asked them to scan our 25,000 sq ft Holodeck location, effectively making it another Lowe’s store that the app would recognize.

Our strengths in adaptability, creative thinking, and multi-faceted tech solutions meant we were able to help make Lowe’s Vision and Lowe’s Vision: In-Store Navigation a reality, revolutionizing the retail experience.
