Jennifer Tidy, VP, ModiFace, helps beauty brands customize ModiFace’s augmented reality and imaging technology to create unique solutions for in-store, mobile and web platforms. Having worked at the intersection of augmented reality, computer vision and beauty for almost a decade, and having been instrumental in the growth of ModiFace-powered mobile apps, which recently surpassed 100 million downloads, Jennifer has a unique perspective on how to succeed digitally with consumers. Here, Jennifer writes about a unique technology that ModiFace is unveiling this week for recommending foundation shades, based on a large-scale scientific study involving 364,000 participants.

As consumers continue to seek out advice on everything from makeup to skin care products, it’s clear that a key strategic growth area for almost every beauty brand is to provide personalized beauty recommendations.

ModiFace, the largest provider of augmented reality technology to beauty brands, is keen on cracking one of the most difficult personalized beauty recommendation problems today: foundation matching. The task, one that eludes even the most makeup-savvy consumer, typically requires an in-depth, person-to-person consultation with an experienced beauty adviser. And lots of trial and error.

Realizing this, in 2012 ModiFace began a project to explore scientific approaches to the challenges of personalized beauty. More specifically, it set out to detect the exact skin tone of users from their photos as a basis for recommending foundation shades.

Detecting a skin tone from a user photo might seem simple and straightforward. Why not just take any photo, extract the skin color and use that as the estimate of the user’s skin tone? It turns out that this simple method becomes extremely inaccurate under different lighting conditions and with different cameras. Depending on how light reflects off the skin, and on the color characteristics of the camera, a single photo can yield a vastly different skin-tone estimate. And if the skin-tone estimate is wrong, so is the recommended foundation shade.
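To see why the single-photo approach breaks down, here is a minimal sketch (illustrative names and numbers of my own, not ModiFace’s code): averaging the skin pixels of one photo bakes that photo’s lighting and camera response directly into the estimate.

```python
import numpy as np

def naive_skin_tone(image, skin_mask):
    """Naive single-photo estimate: average the RGB values of skin pixels.

    The lighting and camera color response are folded into the result,
    so two photos of the same person can give very different answers.
    """
    skin_pixels = image[skin_mask]  # shape (n_pixels, 3)
    return skin_pixels.mean(axis=0)

# Hypothetical demo: the same skin seen under neutral vs. warm lighting.
rng = np.random.default_rng(0)
true_tone = np.array([200.0, 160.0, 130.0])
neutral = true_tone + rng.normal(0, 2, size=(500, 3))
warm = true_tone * np.array([1.15, 1.0, 0.8]) + rng.normal(0, 2, size=(500, 3))

mask = np.ones(500, dtype=bool)
print(naive_skin_tone(neutral, mask))  # close to the true tone
print(naive_skin_tone(warm, mask))     # shifted toward orange
```

The second estimate is noticeably redder and less blue than the first, even though the underlying skin is identical; any foundation shade matched to it would be wrong.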

As ModiFace began this exploration, it realized that it is mathematically possible to estimate lighting information through differential analysis of images (i.e. by comparing a pair of photos of the same person). This extends naturally to multiple images, making it possible to accurately estimate the lighting conditions of each photo, the camera parameters and the skin tone of the person in the images (which is ultimately what ModiFace cared about most). The more images of the person, the more accurate the skin-tone estimate becomes. The challenge at this point was where to easily find multiple tagged photos of a person. Social media was the obvious answer.
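A toy model shows why more photos help (this is my own illustrative formulation, not ModiFace’s published math): treat each photo’s observed skin color as the true tone multiplied by a per-photo gain that bundles lighting and camera response. In log space the gains become additive, so if they scatter around neutral across many photos, they average out and the geometric mean recovers the shared tone.

```python
import numpy as np

def estimate_skin_tone(observed_colors):
    """Combine per-photo skin colors into one estimate.

    Model: observation_i = tone * gain_i (elementwise). Taking logs makes
    the per-photo gains additive; averaging cancels them out when they
    vary around neutral, leaving an estimate of the shared tone.
    """
    logs = np.log(np.asarray(observed_colors, dtype=float))
    return np.exp(logs.mean(axis=0))

# Hypothetical demo: one true tone seen through random per-photo gains.
rng = np.random.default_rng(1)
true_tone = np.array([190.0, 150.0, 125.0])
photos = [true_tone * np.exp(rng.normal(0, 0.15, size=3)) for _ in range(40)]
print(estimate_skin_tone(photos))  # approaches true_tone as photos accumulate
```

The error of the combined estimate shrinks roughly with the square root of the photo count, which matches the intuition that more tagged photos yield a more accurate skin-tone estimate.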

After months of perfecting and fine-tuning this technology, ModiFace created a prototype application that would, after getting permission from each user, access their social media images, analyze the photos to determine the most accurate skin-tone estimate and recommend foundation shades. The skin-tone estimate was then compared to the user’s real-life skin tone, and any errors were recorded as part of the study. After two years of data collection, more than 364,000 users had participated and 1.6 million photos had been analyzed.

Once all of the data was collected, the results were quite remarkable: 98.3 percent of users received a correct skin-tone estimate, corresponding to a CIE76 color error of 2.31. In other words, on average there was no noticeable difference between the detected skin tone and the true skin tone. Using multiple photos, which can be accessed instantly and seamlessly on social media, it is possible to solve one piece of the personalization puzzle.
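For context, CIE76 color error is simply the Euclidean distance between two colors in the CIELAB color space, and a delta-E around 2.3 is commonly cited as a just-noticeable difference. A short sketch (the Lab values below are hypothetical):

```python
import numpy as np

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB space.

    A delta-E near 2.3 is commonly treated as the threshold of a
    just-noticeable difference, which is why an average error of 2.31
    translates to "not a noticeable difference" for most users.
    """
    return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

# Hypothetical Lab values for an estimated vs. true skin tone.
estimated = (65.0, 18.0, 20.0)
true_tone = (66.3, 17.2, 18.5)
print(round(delta_e_cie76(estimated, true_tone), 2))  # → 2.14
```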

The results of this study are being presented this week by ModiFace scientists at the International Symposium on Multimedia in Miami. ModiFace is now looking into using this patent-pending technology to extract other information, including hair color and skin characteristics (such as texture, wrinkles and spots), as well as to build a makeup usage profile of users based on their social media photos.

The ultimate goal of all this is a personalized augmented reality system that can analyze one’s face, detect detailed facial information and preview recommended products on live video. Years ago, such a technology would have seemed like science fiction. However, ModiFace has reason to believe that 2016 will be the year many brands bring such a system to their customers, in stores and in mobile apps.