ModiFace, the leading provider of augmented reality technology, has addressed one of the biggest challenges facing the segment today: it has discovered a way to estimate the lighting of an environment, resulting in the truest beauty simulation possible. ModiFace will bring this upgraded technology, Light Field Rendering, which accounts for all necessary tone and lighting parameters, to its 70+ brand partners in 2017. Here, Parham Aarabi, Founder and CEO of ModiFace, a Professor of Computer Engineering at the University of Toronto and the author of over 150 scientific publications, discusses the innovation.

Ten years ago, ModiFace was started with the goal of enabling users to virtually try on beauty products. Much has happened since then, including the rise of powerful smartphones that enable live video makeup and skin-care simulation. From our first skin simulation app (created in partnership with Allergan in 2007) to the first live video makeup mirror (created in partnership with Sephora in 2013), more and more beauty brands have realized the benefits of augmented reality and joined us as partners. Today, over 70 global brands (including both Sephora and Allergan) use ModiFace’s technology to simulate their beauty products on mobile, web, messaging and in-store platforms. Based on this experience, we have observed time and again that when it comes to augmented reality, more realism always results in more success.

When realism is achieved, meaning that consumers believe the product simulation is a true depiction of the real-life product, they will try on, believe, and buy. And then they will come back, try on more, and buy more. The trust in the visualizations provides a frictionless way for customers to experiment with products and to buy, leading to increased sales and a better customer-brand relationship. If brands fail on the realism factor, none of these benefits will be achieved, and a great opportunity for increasing sales and engagement will be lost. With this realization, in 2010 we focused on creating a revolutionary technology that would redefine the level of realism in augmented reality apps.

In order to realistically simulate a beauty product, we needed to know the exact color and finish of the product, the exact tone and texture of the skin, and the lighting of the environment, also known as the Light Field. While most of these parameters can be estimated from a photo or video, the Light Field has always been the most elusive to estimate.
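As a rough, purely illustrative sketch (in Python; the class names and fields below are our own assumptions, not ModiFace’s internal representation), the inputs such a simulation would need could be grouped like this:

```python
from dataclasses import dataclass

# Hypothetical grouping of the rendering inputs described above.
@dataclass
class ProductAppearance:
    base_color: tuple        # target RGB shade of the product
    finish: str              # e.g. "matte", "satin", or "gloss"

@dataclass
class SkinProfile:
    tone: tuple              # average RGB of the subject's skin
    texture_strength: float  # relative amount of visible skin texture

@dataclass
class LightField:
    color_temperature: float  # estimated illuminant temperature (kelvin)
    intensity: float          # overall scene brightness
    left_right_ratio: float   # asymmetry of lighting across the face
```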

After nearly a decade of working on this problem, we discovered a way to estimate the Light Field by analyzing information in a photo, such as the hair and background, that had never been used before. By analyzing all of these elements, it became possible to estimate the Light Field and, based on it, render a true-to-life shade of a product. We call this type of augmented reality, which accounts for all necessary tone and lighting parameters, Light Field Rendering (LFR). LFR allows makeup colors to be shown realistically under all lighting situations and for all skin tones. It can even show makeup realistically when the lighting on the face varies from side to side. Initial user studies show a dramatic increase in the initial “wow” factor for LFR-enabled augmented reality apps, along with substantial increases in time spent (+47%), sharing (+22%), and conversions (+31%).
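To make the idea concrete, the toy Python sketch below derives a crude per-channel lighting estimate from the non-face pixels (background and hair) using a simple gray-world assumption, then tints a product color by that estimate before blending it onto the face. The function names, the gray-world heuristic, and the alpha blend are our own illustrative stand-ins, not ModiFace’s actual LFR algorithm.

```python
import numpy as np

def estimate_scene_light(image, face_mask):
    """Crude per-channel lighting estimate from non-face pixels.

    Averages the background and hair pixels (everything outside the face
    mask) under a gray-world assumption; a stand-in for the far more
    sophisticated analysis described in the article.
    """
    non_face = image[~face_mask].astype(np.float32)  # (N, 3) pixels outside the face
    mean_rgb = non_face.mean(axis=0)
    return mean_rgb / mean_rgb.mean()                # per-channel scale, ~1.0 on average

def render_shade(image, region_mask, product_rgb, light, opacity=0.7):
    """Blend a product color onto the masked region, tinted by the
    estimated lighting so the rendered shade matches the scene."""
    out = image.astype(np.float32).copy()
    lit_color = np.clip(np.asarray(product_rgb, dtype=np.float32) * light, 0, 255)
    out[region_mask] = (1 - opacity) * out[region_mask] + opacity * lit_color
    return out.astype(np.uint8)

# Example usage: image is an (H, W, 3) uint8 array, masks are (H, W) booleans.
# light = estimate_scene_light(image, face_mask)
# result = render_shade(image, lip_mask, product_rgb=(180, 40, 60), light=light)
```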

Our focus for 2017 will be on bringing LFR technology upgrades to all of our partners, and on applying LFR to visualization categories beyond makeup, including hair color simulation. These and other advancements in augmented reality will continue to increase the reliance of both consumers and brands on virtual try-on, providing a universal way for products to be reliably explored, tested, and selected both online and in-store.