[TECH FOCUS] Prototype #4

Virtual try-on for make-up: which technology has been explored?

Goals

Right from the start of the sprint, the main goal was clear: to create a realistic digital experience in a virtual make-up try-on app.

In augmented reality apps, providing realistic experiences is fundamental, especially if the user has no interaction with the AR elements. In addition, we set a secondary but still important goal: looking for alternatives to ARKit (Apple's SDK for developing augmented reality apps).

ARKit is an excellent tool, but strong competition in the field of augmented reality effects has led other companies, such as Facebook, to develop their own solutions, for example Spark AR.

It is therefore important to study the current state of the art and analyze different solutions for developing virtual make-up.

Notes on the app development

For this project, the development team was split into two micro teams. The first worked with Unity and ARKit, developing an application for the iPad Pro that provides good tracking and graphical realism as well as high-quality UX and UI. The second studied alternative solutions, exploring the development of virtual make-up with Facebook's Spark AR.

  1. ARKit

ARKit allowed us to track the user's face and recreate a 3D model of it in real time. Since we only needed the lips, we had to isolate that area.
Fortunately, in Unity we could apply a texture with transparency to the 3D model, making some parts of the model invisible.
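The masking idea can be sketched outside Unity as well. Below is a minimal NumPy illustration (not the actual Unity material — the function name and toy values are ours), where a grayscale alpha mask hides every part of the face texture except the lip region:

```python
import numpy as np

def mask_face_texture(face_rgb, lip_alpha):
    """Combine an RGB face texture with an alpha mask so that only
    the lip region remains visible (alpha = 1 on the lips, 0 elsewhere)."""
    return np.dstack([face_rgb, lip_alpha])

# Toy 2x2 texture: only the bottom-right texel belongs to the lips.
face = np.full((2, 2, 3), 0.8)
alpha = np.array([[0.0, 0.0],
                  [0.0, 1.0]])
masked = mask_face_texture(face, alpha)  # RGBA: transparent except the lips
```

In Unity the same effect is obtained by assigning a material with a transparent rendering mode to the ARKit face mesh.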

This is the result:

We were able to make the rest of the face invisible, showing only the user's lips. The next step was to graphically simulate the lipstick; to do so, we used PBR (Physically Based Rendering) materials. These materials allowed us to make a surface more or less shiny and reflective, helping us to simulate wrinkles and irregularities. This last point is important: to simulate the movement of the lips, we had to generate a so-called Normal Map, a texture whose colours encode the direction of the surface at each point. Starting from the previous image, we were able to generate a very good Normal Map:
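A common way to build such a texture (a sketch of the general technique, not necessarily the tool we used) is to derive surface normals from a grayscale height map via image gradients, then encode them into RGB:

```python
import numpy as np

def height_to_normal_map(height, strength=1.0):
    """Derive a tangent-space normal map from a grayscale height map.
    X/Y slopes come from image gradients; normals are normalized and
    encoded into [0, 1] RGB (a flat surface maps to (0.5, 0.5, 1.0))."""
    dy, dx = np.gradient(height.astype(float))
    nx = -dx * strength
    ny = -dy * strength
    nz = np.ones_like(nx)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    normals = np.dstack([nx, ny, nz]) / length[..., None]
    return normals * 0.5 + 0.5  # map [-1, 1] -> [0, 1] for an RGB texture

flat = np.zeros((4, 4))           # a perfectly flat patch of skin
nmap = height_to_normal_map(flat)  # every texel is the "straight up" normal
```

The bluish tint typical of normal maps comes exactly from this encoding: the Z component dominates, so flat areas are (0.5, 0.5, 1.0).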

Once the face details were done, we needed to set up the lighting and apply post-processing. We used a single light source angled 45° downwards from the camera, so that we could obtain reflections on "glossy" lipsticks.

This light source used Light Estimation, which allowed us to change its intensity based on the real light perceived by the device's camera, making the illumination more realistic. Post-processing allowed us to modify what the user sees, so we could make some changes to improve the realism of the application.
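The mapping from estimated ambient light to virtual light intensity can be sketched like this (the function, the neutral value of ~1000 lumens reported by ARKit's light estimate, and the clamping range are our illustrative assumptions, not the app's exact values):

```python
def estimated_light_intensity(ambient_lumens, neutral=1000.0,
                              min_scale=0.2, max_scale=2.0):
    """Map the ambient light intensity reported by the device camera
    (ARKit expresses it in lumens, with ~1000 meaning neutral indoor
    lighting) to a multiplier for the virtual light source, clamped
    so extreme readings never black out or blow out the scene."""
    scale = ambient_lumens / neutral
    return max(min_scale, min(max_scale, scale))
```

Each frame, the virtual light's intensity is multiplied by this scale, so the lipstick darkens in a dim room and brightens in sunlight along with the rest of the camera image.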

We included four post-processing effects:

  • Grain

Grain adds "noise" to the screen. This is very useful to blend virtual and real elements, since the camera often shows "noisy" real elements next to "perfect" virtual ones.

  • Bloom

Bloom increases the intensity of reflections. This helps us a lot, as we aim to make the reflections of glossy lipsticks very visible.

  • Vignette

Vignette darkens the edges of the screen, keeping the user's focus on the centre. It's a non-invasive but very effective touch.

  • Color Grading

Color grading allows us to adjust the colors, for example increasing the temperature or the contrast. In this case, it's useful to slightly increase the contrast and color saturation, making the scene slightly more intense.
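The four effects above can be sketched as simple image operations. This is a minimal NumPy illustration of the ideas (in the app they run on the GPU through Unity's post-processing; all function names and parameter values here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def grain(img, amount=0.03):
    """Add per-pixel noise so virtual elements match the camera's noise."""
    return np.clip(img + rng.normal(0.0, amount, img.shape), 0.0, 1.0)

def bloom(img, threshold=0.8, strength=0.5):
    """Boost pixels above a brightness threshold (crude bloom, no blur)."""
    bright = np.where(img > threshold, img, 0.0)
    return np.clip(img + bright * strength, 0.0, 1.0)

def vignette(img):
    """Darken pixels towards the screen edges to focus attention centrally."""
    h, w = img.shape[:2]
    y, x = np.ogrid[:h, :w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    dist = np.sqrt(((y - cy) / (h / 2))**2 + ((x - cx) / (w / 2))**2)
    falloff = np.clip(1.0 - 0.5 * dist, 0.0, 1.0)
    return img * falloff[..., None]

def color_grade(img, contrast=1.1, saturation=1.1):
    """Slightly raise contrast and saturation to intensify the scene."""
    img = np.clip((img - 0.5) * contrast + 0.5, 0.0, 1.0)
    gray = img.mean(axis=-1, keepdims=True)
    return np.clip(gray + (img - gray) * saturation, 0.0, 1.0)

frame = np.full((8, 8, 3), 0.6)  # stand-in for a rendered camera frame
out = color_grade(vignette(bloom(grain(frame))))
```

Chaining them in this order (grain, bloom, vignette, grading) mirrors how a post-processing stack applies effects one after another to the rendered frame.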


  2. Spark AR

Spark AR Studio is Facebook's framework for creating AR filters for Facebook and Instagram. The system is limited compared to editors like Unity and provides only a basic set of features to work with.

Interestingly, its face tracking is very advanced and works by simply analysing the phone's camera feed.

Interactions in Spark AR Studio are created with a visual programming language based on a node editor.

The final result is a mono-brand filter that you can use directly on your phone, within the Facebook and Instagram apps. Users can access the content simply by scanning a QR code or following a URL.

The user can choose the color of the lipstick by selecting one of the colors on the right side of the screen. Spark AR also makes it possible to take snapshots or record videos that are saved directly on the device. From a distribution standpoint, this allows very fast updates and immediate use.


Challenges

  1. ARKit

During the development of the app we ran into a problem: ARKit's tracking is limited. In some situations lip tracking is not accurate, due to the lighting in the room, the shape of the person's face, the presence of a beard and other factors. However, we can assume that Apple will improve the quality of ARKit in the future, with much more accurate tracking.

  2. SparkAR

We found some limits in Spark AR when creating the user interface.
The system is very limited and barely supports different phone resolutions. For this reason, we decided to adopt a simpler, functional version of the interface, designed to adapt dynamically to the resolution of different device screens.


Final results

The results obtained were more than satisfactory for both ARKit and SparkAR.
We were able to implement the design team's vision, resulting in a simple, easy-to-use app that delivers high-quality AR content.


Future developments

In the future, it will certainly be useful to conduct some user testing sessions and gather feedback from different users. In this way, we will be able to improve the graphic rendering of the lipsticks even more.
In the case of SparkAR, many future possibilities depend on Facebook. For some time there has been talk of an effects "Hub" that would group multiple effects together (very convenient for presenting different lipsticks).
An improvement to the SparkAR UI system would greatly improve the quality of the UX, making the effects more convenient to use.
