In the aftermath of global Covid-19 turmoil, the cosmetic industry underwent a profound transformation. The new normal reshaped consumer behavior, steering a significant shift towards online shopping experiences. However, for consumers navigating the digital realm in search of face care products, a persistent challenge loomed large—the inability to physically interact with the products before purchase. The quest for the perfect shade of foundation or the ideal skincare regimen often leads to uncertainty, deterring customers from confidently exploring and investing in products. The pandemic magnified this dilemma, compelling a leading cosmetic manufacturer to confront the evolving landscape of consumer needs. Discover Movate’s Virtual Try-On solution, offering an immersive and convenient shopping experience for a top brand’s customers.
Key business challenges
The global cosmetic manufacturer engaged Movate in a quest to provide a personalized and immersive shopping experience to its online customers and to address the following challenges:
Increased return rate: The absence of in-person cosmetic interactions before purchase led to a soaring return rate, hitting 23.5% in 2022, sharply contrasting the beauty industry’s 4.3% average return rate.
Decreased sales: The absence of in-store product trials led to uncertainty among customers regarding product quality and suitability, causing a 16.7% sales drop for popular face care and lip care items.
Impact on CSAT: Limited online product trials and the absence of engaging shopping experiences significantly impacted customer satisfaction.
Limited customer reach: Difficulties in reaching a broader customer base, especially shoppers who prefer in-person product trials, hindered the brand's connection with a wider audience.
To cater to a diverse user base across various geographic locations and device capabilities, Movate implemented a comprehensive suite of features accommodating lower-end mobile devices and desktop computers without webcams, as well as low-fidelity network connections.
Movate’s Virtual Try-On solution implementation included both real-time and non-real-time Virtual Try-On capabilities.
Real-time Virtual Try-On offers users the ability to interact with products dynamically, experiencing their texture and color in real-time. This functionality relies on a live video feed from the user’s camera, allowing them to move and visualize how the product appears on themselves.
In contrast, non-real-time Virtual Try-On caters specifically to lower-end mobile devices and areas with limited network capabilities. This feature enables users to upload a photo and preview how the product would appear on them within the uploaded image.
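The split between the two modes can be illustrated with a small routing heuristic. The thresholds, field names, and function below are illustrative assumptions, not the production logic:

```python
# Hypothetical heuristic for routing a user to real-time (live camera)
# or non-real-time (photo upload) Virtual Try-On, based on device
# capability and network quality. Thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class ClientProfile:
    has_camera: bool        # webcam or front camera available
    ram_mb: int             # approximate device memory
    downlink_mbps: float    # estimated network bandwidth

def select_tryon_mode(client: ClientProfile) -> str:
    """Return 'realtime' for live video try-on, 'upload' for photo-based."""
    if not client.has_camera:
        return "upload"          # desktop computer without a webcam
    if client.ram_mb < 2048:
        return "upload"          # lower-end mobile device
    if client.downlink_mbps < 1.0:
        return "upload"          # low-fidelity network connection
    return "realtime"
```

In practice such a check would run client-side at session start, falling back to photo upload whenever live rendering cannot be sustained.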
Virtual Try-On solution overview
In the initial phase, our solution journey commenced with an exhaustive analysis of consumer feedback on the global cosmetic manufacturer’s face makeup, eye makeup, and lip makeup products, categorizing sentiments to understand consumer preferences. To delve deeper into subconscious buying behavior, the team conducted extensive user research via in-person and telephone interviews with diverse consumers across age groups. The focus was to glean insights into their purchasing habits and expectations from a Virtual Try-On app.
Subsequently, a pivotal decision point arose: building versus procuring an existing solution, considering the available Virtual Try-On applications in the market. Given the extensive spectrum of colors and effects within the cosmetic line, ensuring an exact match between the Virtual Try-On’s color palette and the product range was imperative.
While existing solutions promised rapid deployment, they lacked personalized features crucial for consumers, demanding considerable customization efforts.
Further progression entailed defining a seamless user flow, prioritizing a user-centric experience. Using “draw.io” for flow outlines and “Justinmind” for prototype development, meticulous attention was placed on creating mock-ups and prototypes to ensure a streamlined user experience.
Our initial phase involved implementing facial recognition, acknowledging the inherent uniqueness of each human face yet leveraging common facial landmarks for detection. While conventional neural networks employ 68 facial landmarks for coordinate detection, this model inadequately covers the forehead region. To address this, a customized model incorporating an additional 13 landmarks was developed, specifically catering to the forehead area. This augmentation enabled precise head detection, essential for executing makeup effects on the forehead.
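One simple way to approximate additional forehead points, assuming dlib-style 68-point indexing (eyebrows at indices 17–26, chin tip at index 8), is to extrapolate upward from the eyebrow arc. This is an illustrative geometric heuristic, not the trained custom model described above:

```python
import numpy as np

def estimate_forehead_points(landmarks68: np.ndarray, n_points: int = 13,
                             scale: float = 0.6) -> np.ndarray:
    """Approximate forehead landmarks by projecting the eyebrow arc upward.

    landmarks68: (68, 2) array in the dlib 68-point convention, where
    indices 17-26 are the eyebrows and index 8 is the chin tip.
    Returns an (n_points, 2) array above the eyebrows. The projection
    distance is a fraction `scale` of the eyebrow-to-chin height --
    an assumed heuristic value, not a trained parameter.
    """
    brows = landmarks68[17:27]                  # 10 eyebrow points
    chin = landmarks68[8]
    face_height = chin[1] - brows[:, 1].mean()  # rough lower-face height
    # Resample the eyebrow arc at n_points x-positions, then shift upward
    # (image y grows downward, so "up" means subtracting).
    order = np.argsort(brows[:, 0])
    xs = np.linspace(brows[:, 0].min(), brows[:, 0].max(), n_points)
    ys = np.interp(xs, brows[order, 0], brows[order, 1]) - scale * face_height
    return np.stack([xs, ys], axis=1)
```

A learned model, as used in the actual solution, would place these points far more robustly across head poses and hairlines than this fixed-offset sketch.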
Using the coordinates of facial landmarks, Movate established critical zones relative to specific landmarks.
Describing these zones through convex polygons, the team determined specific application areas for cosmetics, ensuring accurate placement and simulation of makeup effects on targeted facial regions.
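As a sketch of this convex-polygon zone representation, a boolean application mask can be computed with half-plane sign tests. Pure NumPy is used here for self-containment; a production pipeline would more likely call `cv2.fillConvexPoly`:

```python
import numpy as np

def convex_zone_mask(h: int, w: int, polygon: np.ndarray) -> np.ndarray:
    """Boolean (h, w) mask of pixels inside a convex polygon.

    polygon: (N, 2) array of (x, y) vertices in order (clockwise or
    counter-clockwise). Half-plane tests only work for convex zones,
    which matches the zone representation described above.
    """
    ys, xs = np.mgrid[0:h, 0:w]
    signs = []
    for i in range(len(polygon)):
        x0, y0 = polygon[i]
        x1, y1 = polygon[(i + 1) % len(polygon)]
        # Cross product of each edge vector with the vector to every pixel.
        signs.append((x1 - x0) * (ys - y0) - (y1 - y0) * (xs - x0))
    signs = np.stack(signs)
    # A pixel is inside when every edge reports a consistent sign.
    return (signs >= 0).all(axis=0) | (signs <= 0).all(axis=0)
```

Feeding the lip or forehead landmark coordinates in as `polygon` yields the exact pixel region where a product effect should be rendered.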
AR & Virtual Try-On SDK
Realism stands as a cornerstone in Augmented Reality (AR) applications, especially in user-interaction scenarios. When selecting the ideal Software Development Kit (SDK) for Virtual Try-On implementation, our evaluation encompassed several prominent options:
ARKit (by Apple): ARKit excels in real-time facial tracking and 3D face modeling, particularly isolating lip areas. However, during lip movement simulations, accuracy inconsistencies were noticed, affected by lighting conditions and facial shapes. Additionally, its exclusivity to iOS devices posed a limitation.
Spark AR (by Meta): Spark AR boasts advanced face tracking capabilities and allows users to select lipstick colors via an intuitive interface. Yet, concerns arose regarding limited system support for various phone resolutions, impacting usability.
ARCore (by Google): With superior motion tracking and light estimation, ARCore thrives in low-light environments. However, its compatibility solely with Android devices contradicted our pursuit of a platform-agnostic SDK.
DeepAR: Addressing limitations observed in ARCore and ARKit, DeepAR emerged as a comprehensive solution supporting iOS, MacOS, Android, and HTML platforms. Meeting Movate’s requirement for web and mobile support, DeepAR offers versatile features like face filters and background segmentation. However, its resource-intensive nature demands substantial RAM and a high-performance GPU.
Given the option for non-real-time Virtual Try-On through image uploads, DeepAR was selected for simulation creation.
UI & product application simulation
The User Interface (UI) serves as the pivotal point of interaction between users and the Virtual Try-On application. Movate’s emphasis was on crafting a lightweight, intuitive, and user-friendly UI to ensure an unparalleled user experience. This UI facilitates seamless product selection and immediate display of results upon selection.
To mimic product applications, we implemented color and texture alterations within the region of interest. Leveraging Facial Zone mapping and facial landmark identification, Movate precisely identified distinct facial areas such as the lips for lip makeup and the entire face for face makeup, incorporating nuanced light variations to simulate diverse makeup effects.
AI-powered skin analysis
The integration of AI-based Skin Analysis represented a pivotal stride in our solution’s evolution. Utilizing convolutional neural network (CNN) models and image processing techniques, Movate developed a robust system to conduct in-depth analysis of the user’s skin.
Our process involved capturing user-provided images or utilizing device cameras for real-time skin assessment. Through meticulous preprocessing stages, including noise reduction, normalization, and precise segmentation, we isolated facial skin regions.
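These preprocessing stages can be sketched minimally: a 3×3 mean filter as the noise-reduction step, followed by zero-mean/unit-variance normalization of the kind a CNN input layer expects. The segmentation step is assumed to have already produced the skin crop:

```python
import numpy as np

def preprocess_skin_image(image: np.ndarray) -> np.ndarray:
    """Denoise and normalize a segmented skin crop before analysis.

    image: (H, W) grayscale uint8 or float array. A minimal sketch of
    the preprocessing described above -- the production pipeline would
    use stronger denoising and a model-specific normalization.
    """
    img = image.astype(np.float32)
    # 3x3 mean filter via padded shifts (edge-replicated borders).
    padded = np.pad(img, 1, mode="edge")
    smooth = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                 for dy in range(3) for dx in range(3)) / 9.0
    # Standardize to zero mean, unit variance for the CNN input.
    return (smooth - smooth.mean()) / (smooth.std() + 1e-8)
```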
Leveraging CNN models, Movate meticulously analyzed diverse skin attributes encompassing texture, color, pores, wrinkles, and pigmentation.
By employing feature extraction and pattern recognition techniques, Movate’s system adeptly categorized skin types (dry, oily, combination, etc.) and provided an overarching assessment of skin condition. Post-analysis, the system generates personalized product recommendations tailored to address specific skin attributes, ensuring the organization’s skincare regimen effectively meets individual needs.
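As a toy stand-in for the CNN classifier, the skin-type decision can be illustrated with a single hand-crafted feature: specular shine in the T-zone versus the cheeks. The regions, thresholds, and labels below are assumptions for illustration, not values from the production model:

```python
import numpy as np

def classify_skin_type(tzone: np.ndarray, cheeks: np.ndarray,
                       shine_thresh: float = 180.0) -> str:
    """Label skin as dry, oily, or combination from region brightness.

    tzone / cheeks: grayscale uint8 crops of those facial regions.
    High mean brightness is treated as oil shine -- a crude heuristic
    standing in for the CNN's learned features.
    """
    tzone_shiny = float(tzone.mean()) > shine_thresh
    cheeks_shiny = float(cheeks.mean()) > shine_thresh
    if tzone_shiny and cheeks_shiny:
        return "oily"
    if tzone_shiny:
        return "combination"   # oily T-zone, normal/dry cheeks
    return "dry"
```

The classifier's output label would then index into the product catalog to drive the personalized recommendations the section describes.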
Client business outcomes
The impact of Virtual Try-On was profound, yielding substantial financial and operational advantages for the leading global cosmetic manufacturer. The numbers and benefits underscore the solution’s impact:
A staggering 60% increase in total sales accompanied by a sevenfold surge in brand engagement.
An impressive 21% upsurge in online sales revenue.
Achieving an 8.5% reduction in return rates due to more informed purchasing decisions.
A notable 17.6% decline in cart abandonment rates.
Environmental conservation through reduced carbon footprint by minimizing the necessity for physical product samples.
By successfully catering to the segment of customers who value product trials before purchase, the client witnessed an expanded market reach with the help of Movate’s solution.
High-quality Augmented Reality (AR) rendering, together with precise facial zone mapping and coordinate identification, is pivotal in the Virtual Try-On feature for accurately detecting facial zones and seamlessly applying virtual products for a realistic appearance.
Here are the takeaways from the implementation:
Facial Landmarks model selection: Initially, our implementation relied on the neural network FaceLandmarks 68 model, which lacked crucial points for the upper forehead area. This limitation posed challenges for facial makeup simulations, particularly in applying products to the forehead. To address this, we built a custom model incorporating an additional 13 landmarks tailored specifically for the forehead region.
Selection of Augmented Reality (AR) SDK: In our initial assessment, ARKit and ARCore were excluded due to their close integration with specific mobile operating systems. Instead, Spark AR SDK was chosen. However, during early testing phases, limitations surfaced in system support across varying phone resolutions.
Faced with these limitations, the Movate team went back, reassessed DeepAR, and adopted it for the development process.
Bringing it home
Virtual Try-On technology emerges as a transformative catalyst in retail, seamlessly merging the convenience of online shopping with the tactile gratification of in-store experiences. It offers an innovative and immersive shopping journey, redefining consumer engagement.
With profound expertise in computer vision, ML, and AR, Movate has honed unparalleled AR and AI practices for Virtual Try-On solution development. Our adaptive approach allows us to tailor solutions to your business needs, crafting unique and captivating experiences for your clientele. Be it comprehensive Virtual Try-On solutions or targeted AR development, partner with Movate to drive business outcomes.
Pallab, a Senior Director, and Enterprise Solution Architect, drives cloud initiatives and practices at Movate. With over 16 years of experience spanning diverse domains and global locations, he’s a proficient Multi-Cloud Specialist. Across major cloud Hyperscalers, Pallab excels in orchestrating successful migrations of 25+ workloads. His expertise extends to security, Bigdata, IoT, and Edge Computing. Notably, he’s masterminded over 10 cutting-edge use cases in Data Analytics, AI/ML, IoT, and Edge Computing, solidifying his reputation as a trailblazer in the tech landscape. LinkedIn profile