Makeup Transfer

Makeup transfer is the process of taking a particular makeup style from a reference image and applying that same style to a different source image. We talk to machine learning specialist Kseniya Buraya about her work.

Since this is far from a typical application of machine learning, you may be wondering what the vision behind the research is. Firstly, and most obviously, it will be useful for anyone who wears makeup, which is a huge proportion of the population. Being able to preview different styles of makeup before buying could save a lot of regrettably expensive purchases. However, research into computerised makeup recognition could also have wide-reaching implications for more general facial recognition applications, assisting everything from FaceID to surveillance.


Above is one of the maps in Kseniya’s Litmaps workspace, entitled ‘Makeup Transfer’. From the map we can easily identify the two left-most papers as the potentially seminal examples. In particular, Digital face makeup by example (Guo et al.) appears to be cited by the majority of later works.

This is not a brand-new field of research: seminal papers were published around 15 years ago. Even so, there are still significant challenges to creating an effective, natural, and efficient makeup transfer algorithm.


As can be seen from the map, one of the common approaches to tackling this problem is a type of trained neural network called a Generative Adversarial Network (GAN). GANs gained a lot of exposure following the viral success of “StyleGAN”, which generated impressively realistic portraits of people completely from scratch. The basic premise of these tools is to train two programs against each other: a generator, whose job is to produce artificial examples, and a discriminator, whose job is to tell those fakes apart from a bank of known real training data (for example, a database of real photos of people's faces). As the discriminator gets better at spotting fakes, the generator is forced to produce ever more convincing ones.
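The adversarial loop described above can be sketched in a few lines. This is a deliberately toy, NumPy-only illustration, not the architecture of any paper on the map: the "generator" learns a single shift parameter on 1-D noise, the "discriminator" is a logistic regression, and all names and values (theta, real_mean, the learning rate) are invented for the example.

```python
import numpy as np

# Toy GAN on 1-D data. The "real" distribution is N(4, 1); the generator
# only learns a shift, g(z) = z + theta, applied to standard normal noise.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w, b = 0.1, 0.0      # discriminator: D(x) = sigmoid(w*x + b)
theta = 0.0          # generator parameter: g(z) = z + theta
lr, steps, batch = 0.02, 3000, 64
real_mean = 4.0

for _ in range(steps):
    real = rng.normal(real_mean, 1.0, batch)       # known real examples
    fake = rng.normal(0.0, 1.0, batch) + theta     # generated fakes

    # Discriminator step: gradient ascent on log D(real) + log(1 - D(fake)),
    # i.e. push D(real) toward 1 and D(fake) toward 0.
    d_real = sigmoid(w * real + b)
    d_fake = sigmoid(w * fake + b)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    b += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: gradient ascent on log D(fake) -- move theta so the
    # discriminator starts classifying the fakes as real.
    d_fake = sigmoid(w * fake + b)
    theta += lr * np.mean((1 - d_fake) * w)

# After training, theta has drifted from 0 toward real_mean, so the fake
# samples now overlap the real data distribution.
```

In real makeup-transfer GANs both networks are deep convolutional models and the generator learns a full image-to-image mapping rather than a single shift, but the alternating train-discriminator / train-generator loop has this same shape.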

Makeup GAN results from Chen et al. 2019 (source, Litmaps Seed)

While progress is being made, efforts so far fail to preserve some key aesthetic features, such as facial structure, lighting (shadows can easily be misinterpreted as skin tone or existing makeup), and skin features or imperfections. Keep an eye on Kseniya's work for updates on the state of the art in generative makeup transfer.

Share your research with the world

We’re looking to promote high-quality literature maps that can help other researchers, and the wider public, better understand a broad range of research topics.

If you've been working on a research project, literature review, or have critical thoughts on the research process, we'd love to hear from you.

Tell us about your research