Leveraging Deep Transfer Learning for Multi-modal Affect Recognition in the Wild
Heysem Kaya
Date: 16:00 – 17:00, Friday, 08.11.2019
Location: Minnaert – 2.02
Abstract: While deep learning research is spreading rapidly with the exponential growth of “Big Data” sources, training and tuning deep models that generalize well to unseen data remains a costly end-to-end process, spanning data collection, cleaning, and annotation as well as the computational resources needed for training. For both efficient and effective use of deep learning, finding ways to employ pre-trained models is therefore of utmost importance for achieving state-of-the-art results on challenging problems. We showcase two multimodal affect recognition pipelines: emotion recognition in the wild (using the EmotiW 2015 challenge data) and multimodal apparent personality trait estimation. In the former, we show that the selective use of DCNN models (e.g., a model pre-trained on face recognition data rather than one trained on object recognition data for facial emotion recognition) and the way faces are aligned both matter for transfer learning. In the latter, we link the predictions of apparent personality traits to explanations of job candidate invitations.