Publication

Leveraging Random Label Memorization for Unsupervised Pre-Training

Type of publication Peer-reviewed
Publication form Original article (peer-reviewed)
Authors Pondenkandath Vinaychandran, Alberti Michele, Puran Sammer, Ingold Rolf, Liwicki Marcus
Project HisDoc III: Large-Scale Historical Document Classification

Original article (peer-reviewed)

Journal Workshop of Integration of Deep Learning Theories at Conference on Neural Information Processing Systems (NeurIPS)
Page(s) 1 - 6
Title of proceedings Workshop of Integration of Deep Learning Theories at Conference on Neural Information Processing Systems (NeurIPS)

Open Access

URL https://arxiv.org/abs/1811.01640
Type of Open Access Repository (Green Open Access)

Abstract

We present a novel approach to leveraging large unlabeled datasets by pre-training state-of-the-art deep neural networks on randomly-labeled datasets. Specifically, we train the neural networks to memorize arbitrary labels for all the samples in a dataset and use these pre-trained networks as a starting point for regular supervised learning. Our assumption is that the "memorization infrastructure" learned by the network during random-label training proves beneficial for conventional supervised learning as well. We test the effectiveness of our pre-training on several video action recognition datasets (HMDB51, UCF101, Kinetics) by comparing the results of the same network with and without random-label pre-training. Our approach yields an improvement in classification accuracy, ranging from 1.5% on UCF101 to 5% on Kinetics, which calls for further research in this direction.
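The abstract describes a two-phase procedure: first the network memorizes fixed, arbitrarily assigned labels on the unlabeled data, then the resulting weights initialize ordinary supervised training. Below is a minimal PyTorch sketch of that idea; the dataset wrapper, toy model, and hyperparameters are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Dataset


class RandomLabelDataset(Dataset):
    """Wraps unlabeled samples and assigns each one a fixed random label."""

    def __init__(self, samples, num_classes):
        self.samples = samples
        # Labels are drawn once and kept fixed, so the network can
        # actually memorize them rather than chase moving targets.
        self.labels = torch.randint(0, num_classes, (len(samples),))

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        return self.samples[idx], self.labels[idx]


def train(model, loader, epochs, lr=1e-3):
    """Plain cross-entropy training loop, shared by both phases."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            criterion(model(x), y).backward()
            optimizer.step()
    return model


# Toy stand-ins for a real dataset and a state-of-the-art network.
unlabeled = torch.randn(1000, 32)
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

# Phase 1: pre-train by memorizing arbitrary labels on the unlabeled data.
random_loader = DataLoader(RandomLabelDataset(unlabeled, num_classes=10),
                           batch_size=64, shuffle=True)
train(model, random_loader, epochs=20)

# Phase 2: regular supervised learning, starting from the "memorization
# infrastructure" learned in phase 1 (real labels would go here).
true_labels = torch.randint(0, 10, (1000,))  # placeholder for actual labels
supervised_loader = DataLoader(list(zip(unlabeled, true_labels)),
                               batch_size=64, shuffle=True)
train(model, supervised_loader, epochs=20)
```

In the paper's setting, the same recipe is applied to video action recognition networks; the sketch only illustrates the random-label pre-training step followed by supervised fine-tuning of the same weights.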