Semi-supervised Sequence Learning


We present two approaches to use unlabeled data to improve sequence learning with recurrent networks. The first approach is to predict what comes next in a sequence, which is a language model in NLP. The second approach is to use a sequence autoencoder, which reads the input sequence into a vector and predicts the input sequence again. These two algorithms can be used as a "pretraining" algorithm for a later supervised sequence learning algorithm. In other words, the parameters obtained from the pretraining step can then be used as a starting point for other supervised training models. In our experiments, we find that long short-term memory recurrent networks, after being pretrained with the two approaches, become more stable to train and generalize better. With pretraining, we were able to achieve strong performance in many classification tasks, such as text classification with IMDB and DBpedia, or image recognition on CIFAR-10.
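
The abstract describes two pretraining objectives (next-token language modeling and a sequence autoencoder) whose learned weights then initialize a supervised LSTM classifier. Below is a minimal PyTorch sketch of that recipe, not the authors' code: the vocabulary and layer sizes, the SequenceModel/Classifier names, the shared encoder/decoder LSTM in the autoencoder loss, and the random stand-in data are all illustrative assumptions.

```python
# Sketch of LSTM pretraining on unlabeled sequences, then reusing the weights
# to initialize a supervised classifier. All sizes and names are assumptions.
import torch
import torch.nn as nn

VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM = 10000, 128, 256  # assumed toy sizes


class SequenceModel(nn.Module):
    """Embedding + LSTM + output head shared by both pretraining objectives."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
        self.lstm = nn.LSTM(EMBED_DIM, HIDDEN_DIM, batch_first=True)
        self.decode = nn.Linear(HIDDEN_DIM, VOCAB_SIZE)

    def forward(self, tokens):
        out, state = self.lstm(self.embed(tokens))
        return self.decode(out), state  # per-step vocab logits, final LSTM state


def language_model_loss(model, tokens):
    """Approach 1: predict what comes next in the sequence (language modeling)."""
    logits, _ = model(tokens[:, :-1])
    return nn.functional.cross_entropy(
        logits.reshape(-1, VOCAB_SIZE), tokens[:, 1:].reshape(-1)
    )


def autoencoder_loss(model, tokens, bos_id=0):
    """Approach 2 (sketch): read the input into a vector, then predict it again.

    Encoder and decoder share one LSTM here only to keep the example small;
    that weight tying is a simplification, not a claim about the paper.
    """
    _, state = model.lstm(model.embed(tokens))  # read sequence into a vector
    shifted = torch.cat(
        [torch.full_like(tokens[:, :1], bos_id), tokens[:, :-1]], dim=1
    )
    dec_out, _ = model.lstm(model.embed(shifted), state)  # teacher-forced decode
    logits = model.decode(dec_out)
    return nn.functional.cross_entropy(
        logits.reshape(-1, VOCAB_SIZE), tokens.reshape(-1)
    )


class Classifier(nn.Module):
    """Supervised model whose LSTM starts from the pretrained parameters."""

    def __init__(self, pretrained: SequenceModel, num_classes: int = 2):
        super().__init__()
        self.embed, self.lstm = pretrained.embed, pretrained.lstm  # reuse weights
        self.head = nn.Linear(HIDDEN_DIM, num_classes)

    def forward(self, tokens):
        _, (h, _) = self.lstm(self.embed(tokens))
        return self.head(h[-1])  # classify from the final hidden state


if __name__ == "__main__":
    model = SequenceModel()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    unlabeled = torch.randint(0, VOCAB_SIZE, (8, 20))  # stand-in for unlabeled text
    for _ in range(3):  # tiny pretraining loop for illustration
        opt.zero_grad()
        loss = language_model_loss(model, unlabeled)  # or autoencoder_loss(model, unlabeled)
        loss.backward()
        opt.step()
    clf = Classifier(model)  # pretrained parameters become the starting point
    print(clf(unlabeled).shape)  # torch.Size([8, 2]) class logits
```

In practice the supervised fine-tuning step would continue training `clf` on labeled examples (e.g. IMDB sentiment labels); the point of the sketch is only that the same embedding and LSTM parameters carry over from the unsupervised objective to the classifier.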


Speakers

Thursday December 10, 2015 11:00 - 15:00 EST
210 C #5
