Next.ML made its way over to our side of the country on April 27 in Cambridge, MA. At the conference, our Head of Research, Alec Radford, led his workshop on General Sequence Learning Using Recurrent Neural Networks again with a couple of new updates.
Alec’s presentation now includes a tutorial with sample code on how to use Passage, his open-source library for text analysis with recurrent neural networks.
Recurrent neural networks hold great promise as general sequence learning algorithms, which makes them a natural fit for text analysis. However, outside of specific use cases such as handwriting recognition and, more recently, machine translation, they have not seen widespread use. Why has this been the case?
In this workshop, Alec introduces RNNs as a concept, then sketches how to implement them and covers the tricks necessary to make them work well. With the basics covered, he investigates using RNNs as general text classification and regression models, examining where they succeed and where they fail compared to more traditional text analysis models.
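The core idea behind RNNs as sequence learners can be sketched in a few lines of NumPy. This is an illustrative toy, not code from the workshop or from Passage: a vanilla RNN reads its input one step at a time, updating a fixed-size hidden state that summarizes everything seen so far.

```python
import numpy as np

def rnn_forward(x_seq, W_xh, W_hh, b_h):
    """Run a vanilla RNN over a sequence of input vectors.

    Each step applies h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b_h),
    so the hidden state at the end summarizes the whole sequence.
    """
    hidden = np.zeros(W_hh.shape[0])
    states = []
    for x_t in x_seq:
        hidden = np.tanh(W_xh @ x_t + W_hh @ hidden + b_h)
        states.append(hidden)
    return np.array(states)

# Toy example: a 5-step sequence of 3-dimensional inputs, 4 hidden units.
rng = np.random.default_rng(0)
x_seq = rng.normal(size=(5, 3))
W_xh = rng.normal(scale=0.1, size=(4, 3))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(4, 4))   # hidden-to-hidden (recurrent) weights
b_h = np.zeros(4)

states = rnn_forward(x_seq, W_xh, W_hh, b_h)
print(states.shape)  # one hidden state per time step: (5, 4)
```

Because the same weights are reused at every step, the model handles sequences of any length with a fixed number of parameters; the "tricks" the workshop covers concern training these recurrent weights stably.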
Finally, he introduces Passage, a simple Python and Theano library for training RNNs with a scikit-learn style interface, and shows how to use it through several hands-on tutorials on real-world text datasets.
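To give a feel for what a scikit-learn style interface to an RNN classifier looks like, here is a self-contained toy sketch. It is hypothetical and deliberately simplified: the recurrent weights are left fixed and random (an echo-state-style shortcut) and only a logistic readout on the final hidden state is trained, whereas a real library like Passage trains the whole network with Theano. The class and task names are invented for illustration.

```python
import numpy as np

class ToyRNNClassifier:
    """Toy sequence classifier with a scikit-learn style fit/predict API.

    Simplification: recurrent weights are random and fixed; only the
    logistic readout on the final hidden state is trained by gradient
    descent. Real RNN libraries train all weights end to end.
    """

    def __init__(self, n_features, n_hidden=16, lr=0.5, epochs=500, seed=0):
        rng = np.random.default_rng(seed)
        self.W_xh = rng.normal(scale=0.5, size=(n_hidden, n_features))
        self.W_hh = rng.normal(scale=0.5, size=(n_hidden, n_hidden))
        self.w_out = np.zeros(n_hidden)
        self.b_out = 0.0
        self.lr, self.epochs = lr, epochs

    def _final_state(self, seq):
        h = np.zeros(self.W_hh.shape[0])
        for x_t in seq:  # fold the sequence into one hidden vector
            h = np.tanh(self.W_xh @ x_t + self.W_hh @ h)
        return h

    def fit(self, sequences, labels):
        H = np.array([self._final_state(s) for s in sequences])
        y = np.asarray(labels, dtype=float)
        for _ in range(self.epochs):  # gradient descent on logistic loss
            p = 1.0 / (1.0 + np.exp(-(H @ self.w_out + self.b_out)))
            grad = p - y
            self.w_out -= self.lr * H.T @ grad / len(y)
            self.b_out -= self.lr * grad.mean()
        return self

    def predict_proba(self, sequences):
        H = np.array([self._final_state(s) for s in sequences])
        return 1.0 / (1.0 + np.exp(-(H @ self.w_out + self.b_out)))

    def predict(self, sequences):
        return (self.predict_proba(sequences) >= 0.5).astype(int)

# Toy task: does a one-hot encoded sequence contain symbol 0?
def one_hot(seq, n=3):
    return np.eye(n)[seq]

train_x = [one_hot(s) for s in ([0, 1, 2], [2, 0, 1], [1, 2, 1], [2, 2, 1])]
train_y = [1, 1, 0, 0]
clf = ToyRNNClassifier(n_features=3).fit(train_x, train_y)
preds = clf.predict(train_x)
```

The appeal of this interface style, and the reason the workshop library adopts it, is that `fit`/`predict` lets RNN models drop into existing text analysis pipelines alongside more traditional scikit-learn models.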