Industrialized Capsule Networks for Text Analytics
Submitted by Vijay Srinivas Agneeswaran, Ph.D (@vijayagneeswaran) on Wednesday, 3 April 2019
Multi-label text classification is an interesting problem in which multiple tags or categories may have to be associated with a given text or document. It occurs in numerous real-world scenarios, for instance in news categorization and in bioinformatics (the gene classification problem; see [Zafer Barutcuoglu et al. 2006]). This Kaggle data set is representative of the problem: https://www.kaggle.com/jhoward/nb-svm-strong-linear-baseline/data.
Several other interesting problems exist in text analytics, such as abstractive summarization [Chen, Yen-Chun 2018], sentiment analysis, search and information retrieval, entity resolution, document categorization, document clustering, and machine translation. Deep learning has been applied to many of these problems; for instance, [Rie Johnson et al. 2015] gives an early approach to applying a convolutional network to make effective use of word order in text categorization. Recurrent Neural Networks (RNNs) have been effective in various text analytics tasks, as explained at http://karpathy.github.io/2015/05/21/rnn-effectiveness/. Significant progress has been achieved in language translation by modelling machine translation with an encoder-decoder approach, where the encoder is a neural network [Dzmitry Bahdanau et al. 2014].
However, as shown in [Dan Rosa de Jesus et al. 2018], certain cases require modelling hierarchical relationships in text data, which is difficult to achieve with traditional deep learning networks because linguistic knowledge may have to be incorporated into them to reach high accuracy. Moreover, such networks do not capture hierarchical relationships between local features, since the pooling operation of CNNs loses information about those relationships.
We present one industrial-scale use case of capsule networks that we have implemented for a client in the realm of text analytics: news categorization. We report the performance of capsule networks on the news categorization task using precision, recall and F1 metrics. Importantly, we discuss how to tune the key hyper-parameters of capsule networks, such as batch size, number and size of filters, initial learning rate, and number and dimension of capsules. We also discuss the key challenges faced and how we have industrialized capsule networks using KubeFlow.
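To make the tuning step concrete, the hyper-parameters named above can be swept over a grid. This is a minimal sketch; the value ranges below are illustrative assumptions, not the settings used for the client work described in this proposal.

```python
from itertools import product

# Hypothetical search space covering the hyper-parameters discussed above.
# The concrete values are placeholders for illustration only.
search_space = {
    "batch_size":    [32, 64, 128],
    "n_filters":     [128, 256],
    "filter_size":   [3, 5, 9],
    "learning_rate": [1e-3, 3e-4],
    "n_capsules":    [10, 16],
    "capsule_dim":   [8, 16],
}

def grid(space):
    """Yield one configuration dict per point in the Cartesian product."""
    keys = list(space)
    for values in product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

configs = list(grid(search_space))
# Each config can then be scored on validation precision/recall/F1.
```

In practice one would train a model per configuration and keep the one with the best validation F1; random search over the same space is a common cheaper alternative.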
- History of impact of machine learning and deep learning on NLP.
- Motivation for capsule networks and how they can be used in text analytics.
- Implementation of capsule networks in TensorFlow.
- Benchmarking of capsule networks with dynamic routing for a real multi-label text classification use case for news categorization.
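The dynamic routing referred to above is the routing-by-agreement algorithm of Sabour et al. (2017). The following is a minimal NumPy sketch of its core, not the talk's actual TensorFlow implementation: prediction vectors from lower-level capsules are combined with coupling coefficients that are iteratively refined by agreement with the output capsules.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash nonlinearity: shrinks short vectors toward 0 and long
    # vectors toward unit length, so vector length can act as a probability.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, n_iters=3):
    """Route prediction vectors u_hat, shape [n_in, n_out, dim_out],
    from lower-level capsules to n_out higher-level capsules."""
    n_in, n_out, _ = u_hat.shape
    b = np.zeros((n_in, n_out))  # routing logits
    for _ in range(n_iters):
        # Coupling coefficients: softmax over output capsules per input capsule.
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        s = (c[..., None] * u_hat).sum(axis=0)  # weighted sum per output capsule
        v = squash(s)                            # [n_out, dim_out]
        # Increase logits where predictions agree with the output capsule.
        b = b + (u_hat * v[None]).sum(axis=-1)
    return v
```

For multi-label classification, the length of each output capsule vector can be read as the probability of the corresponding label, with a margin loss applied per label.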
[Zafer Barutcuoglu et al. 2006] Zafer Barutcuoglu, Robert E. Schapire, and Olga G. Troyanskaya. 2006. Hierarchical multi-label prediction of gene function. Bioinformatics 22, 7 (April 2006), 830-836. DOI: http://dx.doi.org/10.1093/bioinformatics/btk048
[Rie Johnson et al. 2015] Rie Johnson, Tong Zhang. Effective Use of Word Order for Text Categorization with Convolutional Neural Networks. HLT-NAACL 2015: 103-112.
[Dzmitry Bahdanau et al. 2014] Bahdanau, Dzmitry et al. "Neural Machine Translation by Jointly Learning to Align and Translate." CoRR abs/1409.0473 (2014).
[Dan Rosa de Jesus et al. 2018] Dan Rosa de Jesus, Julian Cuevas, Wilson Rivera, Silvia Crivelli (2018). "Capsule Networks for Protein Structure Classification and Prediction", available at https://arxiv.org/abs/1808.07475.
[Yequan Wang et al. 2018] Yequan Wang, Aixin Sun, Jialong Han, Ying Liu, and Xiaoyan Zhu. 2018. Sentiment Analysis by Capsules. In Proceedings of the 2018 World Wide Web Conference (WWW '18), 1165-1174. DOI: https://doi.org/10.1145/3178876.3186015
[Chen, Yen-Chun 2018] Chen, Yen-Chun and Bansal, Mohit (2018). "Fast Abstractive Summarization with Reinforce-Selected Sentence Rewriting", arXiv:1805.11080.
We illustrate how capsule networks can be industrialized:
- Overview of NLP and how machine learning and deep learning have been used in various NLP tasks.
- Overview of capsule networks and how they help in handling spatial relationships between objects in an image.
- How capsule networks can be applied to text analytics.
- We show an implementation of capsule networks for text analytics; we also benchmark our implementation and discuss the hyper-parameters to be tuned.
- We also show how to industrialize capsule networks using KubeFlow.
This presentation will also be made at the O'Reilly Artificial Intelligence conference in New York in April 2019. At The Fifth Elephant, we will showcase the progress made subsequently.
Please note that this session will have two speakers: myself and Abhishek Kumar.
Dr. Vijay Srinivas Agneeswaran has a Bachelor's degree in Computer Science & Engineering from SVCE, Madras University (1998), an MS (By Research) from IIT Madras (2001), a PhD from IIT Madras (2008), and a post-doctoral research fellowship in the LSIR Labs, Swiss Federal Institute of Technology, Lausanne (EPFL). He currently heads data sciences at Publicis Sapient, India. He has spent the last eighteen years creating intellectual property and building data-based products in industry and academia. In his current role, he has led the team that delivered real-time hyper-personalization for a global auto major, as well as work for various clients across domains such as retail, banking/finance, telecom, and automotive. He has built PMML support into Spark/Storm and implemented several machine learning algorithms, such as LDA and random forests, on Spark. He led a team that designed and implemented a big data governance product for role-based fine-grained access control inside Hadoop YARN. He and his team have also built the first distributed deep learning framework on Spark. He has been a professional member of the ACM and a Senior Member of the IEEE for the last 15+ years. He has five full US patents and has published in leading journals and conferences, including IEEE transactions. His research interests include distributed systems, data sciences, Big Data, and other emerging technologies.