Class Representative Projection for Text-based Zero-Shot Learning
There have been significant advances in supervised machine learning, and deep learning has brought enormous benefits to a range of diverse applications. Despite this success, however, relatively few works have shown comparable progress in text classification. Transfer learning in the form of zero-shot learning (ZSL) or generalized zero-shot learning (G-ZSL) is receiving much attention because of its ability to transfer knowledge learned from a known (seen) domain to unknown (unseen) domains. However, most ZSL work relies on large training corpora and external semantic knowledge, and very few studies have investigated improving classification performance in solely text-based ZSL/G-ZSL. In this thesis, a class representative framework is proposed for text-based ZSL: a novel projection method is learned from the seen classes and applied to transfer that knowledge effectively to the unseen classes. We designed a three-step approach consisting of (1) sentence-based embeddings, (2) deep neural networks, and (3) class-based representative classifiers. Experimental results show that the proposed projection framework achieves the best classification results in text-based ZSL/G-ZSL compared with the state-of-the-art approaches, evaluated on three benchmark datasets including the 20 Newsgroups dataset (newsgroup posts spanning 20 classes) and the DBpedia dataset covering various topics.
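The class-based representative idea in step (3) can be sketched as follows. This is a minimal illustrative example, not the thesis's learned projection method: it assumes sentence embeddings are already available as vectors, builds one representative vector per class as the mean of its documents' embeddings, and classifies a query by cosine similarity to the nearest representative. All function names here are hypothetical.

```python
import numpy as np

def class_representatives(embeddings, labels):
    # One representative per class: the mean embedding of that class's documents.
    # (Illustrative only; the thesis learns a projection rather than a plain mean.)
    classes = np.unique(labels)
    reps = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])
    return classes, reps

def classify(queries, classes, reps):
    # Assign each query embedding to the class with the most similar
    # representative, measured by cosine similarity.
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    r = reps / np.linalg.norm(reps, axis=1, keepdims=True)
    return classes[np.argmax(q @ r.T, axis=1)]
```

In a zero-shot setting, representatives for unseen classes would be built from projected class information rather than labeled documents; the nearest-representative rule itself is unchanged.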
Table of Contents
Introduction -- Background and Related Work -- Literature on Document Embeddings -- Proposed Framework -- Conclusion and Future Work
M.S. (Master of Science)