Contributors: Suraj Maharjan, John Arevalo, Manuel Montes, Fabio A. González, Thamar Solorio
Abstract
We investigate the value of feature engineering and neural network models for predicting successful writing. Similar to previous work, we treat this as a binary classification task and explore new strategies to automatically learn representations from book contents. We evaluate our feature set on two different corpora created from Project Gutenberg books. The first presents a novel approach for generating the gold standard labels for the task and the other is based on prior research. Using a combination of hand-crafted and recurrent neural network learned representations in a dual learning setting, we obtain the best performance of 73.50% weighted F1-score.
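The core idea of the dual setting described above is to concatenate a hand-crafted feature vector with a representation learned by a recurrent network, then feed the combined vector to a binary classifier. The minimal sketch below illustrates that combination only; the dimensions, weights, and the simple Elman-style recurrence are illustrative assumptions, not the architecture or hyperparameters from the paper.

```python
# Hedged sketch (not the authors' code): fusing a hand-crafted feature
# vector with an RNN-learned sequence representation for binary
# classification of book success.
import numpy as np

rng = np.random.default_rng(0)

def rnn_last_state(seq, W_x, W_h):
    """Simple Elman-style recurrence; the final hidden state serves as
    the learned representation of the input sequence."""
    h = np.zeros(W_h.shape[0])
    for x in seq:
        h = np.tanh(W_x @ x + W_h @ h)
    return h

# Toy dimensions (assumptions for illustration, not from the paper).
emb_dim, hid_dim, hand_dim = 8, 4, 5
W_x = rng.normal(size=(hid_dim, emb_dim))
W_h = rng.normal(size=(hid_dim, hid_dim))
w_out = rng.normal(size=(hid_dim + hand_dim,))

seq = rng.normal(size=(10, emb_dim))       # e.g. word embeddings of a book excerpt
hand_feats = rng.normal(size=(hand_dim,))  # e.g. stylistic/readability features

# Dual representation: learned state concatenated with hand-crafted features.
combined = np.concatenate([rnn_last_state(seq, W_x, W_h), hand_feats])
prob_success = 1.0 / (1.0 + np.exp(-w_out @ combined))  # sigmoid -> P(successful)
```

In practice the recurrent encoder and the classifier weights would be trained jointly; this sketch only shows how the two representations are combined.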
Read the paper here
Download the Dataset
Cite the paper using
@InProceedings{maharjan-EtAl:2017:EACLlong,
  author    = {Maharjan, Suraj and Arevalo, John and Montes, Manuel and Gonz\'{a}lez, Fabio A. and Solorio, Thamar},
  title     = {A Multi-task Approach to Predict Likability of Books},
  booktitle = {Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers},
  month     = {April},
  year      = {2017},
  address   = {Valencia, Spain},
  publisher = {Association for Computational Linguistics},
  pages     = {1217--1227},
  abstract  = {We investigate the value of feature engineering and neural network models for predicting successful writing. Similar to previous work, we treat this as a binary classification task and explore new strategies to automatically learn representations from book contents. We evaluate our feature set on two different corpora created from Project Gutenberg books. The first presents a novel approach for generating the gold standard labels for the task and the other is based on prior research. Using a combination of hand-crafted and recurrent neural network learned representations in a dual learning setting, we obtain the best performance of 73.50% weighted F1-score.},
  url       = {http://www.aclweb.org/anthology/E17-1114}
}
For any queries, please contact the first author at skar3 AT uh DOT edu.