Improving Natural Language Processing Tasks with Human Gaze-Guided Neural Attention
Ekta Sood, Simon Tannert, Philipp Mueller, Andreas Bulling
A lack of corpora has so far limited advances in integrating human gaze data as a supervisory signal in neural attention mechanisms for natural language processing (NLP). We propose a novel hybrid text saliency model (TSM) that, for the first time, combines a cognitive model of reading with explicit human gaze supervision in a single machine learning framework. On four different corpora we demonstrate that our hybrid TSM duration predictions are highly correlated with human gaze ground truth. We further propose a novel joint modeling approach to integrate TSM predictions into the attention layer of a network designed for a specific upstream NLP task without the need for any task-specific human gaze data. We demonstrate that our joint model outperforms the state of the art in paraphrase generation on the Quora Question Pairs corpus by more than 10% in BLEU-4 and achieves state-of-the-art performance for sentence compression on the challenging Google Sentence Compression corpus. As such, our work introduces a practical approach for bridging between data-driven and cognitive models and demonstrates a new way to integrate human gaze-guided neural attention into NLP tasks.
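The abstract does not specify how TSM predictions are fused with the task network's attention weights; one common way to realize such a joint model is a convex combination of the task-driven attention distribution and the gaze-derived saliency distribution. The following is a minimal sketch under that assumption, where `lam` is a hypothetical interpolation hyperparameter and the inputs stand in for attention logits and TSM duration predictions:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def saliency_biased_attention(task_logits, tsm_saliency, lam=0.5):
    """Blend task attention with predicted gaze saliency.

    task_logits:  (n,) raw attention logits from the upstream task model
    tsm_saliency: (n,) predicted per-token gaze durations (TSM output)
    lam:          hypothetical mixing weight; the paper's actual fusion
                  mechanism may differ from this convex combination
    """
    return (1 - lam) * softmax(task_logits) + lam * softmax(tsm_saliency)

# Tokens the task attends to vs. tokens readers fixate on:
weights = saliency_biased_attention(
    np.array([2.0, 0.5, 1.0]),   # task attention logits
    np.array([0.1, 3.0, 0.2]))   # TSM gaze-duration predictions
print(weights, weights.sum())    # a valid distribution (sums to 1)
```

Because both components pass through a softmax before mixing, the result remains a proper probability distribution over tokens, so it can drop into an attention layer without further normalization.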


