Context-Sensitive Semantic Smoothing Using Semantically Relatable Sequences

Kamaljeet S. Verma, Pushpak Bhattacharyya

Abstract

We propose a novel approach to context-sensitive semantic smoothing that uses an intermediate, "semantically light" representation of sentences called Semantically Relatable Sequences (SRSs). The SRSs of a sentence are tuples of words that appear as linked nodes in the sentence's semantic graph, capturing dependency relations. In contrast to patterns built from consecutive words, SRSs group non-consecutive but semantically related words. Our experiments on the TREC AP89 collection show that a mixture of the SRS translation model and the Two-Stage Language Model (TSLM) of Lafferty and Zhai achieves better MAP scores than a mixture of the Multi-Word Expression (MWE) translation model and the TSLM. Furthermore, a system that, for each test query, selects either the SRS or the MWE mixture model based on the better per-query MAP score shows significant improvements over the individual mixture models.
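The abstract leaves the mixture formula implicit; as a hedged sketch, a standard interpolation in this line of semantic-smoothing work (the notation and the weight \lambda are assumptions, not taken from the paper) would combine the TSLM estimate with the SRS translation component:

    p(w \mid d) = (1-\lambda)\, p_{\mathrm{TSLM}}(w \mid d) + \lambda \sum_{s \in \mathrm{SRS}(d)} p_t(w \mid s)\, p(s \mid d)

where p_t(w | s) is the probability of "translating" an SRS pattern s into query word w, p(s | d) is the pattern's likelihood in document d, and \lambda controls the mixture.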
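To make the SRS idea concrete, below is a minimal, hypothetical sketch that extracts tuples of non-consecutive but syntactically related words from a dependency parse. spaCy, the function name, and the example sentence are illustrative assumptions standing in for the paper's semantic-graph machinery, not the authors' implementation.

    # Hypothetical illustration of SRS-style tuple extraction; the paper's
    # actual SRSs come from a semantic graph, for which a spaCy dependency
    # parse serves here only as a rough stand-in.
    import spacy

    nlp = spacy.load("en_core_web_sm")

    def extract_srs_like_tuples(sentence):
        """Return (head, relation, dependent) tuples, including pairs of
        words that are not adjacent in the surface string."""
        doc = nlp(sentence)
        tuples = []
        for token in doc:
            if token.dep_ == "punct" or token.head is token:
                continue
            # Keep the pair even when head and dependent are far apart,
            # which is what distinguishes SRSs from consecutive-word patterns.
            tuples.append((token.head.text, token.dep_, token.text))
        return tuples

    print(extract_srs_like_tuples("The old man quietly opened the heavy door."))
    # e.g. ('opened', 'nsubj', 'man') links 'man' and 'opened' even though
    # 'quietly' separates them in the surface order.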
