Temporal and Aspectual Entailment

Thomas Kober, Sander Bijl de Vroe, Mark Steedman


Abstract
Inferences regarding “Jane’s arrival in London” from predications such as “Jane is going to London” or “Jane has gone to London” depend on the tense and aspect of the predications. Tense determines the temporal location of the predication in the past, present, or future of the time of utterance. The aspectual auxiliaries, on the other hand, specify the internal constituency of the event, i.e. whether the event of “going to London” is completed and whether its consequences hold at that time. While tense and aspect are among the most important factors for determining natural language inference, there has been very little work to show whether modern embedding models capture these semantic concepts. In this paper we propose a novel entailment dataset and analyse the ability of contextualised word representations to perform inference on predications across aspectual types and tenses. We show that they encode a substantial amount of information relating to tense and aspect, but fail to consistently model inferences that require reasoning with these semantic properties.
Anthology ID:
W19-0409
Volume:
Proceedings of the 13th International Conference on Computational Semantics - Long Papers
Month:
May
Year:
2019
Address:
Gothenburg, Sweden
Editors:
Simon Dobnik, Stergios Chatzikyriakidis, Vera Demberg
Venue:
IWCS
SIG:
SIGSEM
Publisher:
Association for Computational Linguistics
Pages:
103–119
URL:
https://aclanthology.org/W19-0409
DOI:
10.18653/v1/W19-0409
Cite (ACL):
Thomas Kober, Sander Bijl de Vroe, and Mark Steedman. 2019. Temporal and Aspectual Entailment. In Proceedings of the 13th International Conference on Computational Semantics - Long Papers, pages 103–119, Gothenburg, Sweden. Association for Computational Linguistics.
Cite (Informal):
Temporal and Aspectual Entailment (Kober et al., IWCS 2019)
PDF:
https://aclanthology.org/W19-0409.pdf
Code
tttthomasssss/iwcs2019 + additional community code
Data
One Billion Word Benchmark, SNLI