Closed-domain event extraction uses a predefined event schema to discover and extract events of particular types from text. An event schema contains several event types and their corresponding event structures. We use ACE terminology to introduce an event structure: an event mention is expressed by an event trigger, the word or phrase that most clearly indicates the event's occurrence, together with a set of event arguments, the participants and attributes that each fill a specific argument role.
D. Ahn [The stages of event extraction] first proposed dividing the ACE event extraction task into four subtasks: trigger detection, event/trigger type identification, event argument detection, and argument role identification.
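To make the event structure and the four subtasks concrete, here is a minimal, self-contained Python sketch. The dataclasses, the tiny trigger lexicon, and the positional role rule are assumptions invented for illustration; this is not an implementation of ACE or of Ahn's system.

```python
# Illustrative sketch only: a toy ACE-style event structure and a rule-based
# stand-in for Ahn's four-subtask pipeline. The lexicon and role rules below
# are made up for the example, not part of ACE or any published system.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Argument:
    text: str  # an entity mention, time expression, or value participating in the event
    role: str  # the argument role it fills, e.g. "Attacker" or "Target"

@dataclass
class Event:
    trigger: str                  # word that most clearly expresses the event occurrence
    event_type: str               # e.g. "Conflict.Attack" in the ACE 2005 schema
    arguments: List[Argument] = field(default_factory=list)

# Toy trigger lexicon mapping trigger words to event types (assumed for illustration).
TRIGGER_LEXICON = {"fired": "Conflict.Attack", "married": "Life.Marry"}

def extract_events(tokens: List[str]) -> List[Event]:
    """Run the four subtasks in sequence over a tokenized sentence."""
    events = []
    for i, tok in enumerate(tokens):
        if tok.lower() not in TRIGGER_LEXICON:             # 1) trigger detection
            continue
        event_type = TRIGGER_LEXICON[tok.lower()]          # 2) event/trigger type identification
        candidates = [(j, t) for j, t in enumerate(tokens)
                      if t[:1].isupper()]                  # 3) argument detection (toy: capitalized tokens)
        arguments = [Argument(t, "Attacker" if j < i else "Target")
                     for j, t in candidates]               # 4) argument role identification (toy: by position)
        events.append(Event(tok, event_type, arguments))
    return events

if __name__ == "__main__":
    print(extract_events("Rebels fired rockets at Baghdad".split()))
```

In practice each of the four steps is a learned model rather than a lookup or positional rule, but the output structure is the same.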
Without a predefined event schema, open-domain event extraction aims to detect events from text and, in most cases, also to cluster similar events via extracted event keywords. Event keywords are the words or phrases that best describe an event, and they are sometimes further divided into triggers and arguments.
In the usual task decomposition, the first two tasks mainly focus on event detection, while the remaining three address event clustering. Although the relation among the five tasks is evident, each requires a distinct evaluation process and encourages different approaches to the particular problem.
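As a rough illustration of this open-domain setting (no schema; keyword-driven grouping), the sketch below treats each text as one event mention, takes its top TF-IDF terms as event keywords, and groups similar events with k-means. The example texts and cluster count are invented, and it assumes scikit-learn (>= 1.0 for `get_feature_names_out`); it is a simplification, not a method from the surveys listed below.

```python
# Toy open-domain sketch: one event per text, top TF-IDF terms as event
# keywords, k-means to cluster similar events. Assumes scikit-learn and numpy.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

texts = [
    "Earthquake of magnitude 6.1 strikes coastal town, buildings damaged",
    "Strong quake hits coastal region, several buildings collapse",
    "Central bank raises interest rates to curb inflation",
    "Interest rate hike announced by the central bank amid rising inflation",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(texts)                                       # one vector per candidate event
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(tfidf)   # event clustering

vocab = vectorizer.get_feature_names_out()
for text, row, label in zip(texts, tfidf.toarray(), labels):
    keywords = [vocab[i] for i in np.argsort(row)[::-1][:3]]                  # top-3 terms as event keywords
    print(f"cluster {label} | keywords {keywords} | {text}")
```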
<ahref="https://arxiv.org/pdf/2008.00364.pdf">An Overview of Event Extraction from Text,2019</a> by<i>Frederik Hogenboom, Flavius Frasincar, Uzay Kaymak, Franciska de Jong:
</a></summary><blockquote><palign="justify">
One common application of text mining is event extraction,which encompasses deducing specific knowledge concerning incidents re-ferred to in texts. Event extraction can be applied to various types ofwritten text, e.g., (online) news messages, blogs, and manuscripts. Thisliterature survey reviews text mining techniques that are employed forvarious event extraction purposes. It provides general guidelines on howto choose a particular event extraction technique depending on the user,the available content, and the scenario of use.
<ahref="https://arxiv.org/pdf/2008.00364.pdf">A Survey of Event Extraction from Text,2019</a> by<i>Wei Xiang, Bang Wang </a></summary><blockquote><palign="justify">
<ahref="https://arxiv.org/pdf/2008.00364.pdf">A Survey of Textual Event Extraction from Social Networks,2017</a> by<i>Mohamed Mejri, Jalel Akaichi </a></summary><blockquote><palign="justify">
<ahref="https://arxiv.org/pdf/2008.00364.pdf">A Survey of event extraction methods from text for decision support systems,2016</a> by<i>Frederik Hogenboom, Flavius Frasincar, Uzay Kaymak, Franciska de Jong, Emiel Caron </a></summary><blockquote><palign="justify">
<ahref="https://arxiv.org/pdf/2008.00364.pdf">A Survey of Textual Event Extraction from Social Networks,2014</a> by<i>Vera DanilovaMikhail AlexandrovXavier Blanco </a></summary><blockquote><palign="justify">
<details/>
<summary/>
<a>Open-domain Event Extraction and Embedding for Natural Gas Market Prediction, arXiv 2020</a></summary></details>
<details/>
<summary/>
<a>One for All: Neural Joint Modeling of Entities and Events, AAAI 2019</a></summary></details>
<details/>
<summary/>
<a>Doc2EDAG: An End-to-End Document-level Framework for Chinese Financial Event Extraction, EMNLP 2019</a></summary></details>
<details/>
<summary/>
<a>Exploiting the Matching Information in the Support Set for Few Shot Event Classification, PAKDD 2020</a></summary><blockquote><p align="justify">
Viet Lai, Franck Dernoncourt, Thien Huu Nguyen
</p></blockquote></details>
<details/>
<summary/>
<a>Towards Few-Shot Event Mention Retrieval: An Evaluation Framework and A Siamese Network Approach, 2020</a></summary></details>
#### 2019
<details/>
<summary/>
<a>Cross-lingual Structure Transfer for Relation and Event Extraction, EMNLP 2019</a></summary><blockquote><p align="justify">
Ananya Subburathinam, Di Lu, Heng Ji, Jonathan May, Shih-Fu Chang, Avirup Sil, Clare Voss
</p></blockquote></details>
<details/>
<summary/>
<a>A Hybrid Character Representation for Chinese Event Detection, IJCNLP 2019</a></summary><blockquote><p align="justify">
Xiangyu Xi, Tong Zhang, Wei Ye, Jinglei Zhang, Rui Xie, Shikun Zhang
</p></blockquote></details>
<details/>
<summary/>
<a>Event Detection with Trigger-Aware Lattice Neural Network, EMNLP 2019</a></summary><blockquote><p align="justify">
Ning Ding, Ziran Li, Zhiyuan Liu, Haitao Zheng, Zibo Lin
</p></blockquote></details>
<details/>
<summary/>
<a>Doc2EDAG: An End-to-End Document-level Framework for Chinese Financial Event Extraction, EMNLP 2019</a></summary><blockquote><p align="justify">
Shun Zheng, Wei Cao, Wei Xu, Jiang Bian
</p></blockquote></details>
<details/>
<summary/>
<a>Neural Cross-Lingual Event Detection with Minimal Parallel Resources, EMNLP 2019</a></summary><blockquote><p align="justify">
The scarcity in annotated data poses a great challenge for event detection (ED). Cross-lingual ED aims to tackle this challenge by transferring knowledge between different languages to boost performance. However, previous cross-lingual methods for ED demonstrated a heavy dependency on parallel resources, which might limit their applicability. In this paper, we propose a new method for cross-lingual ED, demonstrating a minimal dependency on parallel resources. Specifically, to construct a lexical mapping between different languages, we devise a context-dependent translation method; to treat the word order difference problem, we propose a shared syntactic order event detector for multilingual co-training. The efficiency of our method is studied through extensive experiments on two standard datasets. Empirical results indicate that our method is effective in 1) performing cross-lingual transfer concerning different directions and 2) tackling the extremely annotation-poor scenario.
</p></blockquote></details>
#### 2018
<details/>
<summary/>
<a>Zero-Shot Transfer Learning for Event Extraction, ACL 2018</a></summary><blockquote><p align="justify">
Lifu Huang, Heng Ji, Kyunghyun Cho, Ido Dagan, Sebastian Riedel, Clare R. Voss
</p></blockquote></details>
<details/>
<summary/>
<a>DCFEE: A Document-level Chinese Financial Event Extraction System based on Automatically Labeled Training Data, ACL 2018</a></summary><blockquote><p align="justify">
Hang Yang, Yubo Chen, Kang Liu, Yang Xiao, Jun Zhao
</p></blockquote></details>
<details/>
<summary/>
<a>Nugget Proposal Networks for Chinese Event Detection, ACL 2018</a></summary><blockquote><p align="justify">
Hongyu Lin, Yaojie Lu, Xianpei Han, Le Sun
</p></blockquote></details>
#### 2016
<details/>
<summary/>
<a>A Convolution BiLSTM Neural Network Model for Chinese Event Extraction, NLPCC 2016</a></summary></details>
<ahref="https://arxiv.org/abs/1907.11692">Neural Cross-Lingual Event Detection with Minimal Parallel Resources, EMNLP2019(<ahref="https://github.com/pytorch/fairseq">Github</a>)</summary><blockquote><palign="justify">
The scarcity in annotated data poses a great challenge for event detection (ED). Cross-lingual ED aims to tackle this challenge by transferring knowledge between different languages to boost performance. However, previous cross-lingual methods for ED demonstrated a heavy dependency on parallel resources, which might limit their applicability. In this paper, we propose a new method for cross-lingual ED, demonstrating a minimal dependency on parallel resources. Specifically, to construct a lexical mapping between different languages, we devise a context-dependent translation method; to treat the word order difference problem, we propose a shared syntactic order event detector for multilingual co-training. The efficiency of our method is studied through extensive experiments on two standard datasets. Empirical results indicate that our method is effective in 1) performing cross-lingual transfer concerning different directions and 2) tackling the extremely annotation-poor scenario.
<ahref="https://arxiv.org/abs/1907.11692">Neural Cross-Lingual Event Detection with Minimal Parallel Resources, EMNLP2019(<ahref="https://github.com/pytorch/fairseq">Github</a>)</summary><blockquote><palign="justify">
The scarcity in annotated data poses a great challenge for event detection (ED). Cross-lingual ED aims to tackle this challenge by transferring knowledge between different languages to boost performance. However, previous cross-lingual methods for ED demonstrated a heavy dependency on parallel resources, which might limit their applicability. In this paper, we propose a new method for cross-lingual ED, demonstrating a minimal dependency on parallel resources. Specifically, to construct a lexical mapping between different languages, we devise a context-dependent translation method; to treat the word order difference problem, we propose a shared syntactic order event detector for multilingual co-training. The efficiency of our method is studied through extensive experiments on two standard datasets. Empirical results indicate that our method is effective in 1) performing cross-lingual transfer concerning different directions and 2) tackling the extremely annotation-poor scenario.
<ahref="https://arxiv.org/abs/1907.11692">Neural Cross-Lingual Event Detection with Minimal Parallel Resources, EMNLP2019(<ahref="https://github.com/pytorch/fairseq">Github</a>)</summary><blockquote><palign="justify">
The scarcity in annotated data poses a great challenge for event detection (ED). Cross-lingual ED aims to tackle this challenge by transferring knowledge between different languages to boost performance. However, previous cross-lingual methods for ED demonstrated a heavy dependency on parallel resources, which might limit their applicability. In this paper, we propose a new method for cross-lingual ED, demonstrating a minimal dependency on parallel resources. Specifically, to construct a lexical mapping between different languages, we devise a context-dependent translation method; to treat the word order difference problem, we propose a shared syntactic order event detector for multilingual co-training. The efficiency of our method is studied through extensive experiments on two standard datasets. Empirical results indicate that our method is effective in 1) performing cross-lingual transfer concerning different directions and 2) tackling the extremely annotation-poor scenario.
<ahref="https://arxiv.org/abs/1907.11692">Neural Cross-Lingual Event Detection with Minimal Parallel Resources, EMNLP2019(<ahref="https://github.com/pytorch/fairseq">Github</a>)</summary><blockquote><palign="justify">
The scarcity in annotated data poses a great challenge for event detection (ED). Cross-lingual ED aims to tackle this challenge by transferring knowledge between different languages to boost performance. However, previous cross-lingual methods for ED demonstrated a heavy dependency on parallel resources, which might limit their applicability. In this paper, we propose a new method for cross-lingual ED, demonstrating a minimal dependency on parallel resources. Specifically, to construct a lexical mapping between different languages, we devise a context-dependent translation method; to treat the word order difference problem, we propose a shared syntactic order event detector for multilingual co-training. The efficiency of our method is studied through extensive experiments on two standard datasets. Empirical results indicate that our method is effective in 1) performing cross-lingual transfer concerning different directions and 2) tackling the extremely annotation-poor scenario.
<ahref="https://arxiv.org/abs/1907.11692">Neural Cross-Lingual Event Detection with Minimal Parallel Resources, EMNLP2019(<ahref="https://github.com/pytorch/fairseq">Github</a>)</summary><blockquote><palign="justify">
The scarcity in annotated data poses a great challenge for event detection (ED). Cross-lingual ED aims to tackle this challenge by transferring knowledge between different languages to boost performance. However, previous cross-lingual methods for ED demonstrated a heavy dependency on parallel resources, which might limit their applicability. In this paper, we propose a new method for cross-lingual ED, demonstrating a minimal dependency on parallel resources. Specifically, to construct a lexical mapping between different languages, we devise a context-dependent translation method; to treat the word order difference problem, we propose a shared syntactic order event detector for multilingual co-training. The efficiency of our method is studied through extensive experiments on two standard datasets. Empirical results indicate that our method is effective in 1) performing cross-lingual transfer concerning different directions and 2) tackling the extremely annotation-poor scenario.
<details/>
<summary/>
<a href="https://catalog.ldc.upenn.edu/LDC2006T06">ACE 2005 English Corpus</a></summary><blockquote><p align="justify">
The ACE 2005 Multilingual Training Corpus contains the complete set of English, Arabic and Chinese training data for the 2005 Automatic Content Extraction (ACE) technology evaluation. The corpus consists of data of various types annotated for entities, relations and events by the Linguistic Data Consortium (LDC) with support from the ACE Program and additional assistance from LDC.
</p></blockquote></details>