Unsupervised Domain Adaptation for Event Detection using Domain-specific Adapters

May 8, 2021

Due to the multi-dimensional variation of textual data, detecting event triggers in new domains becomes significantly more challenging. This prompts the need for research on domain adaptation methods for the event detection task, especially in the most practical unsupervised setting. Recently, large transformer-based language models, e.g., BERT, have become essential for achieving top performance in event detection. However, their unwieldy nature also prevents effective adaptation across domains. To this end, this work proposes a Domain-specific Adapter-based Adaptation (DAA) framework to improve the adaptability of BERT-based models for event detection across domains. By explicitly representing data from different domains with separate adapter modules in each layer of BERT, DAA introduces a novel joint representation learning mechanism and a Wasserstein distance-based technique for data selection in adversarial learning to substantially boost performance on target domains. Extensive experiments and analysis over different datasets (i.e., LitBank, TimeBank, and ACE-05) demonstrate the effectiveness of our approach.
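The core architectural idea in the abstract is to attach a separate lightweight adapter module per domain inside each BERT layer. As a rough illustration only, and not the authors' released DAA code, here is a minimal PyTorch sketch of domain-specific bottleneck adapters; the class names (`Adapter`, `DomainAdapterLayer`), the bottleneck size, and the two-domain setup are hypothetical choices for the example.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, plus a residual connection."""
    def __init__(self, hidden_size: int, bottleneck: int = 64):  # bottleneck size is an assumption
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.GELU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(self.act(self.down(h)))

class DomainAdapterLayer(nn.Module):
    """Keeps one adapter per domain; hidden states are routed to the adapter of their domain."""
    def __init__(self, hidden_size: int, domains=("source", "target")):
        super().__init__()
        self.adapters = nn.ModuleDict({d: Adapter(hidden_size) for d in domains})

    def forward(self, h: torch.Tensor, domain: str) -> torch.Tensor:
        return self.adapters[domain](h)

# Usage sketch: hidden states from a (frozen) BERT layer pass through the
# adapter of their domain before continuing to the next layer.
layer = DomainAdapterLayer(hidden_size=768)
h = torch.randn(2, 16, 768)            # (batch, seq_len, hidden)
h_source = layer(h, domain="source")
h_target = layer(h, domain="target")
```

In this kind of setup, only the small adapter parameters are domain-specific, which is what makes per-domain representation learning feasible without duplicating or fully fine-tuning the large BERT backbone.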

Nghia Ngo Trung, Duy Phung, Thien Huu Nguyen

ACL Findings 2021
