Efficient Methods for Natural Language Processing: A Survey
Published in Transactions of the Association for Computational Linguistics, 2023
This work provides a comprehensive survey of methods that improve efficiency in NLP.
Download here
Published in Computational Linguistics, 2022
In this work, we propose to re-order instances during annotation according to their difficulty. In experiments with simulated and real-world users, we show that such annotation curricula can significantly reduce the annotation time.
Download here
Published in Proceedings of the 58th Annual Meeting of the ACL, 2020
In this work, we operationalize active learning for scenarios where the human oracle cannot be a mere labeler (such as language learning). We devise sampling strategies that select instances beneficial to both the model and the user, and show that this improves model training over existing active learning strategies while, at the same time, producing exercises well suited to the learner.
Download here
Published in Proceedings of the 57th Annual Meeting of the ACL, 2019
In this work, we propose two novel generation strategies for C-Tests, a type of gap-filling exercise. In contrast to previous approaches, our strategies do not rely on a fixed gap placement and size. This allows teachers to generate C-Tests of multiple difficulty levels from a single input text. In a user study, we show that both proposed methods are capable of successfully manipulating the C-Test difficulty.
Download here