Efficient Methods for Natural Language Processing: A Survey

Published in Transactions of the Association for Computational Linguistics, 2023

Abstract

Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, using only scale to improve performance means that resource consumption also grows. Such resources include data, time, storage, and energy, all of which are naturally limited and unevenly distributed. This motivates research into efficient methods that require fewer resources to achieve similar results. This survey synthesizes and relates current methods and findings in efficient NLP. We aim both to provide guidance for conducting NLP under limited resources and to point toward promising research directions for developing more efficient methods.

Citation

@article{10.1162/tacl_a_00577,
    author = {Treviso, Marcos and Lee, Ji-Ung and Ji, Tianchu and van Aken, Betty and Cao, Qingqing and Ciosici, Manuel R. and Hassid, Michael and Heafield, Kenneth and Hooker, Sara and Raffel, Colin and Martins, Pedro H. and Martins, André F. T. and Forde, Jessica Zosa and Milder, Peter and Simpson, Edwin and Slonim, Noam and Dodge, Jesse and Strubell, Emma and Balasubramanian, Niranjan and Derczynski, Leon and Gurevych, Iryna and Schwartz, Roy},
    title = {{Efficient Methods for Natural Language Processing: A Survey}},
    journal = {Transactions of the Association for Computational Linguistics},
    volume = {11},
    pages = {826--860},
    year = {2023},
    month = {07},
    issn = {2307-387X},
    doi = {10.1162/tacl_a_00577},
    url = {https://doi.org/10.1162/tacl_a_00577},
    eprint = {https://direct.mit.edu/tacl/article-pdf/doi/10.1162/tacl_a_00577/2143614/tacl_a_00577.pdf},
}