Transformers in Natural Language Processing - Traitement du Langage Parlé
Book Section, Year: 2023

Transformers in Natural Language Processing

François Yvon

Abstract

This chapter presents an overview of the state of the art in natural language processing, focusing on one specific computational architecture, the Transformer model, which plays a central role in a wide range of applications. This architecture condenses many advances in neural learning methods and can be exploited in many ways: to learn representations for linguistic entities; to generate coherent utterances and answer questions; and to perform utterance transformations, as illustrated by automatic translation. These different facets of the architecture are presented in turn, which also allows us to discuss its limitations.
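At the heart of the Transformer model discussed in the chapter is the self-attention operation. The following NumPy sketch shows a single attention head over a short token sequence; all names, weight matrices, and dimensions here are illustrative assumptions, not taken from the chapter itself.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention (illustrative sketch).

    X  : (n_tokens, d_model) input token vectors
    Wq, Wk, Wv : (d_model, d_k) projection matrices
    Returns (n_tokens, d_k) contextual representations.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # pairwise token affinities
    scores -= scores.max(axis=-1, keepdims=True)        # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)      # each row sums to 1
    return weights @ V                                  # attention-weighted mixture

# Tiny example: 5 tokens of dimension 8 (arbitrary sizes).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)
```

The full architecture stacks many such heads with feed-forward layers and residual connections; this sketch only isolates the attention step that produces context-dependent representations for each token.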
Main file: Transformers-en.pdf (980.98 KB). Origin: files produced by the author(s).

Dates and versions

hal-04224531, version 1 (02-10-2023)

Cite

François Yvon. Transformers in Natural Language Processing. Mohamed Chetouani; Virginia Dignum; Paul Lukowicz; Carles Sierra. Human-Centered Artificial Intelligence. Advanced Lectures, 13500, Springer International Publishing, pp.81-105, 2023, Lecture Notes in Computer Science, 978-3-031-24348-6. ⟨10.1007/978-3-031-24349-3_6⟩. ⟨hal-04224531⟩