González-Barba, JÁ.; Hurtado Oliver, LF.; Pla Santamaría, F. (2021). TWilBert: Pre-trained deep bidirectional transformers for Spanish Twitter. Neurocomputing. 426:58-69. https://doi.org/10.1016/j.neucom.2020.09.078
Please use this identifier to cite or link to this item: http://hdl.handle.net/10251/187684
Title: TWilBert: Pre-trained deep bidirectional transformers for Spanish Twitter
Author: González-Barba, José Ángel
UPV entity:
Issue date:
Abstract:
[EN] In recent years, the Natural Language Processing community has been moving from uncontextualized word embeddings towards contextualized word embeddings. Among these contextualized architectures, BERT stands out due ...
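The abstract's distinction between uncontextualized and contextualized embeddings can be illustrated with a toy sketch. This is not the paper's model: the vectors and the mixing rule below are invented for demonstration only, to show why a static lookup table assigns a word the same vector in every sentence, while a contextual model (such as BERT or TWilBERT) produces different vectors for the same word depending on its neighbours.

```python
# Toy illustration (NOT the paper's model): static vs. contextualized embeddings.
# All vectors and the neighbour-averaging rule are made up for demonstration.

STATIC = {
    "river": [1.0, 0.0],
    "money": [0.0, 1.0],
    "bank":  [0.5, 0.5],
}

def static_embed(sentence):
    # A static table gives "bank" the same vector in every sentence.
    return [STATIC[w] for w in sentence]

def contextual_embed(sentence):
    # A crude stand-in for a contextual encoder: mix each word's vector
    # with the average of its neighbours, so the same word receives a
    # different vector depending on the surrounding words.
    vecs = [STATIC[w] for w in sentence]
    out = []
    for i, v in enumerate(vecs):
        neighbours = [u for j, u in enumerate(vecs) if j != i]
        avg = [sum(col) / len(neighbours) for col in zip(*neighbours)]
        out.append([(a + b) / 2 for a, b in zip(v, avg)])
    return out

# "bank" gets one static vector, but two different contextual vectors:
s1, s2 = ["bank", "river"], ["bank", "money"]
```

A real contextualized encoder replaces the naive neighbour average with stacked self-attention layers pre-trained on large corpora, which is what makes architectures like BERT effective.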
Keywords:
Rights: Attribution - NonCommercial - NoDerivatives (by-nc-nd)
Source: Neurocomputing. 426:58-69
DOI: 10.1016/j.neucom.2020.09.078
Publisher:
Publisher's version: https://doi.org/10.1016/j.neucom.2020.09.078
Project code:
Acknowledgements:
This work has been partially supported by the Spanish Ministerio de Ciencia, Innovación y Universidades and FEDER funds under project AMIC (TIN2017-85854-C4-2-R), and the Generalitat Valenciana under GiSPRO (PROMETEU/2018/176) ...
Type: