| Identifying Data | 2024/25 |
|---|---|
| Subject | Language Modelling |
| Code | 614544009 |
| Degree | |

| Descriptors | Cycle | Period | Year | Type | Credits |
|---|---|---|---|---|---|
| | Official Master's Degree | 2nd four-month term | First | Elective | 3 |
Basic bibliography

Jurafsky, Daniel & James H. Martin (2021). "N-gram Language Models." Speech and Language Processing, Chapter 3. https://web.stanford.edu/~jurafsky/slp3/
Jurafsky, Daniel & James H. Martin (2021). "Vector Semantics and Embeddings." Speech and Language Processing, Chapter 6. https://web.stanford.edu/~jurafsky/slp3/
Jurafsky, Daniel & James H. Martin (2021). "Neural Networks and Neural Language Models." Speech and Language Processing, Chapter 7. https://web.stanford.edu/~jurafsky/slp3/
Jurafsky, Daniel & James H. Martin (2021). "Sequence Labeling for Parts of Speech and Named Entities." Speech and Language Processing, Chapter 8. https://web.stanford.edu/~jurafsky/slp3/
Devlin, Jacob, Ming-Wei Chang, Kenton Lee & Kristina Toutanova (2019). "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding." In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186, Minneapolis, Minnesota. Association for Computational Linguistics.
Erk, Katrin (2012). "Vector space models of word meaning and phrase meaning: A survey." Language and Linguistics Compass 6(10): 635–653.
Complementary bibliography

Baroni, Marco, Raffaella Bernardi & Roberto Zamparelli (2014). "Frege in space: A program for compositional distributional semantics." Linguistic Issues in Language Technologies 9(6): 5–110.
Baroni, Marco, Georgiana Dinu & Germán Kruszewski (2014). "Don't count, predict! A systematic comparison of context-counting vs. context-predicting semantic vectors." In Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 238–247, Baltimore, Maryland. Association for Computational Linguistics.
Church, Kenneth Ward, Zeyu Chen & Yanjun Ma (2021). "Emerging trends: A gentle introduction to fine-tuning." Natural Language Engineering 27: 763–778.
Hirschberg, Julia & Christopher D. Manning (2015). "Advances in natural language processing." Science 349(6245): 261–266.
Linzen, Tal (2016). "Issues in evaluating semantic spaces using word analogies." In Proceedings of the 1st Workshop on Evaluating Vector-Space Representations for NLP, pp. 13–18, Berlin, Germany. Association for Computational Linguistics.
Mikolov, Tomas, Wen-tau Yih & Geoffrey Zweig (2013). "Linguistic Regularities in Continuous Space Word Representations." In Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 746–751, Atlanta, Georgia. Association for Computational Linguistics.
Taher Pilehvar, Mohammad & Jose Camacho-Collados (2021). Embeddings in Natural Language Processing: Theory and Advances in Vector Representations of Meaning. Morgan & Claypool (Synthesis Lectures on Human Language Technologies, volume 47).