ד"ר אייל קולמן עיבוד שפות טבעיות בטכניקות למידה עמוקה 83-374
- Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, 2016
- Attention Is All You Need, Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin, 2017
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova, 2018
- Efficient Estimation of Word Representations in Vector Space, Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean, 2013 (Word2vec paper)
- word2vec Explained: Deriving Mikolov et al.'s Negative-Sampling Word-Embedding Method, Yoav Goldberg and Omer Levy, 2014
- Deep Learning for Search, Tommaso Teofili, 2019
- PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization, Jingqing Zhang, Yao Zhao, Mohammad Saleh, and Peter J. Liu, 2020
- Language Models are Unsupervised Multitask Learners, Alec Radford et al., 2019 (GPT-2 paper)
- Language Models are Few-Shot Learners, Tom Brown et al., 2020 (GPT-3 paper)
Last updated: 10/11/2024