A study log of machine learning / deep learning papers: summaries and implementations.
Title | Year | Task | Link | Code Review |
---|---|---|---|---|
Neural Machine Translation by Jointly Learning to Align and Translate | 2014 | #NLP | Link | |
Generative Adversarial Nets | 2014 | #Generative Models | Link | Link |
Conditional Generative Adversarial Nets | 2014 | #Generative Models | Link | |
Effective Approaches to Attention-based Neural Machine Translation | 2015 | #NLP | Link | |
Deep Residual Learning for Image Recognition | 2015 | #Computer Vision | Link | |
Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks | 2015 | #Generative Models | Link | |
U-Net: Convolutional Networks for Biomedical Image Segmentation | 2015 | #Computer Vision | Link | |
Deep Neural Networks for YouTube Recommendations | 2016 | #Recommendation Systems | Link | |
Item2Vec: Neural Item Embedding for Collaborative Filtering | 2016 | #Recommendation Systems | Link | |
Wide & Deep Learning for Recommender Systems | 2016 | #Recommendation Systems | Link | |
Session-based Recommendations with Recurrent Neural Networks | 2016 | #Recommendation Systems | Link | |
Attention Is All You Need | 2017 | #NLP | Link | |
Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks | 2017 | #Generative Models | Link | |
Mask R-CNN | 2017 | #Computer Vision | Link | |
Improving Language Understanding by Generative Pre-Training | 2018 | #NLP | Link | |
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | 2018 | #NLP | Link | |
Language Models are Unsupervised Multitask Learners | 2019 | #NLP | Link | |
Language Models are Few-Shot Learners | 2020 | #NLP | Link | |
An Image Is Worth 16x16 Words: Transformers for Image Recognition at Scale | 2021 | #Computer Vision | Link | |