Contrastive learning review As a form of unsupervised learning, contrastive learning plays an increasingly important role in deep learning. Here's a review of contrastive learning in CV since 2018, covering 4 stages a… 2022-05-03 #deep learning #paper reading
Switch blog theme to FLUID The former "yilia" theme became buggy after it was no longer maintained. I have switched to this "FLUID" theme for now; hopefully it will last longer. 2022-04-30 #hexo #blog
Masked Autoencoder (MAE) Published in Dec 2021, this new work by Kaiming He drew a lot of attention from the community. The astounding results in unsupervised transfer learning and the capability of reconstructing highly mas… 2022-04-27 #deep learning #paper reading
Vision Transformer Presented in 2021, the Vision Transformer (ViT) is among the most influential work in the CV field in recent years. Its variants outperform the dominant convolutional networks in almost all CV tasks suc… 2022-04-21 #deep learning #paper reading
GPT1-3 GPT-3 is the most popular generative language model now. With more than 100 billion parameters, its performance has proven to be impressive, and by now there are hundreds of works (commercial or a… 2022-04-18 #deep learning #paper reading
Bert BERT is the most important achievement in the NLP field in the last 4 years. It made transfer learning for NLP tasks possible and made the transformer framework dominant in the NLP field. This is a… 2022-04-15 #deep learning #paper reading
Introduction to GNN This is a tech blog written by the Google Research team in 2021 introducing graph neural networks. GNNs have gradually become popular over the last 4 years. Personally, I think the graph structure lo… 2022-04-14 #deep learning #paper reading
Transformer The transformer is the most important achievement in deep learning in the last 5 years. It introduced a fourth class of deep learning models besides MLPs, CNNs, and RNNs, and has had a huge impact on the entire deep learning… 2022-04-12 #deep learning #paper reading
ResNet Since its introduction in 2015, ResNet and its variants have become some of the most widely used deep neural networks. The idea of "residual" connections has proven to be effective and important for deep NNs. This… 2022-04-09 #deep learning #paper reading
AlexNet It has been 10 years since AlexNet was introduced. It is one of the cornerstones of this surge of deep learning. This is a series of paper reading notes, hopefully, to push me to read papers c… 2022-04-07 #deep learning #paper reading