A collection of graph embedding, deep learning, recommendation, knowledge graph, heterogeneous graph papers with reference implementations
-
2017 - KDD - Dynamic Attention Deep Model for Article Recommendation by Learning Human Editors’ Demonstration
- Xuejian Wang, Lantao Yu, Kan Ren
- news recommendation
-
2018 - WWW - DKN: Deep Knowledge-Aware Network for News Recommendation
- Hongwei Wang, Fuzheng Zhang, Xing Xie, Minyi Guo
- news recommendation; knowledge graph
-
2018 - KDD - Deep Interest Network for Click-Through Rate Prediction
- Guorui Zhou, Kun Gai, et al
- click prediction
-
2018 - Recsys - Learning Consumer and Producer Embeddings for User-Generated Content Recommendation
- Wang-Cheng Kang, Julian McAuley
- user based
-
2019 - ICML - Compositional Fairness Constraints for Graph Embeddings
-
2019 - KDD - NPA: Neural News Recommendation with Personalized Attention
-
2013 - WSDM - News Recommendation via Hypergraph Learning: Encapsulation of User Behavior and News Content
- Lei Li, Tao Li
-
2018 - CIKM - Weave&Rec: A Word Embedding based 3-D Convolutional Network for News Recommendation
-
2018 - IJCAI - A3NCF: An Adaptive Aspect Attention Model for Rating Prediction
- Zhiyong Cheng, Ying Ding, Xiangnan He, Lei Zhu, Xuemeng Song
-
2019 - WSDM - Social Attentional Memory Network: Modeling Aspect- and Friend-level Differences in Recommendation
- Chong Chen, Min Zhang, et al
-
2019 - WWW - Graph Neural Networks for Social Recommendation
- Wenqi Fan, Yao Ma, Jiliang Tang
-
2019 - CIKM - Spam Review Detection with Graph Convolutional Networks
- spam review is hard to detect using the content itself, considering the content with other reviews is important
-
2019 - EMNLP - Reviews Meet Graphs: Enhancing User and Item Representations for Recommendation with Hierarchical Attentive Graph Neural Network
- Chuhan Wu, Fangzhao Wu, Tao Qi, Suyu Ge, Yongfeng Huang, and Xing Xie
-
2019 - KDD - DAML: Dual Attention Mutual Learning between Ratings and Reviews
-
2018 - WWW - Neural Attentional Rating Regression with Review-level Explanations
- 2019 - TKDE - Personalizing Graph Neural Networks with Attention Mechanism for Session-based Recommendation
-
2020 - AAAI - Efficient Heterogeneous Collaborative Filtering without Negative Sampling for Recommendation
-
2018 - CIKM - Sequential Recommendation Through Mixtures of Heterogeneous Item Relationships
- Wang-Cheng Kang, Mengting Wan, Julian McAuley
-
2018 - WWW - Latent Relational Metric Learning via Memory-based Attention for Collaborative Ranking
- Yi Tay, Luu Anh Tuan, Siu Cheung Hui
- 2019 - NIPS - Learning Disentangled Representations for Recommendation
- Jianxin Ma, Peng Cui
-
2018 - AAAI - Explainable Recommendation Through Attentive Multi-View Learning
-
2018 - CIKM - RippleNet: Propagating User Preferences on the Knowledge Graph for Recommender Systems
-
2019 - AAAI - Explainable Reasoning over Knowledge Graphs for Recommendation
-
Min Zhang website (aim at explainable recommender system)
-
2019 - Representation Learning on Graphs: Methods and Applications
-
2019 - A Comprehensive Survey on Graph Neural Networks
- Zonghan Wu, Philip S. Yu
- 2019 - NIPS - Graph Representation Learning
-
2019 - Arxiv - Revisiting Graph Neural Networks: All We Have is Low-Pass Filters
- The authors offer several interpretations of GNNs, two of which are particularly interesting: (1) input features consist of low-frequency true features plus noise, and the true features carry sufficient information for the learning task; (2) multiplying graph signals with propagation matrices corresponds to low-pass filtering (see the sketch below).
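A minimal sketch (not code from the paper) of why propagation with the augmented normalized adjacency acts as a low-pass filter; the toy adjacency matrix is an assumption:

```python
# The augmented normalized adjacency acts as a low-pass filter, so repeated
# propagation smooths the graph signal while damping high-frequency components.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_tilde = A + np.eye(len(A))                   # add self-loops
d = A_tilde.sum(axis=1)
D_inv_sqrt = np.diag(d ** -0.5)
S = D_inv_sqrt @ A_tilde @ D_inv_sqrt          # \hat{A} = D^{-1/2}(A+I)D^{-1/2}

# Eigenvalues of S lie in (-1, 1], so S^k attenuates the components of a graph
# signal X associated with small eigenvalues (high graph frequencies).
eigvals = np.linalg.eigvalsh(S)
X = np.random.randn(len(A), 3)                 # toy node features
X_smoothed = S @ S @ X                         # two propagation steps
print(eigvals)
```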
-
2019 - Arxiv - Feature-Attention Graph Convolutional Networks for Noise Resilient Learning
- The authors argue that previous GCNs implicitly assume that each node's content is clean, that its feature dimensions are mutually independent, and that all dimensions contribute equally to the node representation, and they challenge this assumption. The model first encodes node content with an LSTM and then designs feature-level attention for aggregation, so each node's features are weighted differently during aggregation and the most representative dimensions are selected. For example, two paper nodes may share many common words, but those shared features should be neither the reason the papers receive the same class nor the reason they are linked; feature attention assigns different weights to different feature dimensions, unlike GAT, which assigns different weights to different nodes (feature-dimension level vs. node level). The authors also analyze the experimental datasets: Cora and Citeseer have sparse graphs, so node content contributes relatively more, whereas DBLP has sparse content and a denser graph, so the graph structure contributes more. A minimal sketch of feature-dimension attention follows.
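A minimal sketch of feature-dimension attention under my own assumptions about the scoring function (the paper's LSTM content encoder is omitted here):

```python
# Aggregate neighbor features with a learned per-dimension attention weight,
# in contrast to GAT's per-node attention coefficients.
import torch
import torch.nn as nn

class FeatureAttentionAggregator(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # scores one attention weight per feature dimension from the
        # concatenation of the center node and a neighbor
        self.att = nn.Linear(2 * dim, dim)

    def forward(self, h, adj):
        # h: [N, d] node features, adj: [N, N] binary adjacency (with self-loops)
        N, d = h.shape
        pair = torch.cat([h.unsqueeze(1).expand(N, N, d),
                          h.unsqueeze(0).expand(N, N, d)], dim=-1)   # [N, N, 2d]
        alpha = torch.sigmoid(self.att(pair))                        # per-dimension weights
        alpha = alpha * adj.unsqueeze(-1)                            # mask non-edges
        alpha = alpha / alpha.sum(dim=1, keepdim=True).clamp(min=1e-9)
        return (alpha * h.unsqueeze(0)).sum(dim=1)                   # [N, d]

h = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
adj.fill_diagonal_(1.0)
out = FeatureAttentionAggregator(8)(h, adj)
```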
-
2019 - NIPS-GRL - Learnable Aggregator for GCN
- Li Zhang
- Goes one step beyond GAT by applying attention to each feature dimension rather than to each node.
-
2020 - ICLR - Geom-GCN: Geometric Graph Convolutional Networks
- Hongbin Pei, Bingzhe Wei, Kevin Chen-Chuan Chang, Yu Lei, Bo Yang
- Graph convolutional networks have been successfully applied to many graph representation learning tasks, but two weaknesses still limit their expressive power: (1) the mean/max pooling used during neighborhood aggregation discards structural information and cannot distinguish certain non-isomorphic neighborhoods; (2) for disassortative graphs, the usual neighborhood definition cannot exploit nodes that are distant in the graph yet still relevant. To address both limitations, the paper proposes a new way to define and aggregate neighbors, in three steps: node embedding, selecting neighbors jointly from the graph structure and the embedding space, and a two-level aggregation. On eight datasets with a 6:2:2 split, the method outperforms GCN and GAT on node classification, and ablations verify each module. Note that the pipeline has two stages (embedding, then the GNN) and is not end-to-end; the embedding step has to be tuned manually.
-
2019 - ICML - Disentangled Graph Convolutional Networks
- Jianxin Ma, Peng Cui
- The authors assume that edges exist for different latent reasons and should therefore be disentangled. Unlike standard GCN message passing, the features are split into K channels, and within each channel the aggregation uses attention based on feature similarity; a minimal sketch of the routing follows.
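A simplified sketch of channel-wise (disentangled) neighborhood routing; the number of channels, iterations, and the toy graph are assumptions, and the paper's full routing algorithm has more details:

```python
import torch
import torch.nn.functional as F

def disen_routing(x, neighbors, K=4, iters=3):
    # x: [N, d] node features with d divisible by K; neighbors: dict node -> list of node ids
    N, d = x.shape
    z = F.normalize(x.view(N, K, d // K), dim=-1)        # per-channel unit features
    c = z.clone()                                         # channel-wise centers per node
    for _ in range(iters):
        new_c = z.clone()
        for v, nbrs in neighbors.items():
            if not nbrs:
                continue
            zu = z[nbrs]                                  # [deg, K, d/K]
            # each neighbor's attention over the K channels of node v
            p = torch.softmax((zu * c[v]).sum(-1), dim=-1)            # [deg, K]
            new_c[v] = F.normalize(z[v] + (p.unsqueeze(-1) * zu).sum(0), dim=-1)
        c = new_c
    return c.reshape(N, d)

x = torch.randn(6, 16)
nbrs = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2], 4: [5], 5: [4]}
h = disen_routing(x, nbrs)
```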
-
2018 - ICML - Representation Learning on Graphs with Jumping Knowledge Networks
- Keyulu Xu, Chengtao Li, Yonglong Tian, Tomohiro Sonobe,Ken-ichi Kawarabayashi, Stefanie Jegelka
- jump connection;
-
2019 - ICLR - Predict then Propagate: Graph Neural Networks meet Personalized PageRank
- Johannes Klicpera, Aleksandar Bojchevski, Stephan Günnemann
- page rank;
-
2019 - NIPS - Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks
- Sitao Luan, Mingde Zhao, Xiao-Wen Chang, Doina Precup
- Proposes two DenseNet-like graph convolutional architectures for node classification. The motivation is spectral, and the work belongs to the same line as LanczosNet.
-
2019 - NIPS - Diffusion Improves Graph Learning
- Johannes Klicpera, Stefan Weißenberger, Stephan Günnemann
- The paper proposes to first apply a diffusion to the GNN's input graph (transition matrix T) and then feed the diffused graph S to the GNN; a sketch of the personalized-PageRank instance of the diffusion is given below.
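A hedged sketch of one diffusion variant (personalized PageRank with threshold-based sparsification); the exact diffusion kernel and sparsification used in the paper may differ:

```python
# S = alpha * (I - (1 - alpha) * T)^{-1}, followed by sparsification of small entries.
import numpy as np

def ppr_diffusion(A, alpha=0.15, eps=1e-4):
    # A: dense adjacency; T: symmetric transition matrix D^{-1/2} A D^{-1/2}
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    T = D_inv_sqrt @ A @ D_inv_sqrt
    S = alpha * np.linalg.inv(np.eye(len(A)) - (1 - alpha) * T)
    S[S < eps] = 0.0          # keep only significant entries before feeding S to the GNN
    return S

A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
S = ppr_diffusion(A)
```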
-
2019 - ICLR - Graph Wavelet Neural Network
-
2018 - AAAI - GraphGAN: Graph Representation Learning with Generative Adversarial Nets
- Hongwei Wang, Jia Wang, Jialin Wang, Miao Zhao, Weinan Zhang, Fuzheng Zhang, Xing Xie, Minyi Guo
-
2018 - CIKM - Semi-supervised Learning on Graphs with Generative Adversarial Nets
-
2019 - ICML - Simplifying Graph Convolutional Networks
- Felix Wu, Tianyi Zhang, Amauri Holanda de Souza Jr., Christopher Fifty, Tao Yu, Kilian Q. Weinberger
-
2019 - ICLR - HOW POWERFUL ARE GRAPH NEURAL NETWORKS?
- Keyulu Xu, Weihua Hu, Jure Leskovec, Stefanie Jegelka
-
2019 - ICLR - LanczosNet: Multi-Scale Deep Graph Convolutional Networks
- Renjie Liao, et al
-
2019 - AAAI - GeniePath: Graph Neural Networks with Adaptive Receptive Paths
- Le Song, Yuan Qi, et al
-
2018 - ICLR - Graph Attention Networks
- Petar Veličković, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, Yoshua Bengio
-
2018 - NIPS - Hierarchical Graph Representation Learning with Differentiable Pooling
- Rex Ying, Jiaxuan You, Christopher Morris, Xiang Ren, William L. Hamilton, Jure Leskovec
-
2018 - NIPS - GLoMo: Unsupervisedly Learned Relational Graphs as Transferable Representations
- Zhilin Yang, Jake Zhao, Bhuwan Dhingra, Kaiming He, William W. Cohen, Ruslan Salakhutdinov, Yann LeCun
-
2017 - NIPS - GraphSAGE: Inductive Representation Learning on Large Graphs
- William L. Hamilton, Rex Ying, Jure Leskovec
-
2018 - NIPS - Pitfalls of Graph Neural Network Evaluation
- Shchur Oleksandr et al
-
2017 - ICLR - SEMI-SUPERVISED CLASSIFICATION WITH GRAPH CONVOLUTIONAL NETWORKS
-
2019 - NIPS - Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers
- Liwei Wu, Shuqing Li, Cho-Jui Hsieh, James Sharpnack
-
2019 - Chemical Science - A Bayesian graph convolutional network for reliable prediction of molecular properties with uncertainty quantification
- 2019 - AISTATS - Confidence-based Graph Convolutional Networks for Semi-supervised learning
- Shikhar Vashishth, Prateek Yadav, Manik Bhandari, Partha Talukdar
- Cora-ML has a larger label mismatch, i.e., a higher probability that two connected nodes have different labels. The proposed method is claimed to perform better on such datasets, and the experimental analysis shows that the attention coefficients learned by GAT do not prevent this phenomenon.
- 2019 - ICLR - Bayesian Graph Convolutional Neural Networks Using Non-Parametric Graph Learning
- Soumyasundar Pal, Florence Regol & Mark Coates
- 2019 - NIPS - Variational Spectral Graph Convolutional Networks
- Louis Tiao, Pantelis Elinas, Harrison Nguyen, Edwin V. Bonilla
- 2019 - NIPS - Graph Agreement Models for Semi-Supervised Learning
- Otilia Stretcu, Krishnamurthy Viswanathan, Dana Movshovitz-Attias, Emmanouil Platanios, Sujith Ravi, Andrew Tomkins
- Proposes a graph-based semi-supervised learning framework. Graph-based semi-supervised algorithms such as label propagation perform well under the assumption that a node's label can be inferred from the labels of its neighbors, but in real-world graphs an edge between two nodes does not necessarily mean they belong to the same class. Building on a WSDM 2018 work, this paper proposes a graph agreement model: an auxiliary task that predicts whether an edge is helpful for the current main task (e.g., classification), trained jointly with the main task via co-training. It achieves excellent semi-supervised results, surpassing GAT.
- 2019 - AAAI - Bayesian graph convolutional neural networks for semi-supervised classification
- Yingxue Zhang, Soumyasundar Pal, Mark Coates, Deniz Üstebay
- Provides an example of the framework for an assortative mixed-membership stochastic block model and explains how approximate inference can be performed with a combination of stochastic optimization (for the graph parameters) and Monte Carlo dropout (for the GCNN weights).
- 2019 - ICML - Learning Discrete Structures for Graph Neural Networks
- 2019 - ICML - Are Graph Neural Networks Miscalibrated
- 2019 - ICLR - DEEP GAUSSIAN EMBEDDING OF GRAPHS: UNSUPERVISED INDUCTIVE LEARNING VIA RANKING
- Proposes probabilistic modeling of node embeddings, so the variance has a concrete meaning (uncertainty), plus a ranking constraint: the similarity between 1-hop node embeddings should exceed that between 2-hop node embeddings, with the KL divergence used as the similarity/energy. Unlike node embedding methods such as node2vec, it can exploit node attributes and is therefore inductive; unlike GraphSAGE, it can still produce an embedding for a test node even when no links are observed. A sketch of the KL-based ranking energy is below.
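A minimal sketch of the KL energy between diagonal Gaussian embeddings and a ranking-style loss; the tensor shapes and the specific loss form are assumptions:

```python
# Energy between nodes i and j is KL(N_i || N_j); the ranking loss pushes
# 1-hop pairs to lower energy than 2-hop pairs.
import torch

def kl_diag_gaussian(mu_i, var_i, mu_j, var_j):
    # KL( N(mu_i, diag(var_i)) || N(mu_j, diag(var_j)) ), all tensors of shape [d]
    return 0.5 * (torch.sum(var_i / var_j)
                  + torch.sum((mu_j - mu_i) ** 2 / var_j)
                  - mu_i.numel()
                  + torch.sum(torch.log(var_j) - torch.log(var_i)))

mu = torch.randn(3, 8)                 # means for nodes 0, 1, 2
var = torch.rand(3, 8) + 0.1           # positive variances
e_1hop = kl_diag_gaussian(mu[0], var[0], mu[1], var[1])   # node 1: 1-hop neighbor
e_2hop = kl_diag_gaussian(mu[0], var[0], mu[2], var[2])   # node 2: 2-hop neighbor
# square-exponential ranking loss: penalize large 1-hop energy, small 2-hop energy
loss = e_1hop ** 2 + torch.exp(-e_2hop)
```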
- 2019 - KDD - Robust Graph Convolutional Networks Against Adversarial Attacks
- Each GCN layer represents features with a Gaussian distribution; the benefit of a distributional representation is that it can absorb the effect of adversarial attacks. In addition, a variance-based attention mechanism is designed to stop the adversarial perturbation from propagating through the graph.
-
2020 - ICLR - PairNorm: Tackling Oversmoothing in GNNs
- Lingxiao Zhao, Leman Akoglu
- addresses oversmoothing
-
2020 - ICLR - DropEdge: Towards Deep Graph Convolutional Networks on Node Classification
- Yu Rong, Wenbing Huang, Tingyang Xu, Junzhou Huang
- addresses oversmoothing by randomly dropping edges during training; a minimal sketch is below
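A minimal sketch of random edge dropping; the edge-list format and drop rate are assumptions:

```python
import torch

def drop_edge(edge_index, drop_rate=0.2):
    # edge_index: [2, E] tensor of (src, dst) pairs; keep each edge with prob 1 - drop_rate
    mask = torch.rand(edge_index.size(1)) >= drop_rate
    return edge_index[:, mask]

edge_index = torch.tensor([[0, 1, 2, 2, 3],
                           [1, 0, 3, 1, 2]])
sparser = drop_edge(edge_index, drop_rate=0.4)   # re-sampled every epoch during training
```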
-
2020 - ICLR - Measuring and Improving the Use of Graph Information in Graph Neural Networks
-
2020 - ICLR - Characterize and Transfer Attention in Graph Neural Networks
- On the citation datasets the attention learned by GAT barely differentiates between neighboring nodes, whereas on PPI it clearly does. The attention pattern correlates with the dataset: using the attention scores as a feature vector clearly separates the different datasets. The authors also filter edges using GAT's attention scores and find that keeping only 30-40% of the edges still gives good performance.
-
2020 - AAAI - Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View
- Deli Chen, Yankai Lin, Wei Li, Peng Li, Jie Zhou, Xu Sun
- For node classification, the authors observe that intra-class edges are useful while inter-class edges are noisy. They propose two metrics to measure smoothing and two remedies for oversmoothing: (1) a regularizer that reduces the cosine distance between representations of nodes that are close in the graph and enlarges it for nodes that are far apart, and (2) graph reconstruction that keeps high-confidence edges and removes low-confidence ones. Both slow down the performance degradation as the number of layers grows. A sketch of the distance-based regularizer follows.
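A hedged sketch of a cosine-distance regularizer in the spirit of the paper; how near/far node pairs are sampled and the loss weight are assumptions:

```python
# Encourage small cosine distance between nodes that are close in the graph and
# large cosine distance between remote node pairs; add this term to the task loss.
import torch
import torch.nn.functional as F

def smoothness_regularizer(h, near_pairs, far_pairs):
    # h: [N, d] node representations; *_pairs: [P, 2] index tensors
    def mean_cos_dist(pairs):
        a, b = h[pairs[:, 0]], h[pairs[:, 1]]
        return (1.0 - F.cosine_similarity(a, b, dim=-1)).mean()
    # minimize distance for near pairs, maximize it for far pairs
    return mean_cos_dist(near_pairs) - mean_cos_dist(far_pairs)

h = torch.randn(10, 16, requires_grad=True)
near = torch.tensor([[0, 1], [2, 3]])
far = torch.tensor([[0, 9], [1, 8]])
reg = smoothness_regularizer(h, near, far)   # weight it and add to the classification loss
```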
- 2019 - KDD - Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks
-
2019 - SIGIR - Neural Graph Collaborative Filtering
-
2019 - NIPS - Inductive Matrix Completion Based on Graph Neural Networks
- Muhan Zhang, Yixin Chen
- The paper proposes an inductive matrix completion method based on graph neural networks that uses no side information. Matrix completion is a classic problem with applications such as recommender systems. Earlier approaches such as low-rank matrix factorization decompose the matrix into products of latent vectors and are typically transductive: they cannot generalize to new rows or columns. GC-MC (KDD 2018) applies a node-level GCN to the bipartite graph to learn user and item representations but is still transductive, while PinSage (also KDD 2018) is inductive but relies on side features, whose quality strongly affects performance. This paper proposes a GCN-based inductive matrix completion method that generalizes to new users and items without features. It consists of three steps: (1) extract the subgraph enclosing the target user and item; (2) label the nodes in the subgraph; (3) predict the rating with a graph-level GCN. The method achieves the best results on four datasets; notably, a model trained on MovieLens and tested on Douban still outperforms a large portion of the baselines, showing good transfer ability. A sketch of steps (1)-(2) follows.
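A simplified sketch of steps (1)-(2), assuming ratings are stored in a plain Python dict and restricting the enclosing subgraph to one hop:

```python
# Extract the 1-hop enclosing subgraph of a (user, item) pair and label nodes by
# role and hop (0: target user, 1: target item, 2: 1-hop user, 3: 1-hop item).
def enclosing_subgraph(ratings, user, item):
    # ratings: dict {(u, i): r}
    users = {user} | {u for (u, i) in ratings if i == item}
    items = {item} | {i for (u, i) in ratings if u == user}
    sub = {(u, i): r for (u, i), r in ratings.items() if u in users and i in items}
    labels = {("u", u): (0 if u == user else 2) for u in users}
    labels.update({("i", i): (1 if i == item else 3) for i in items})
    return sub, labels

ratings = {(0, 0): 5, (0, 1): 3, (1, 0): 4, (2, 1): 2}
sub, labels = enclosing_subgraph(ratings, user=0, item=0)
```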
-
2018 - KDD - DeepInf: Social Influence Prediction with Deep Learning
- Jiezhong Qiu, Jie Tang, et al
-
2018 - ICDM - Signed Graph Convolutional Network
- Tyler Derr, Yao Ma, Jiliang Tang
-
2019 - AAAI - Graph Convolutional Networks for Text Classification
- Liang Yao, Chengsheng Mao, Yuan Luo
-
2018 - KDD - Graph Convolutional Matrix Completion
- Rianne van den Berg, Thomas N. Kipf, Max Welling
-
2018 - KDD - PinSage: Graph Convolutional Neural Networks for Web-Scale Recommender Systems
- Rex Ying, Ruining He, Kaifeng Chen, Pong Eksombatchai, William L. Hamilton, Jure Leskovec
-
2020 - ICLR - Composition-based Multi-Relational Graph Convolutional Networks
- Shikhar Vashishth, Soumya Sanyal, Vikram Nitin, Partha Talukdar
- proposes to jointly embed relations and compose them with neighbor embeddings during aggregation (see the sketch below), which avoids the over-parameterization of relation-specific weight matrices
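A hedged reconstruction of the referenced aggregation, following the CompGCN formulation (phi is a composition operator such as subtraction, multiplication, or circular correlation; W_{\lambda(r)} is shared across relation directions rather than being relation-specific):

```latex
h_v = f\Big(\sum_{(u,\,r) \in \mathcal{N}(v)} W_{\lambda(r)}\, \phi(x_u, z_r)\Big),
\qquad z'_r = W_{\mathrm{rel}}\, z_r
```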
-
2019 - AAAI - End-to-end Structure-Aware Convolutional Networks for Knowledge Base Completion
-
2018 - ESWC - Modeling Relational Data with Graph Convolutional Networks
- Michael Schlichtkrull, Thomas N. Kipf
-
2018 - NIPS - SimplE Embedding for Link Prediction in Knowledge Graphs
- Seyed Mehran Kazemi, David Poole
-
2018 - AAAI - Convolutional 2D Knowledge Graph Embeddings
- Tim Dettmers, Pasquale Minervini, Pontus Stenetorp, Sebastian Riedel
-
2013 - NIPS - Translating Embeddings for Modeling Multi-relational Data
- Antoine Bordes, Nicolas Usunier, Alberto Garcia-Duran, Jason Weston, Oksana Yakhnenko
-
2020 - ICLR - Hyper-SAGNN: a self-attention based graph neural network for hypergraphs
- Ruochi Zhang, Yuesong Zou, Jian Ma
-
2019 - AAAI - Hypergraph Neural Networks
- Yifan Feng, Haoxuan You, Zizhao Zhang, Rongrong Ji, Yue Gao
-
2018 - AAAI - Structural Deep Embedding for Hyper-Networks
- Ke Tu, Peng Cui, Xiao Wang, Fei Wang, Wenwu Zhu
-
2020 - AAAI - An Attention-based Graph Neural Network for Heterogeneous Structural Learning
- Huiting Hong, Hantao Guo, Yucheng Lin, Xiaoqing Yang, Zang Li, Jieping Ye
- Metapath-free method; multi-task; self-attention. During message passing, neighboring nodes of different types are first transformed into the target node type's space and then aggregated within that space, so metapaths are no longer needed. One message-passing layer covers first-order relations such as A-A or P-A; longer metapaths require stacking multiple message-passing layers. A minimal sketch of the typed transformation is below.
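A minimal sketch of type-specific transformation followed by aggregation, not the paper's implementation; the node types, dimensions, and mean aggregation are assumptions:

```python
import torch
import torch.nn as nn

class TypedAggregator(nn.Module):
    def __init__(self, dims, target_dim):
        super().__init__()
        # one projection per source node type into the target type's space
        self.proj = nn.ModuleDict({t: nn.Linear(d, target_dim) for t, d in dims.items()})

    def forward(self, target_h, neighbors):
        # neighbors: list of (node_type, feature tensor) pairs for one target node
        msgs = [self.proj[t](h) for t, h in neighbors]
        return target_h + torch.stack(msgs).mean(dim=0)     # mean aggregation (assumption)

agg = TypedAggregator({"author": 8, "paper": 16}, target_dim=32)
out = agg(torch.zeros(32), [("author", torch.randn(8)), ("paper", torch.randn(16))])
```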
-
2019 - NIPS - Graph Transformer Networks
-
2019 - WWW - Heterogeneous Graph Attention Network
-
2019 - AAAI - Relation Structure-Aware Heterogeneous Information Network Embedding
- Yuanfu Lu, Chuan Shi, Linmei Hu, Zhiyuan Liu
-
2018 - CIKM - Are Meta-Paths Necessary? Revisiting Heterogeneous Graph Embeddings
-
2018 - WWW - Deep Collective Classification in Heterogeneous Information Networks
-
2018 - KDD - PME: Projected Metric Embedding on Heterogeneous Networks for Link Prediction
-
2017 - KDD - metapath2vec: Scalable Representation Learning for Heterogeneous Networks
-
2019 - CIKM - Relation-Aware Graph Convolutional Networks for Agent-Initiated Social E-Commerce Recommendation
- relation aware aggregator; metapath based receptive field sampler; co-attention fusion;
-
2019 - KDD - Metapath-guided Heterogeneous Graph Neural Network for Intent Recommendation
-
2019 - EMNLP - Heterogeneous Graph Attention Networks for Semi-supervised Short Text Classification
-
2017 - KDD - Meta-Graph Based Recommendation Fusion over Heterogeneous Information Networks
- Huan Zhao, Quanming Yao, Jianda Li, Yangqiu Song, Dik Lun Lee
-
2019 - AAAI - Cash-out User Detection based on Attributed Heterogeneous Information Network with a Hierarchical Attention Mechanism
- Binbin Hu, Zhiqiang Zhang, Chuan Shi, Jun Zhou, Xiaolong Li, Yuan Qi
-
2018 - KDD - Leveraging Meta-path based Context for Top- N Recommendation with A Neural Co-Attention Model
- Binbin Hu, Chuan Shi, Wayne Xin Zhao, Philip S. Yu
-
2018 - IJCAI - Aspect-Level Deep Collaborative Filtering via Heterogeneous Information Networks
- Xiaotian Han, Chuan Shi, Senzhang Wang, Philip S. Yu, Li Song
-
2018 - A Survey on Network Embedding
-
2018 - A Tutorial on Network Embeddings
-
2017 - IJCAI - TransNet : Translation-Based Network Representation Learning for Social Relation Extraction
- Cunchao Tu, Zhengyan Zhang, Maosong Sun
-
2019 - AAAI - TransConv: Relationship Embedding in Social Networks
-
2019 - ICLR - DEEP GRAPH INFOMAX
- Petar Veličković, William L. Hamilton, Yoshua Bengio, et al.
-
2018 - IJCAI - ANRL: Attributed Network Representation Learning via Deep Neural Networks
- Zhen Zhang, Hongxia Yang, Jiajun Bu, Sheng Zhou, Pinggang Yu, Jianwei Zhang, Martin Ester, Can Wang
- gated-graph-neural-network-samples
- Graph-neural-networks jupyter tutorial
- Deep Graph Library (DGL) Python package
- pitafall: gnn model collection
- pytorch_geometric
- Liaojunjie: gnn model collection
- node embedding from deepwalk to struc2vec
- spektral
- stellargraph including metapath2vec
- visualization of graphs: graph-tool
- analyze the graph spectrum: pyqsp
- Tsinghua University Graph papers reading list
- gnn literature
- MIA reading group
- awesome-network-embedding
- dynamic graph
- zhihu link for graph
- spatial-temporal graph
- Technische Universität München
- graph-adversarial-learning-literature
- 2020 - ICLR - CONTRASTIVE REPRESENTATION DISTILLATION
- 2020 - CVPR - Self-training with Noisy Student improves ImageNet classification
- Multi-view
- 2019 - KDD - Deep Bayesian Mining, Learning and Understanding
- Jen-Tzung Chien
- Bayesian Deep Learning code
- Bayesian Neural Networks code
- NIPS - Bayesian Deep Learning
-
2018 - NIPS - Recent Advances in Autoencoder-Based Representation Learning
- Michael Tschannen, Olivier Bachem, Mario Lucic
-
2017 - NIPS - Bayesian GAN
- Models the parameters of the GAN's discriminator and generator with distributions and optimizes them with Stochastic Gradient Hamiltonian Monte Carlo.
-
2014 - ICML - Stochastic Gradient Hamiltonian Monte Carlo
-
2019 - MICCAI - Uncertainty-aware Self-ensembling Model for Semi-supervised 3D Left Atrium Segmentation
-
2020 - AAAI - Uncertainty-Aware Multi-Shot Knowledge Distillation for Image-Based Object Re-Identification
-
2020 - CVPR - 3D Semi-Supervised Learning with Uncertainty-Aware Multi-View Co-Training
- In co-training, the labels predicted in each view are used as training samples for the other view, but only high-confidence samples should be selected; this paper uses epistemic uncertainty (over the model parameters) to select them.
-
2018 - CVPR - Multi-Task Learning Using Uncertainty to Weigh Losses for Scene Geometry and Semantics
- Alex Kendall, Yarin Gal, Roberto Cipolla
- Proposes using task-dependent (homoscedastic) uncertainty to characterize how noisy each task is: the larger a task's uncertainty, the smaller its weight in the multi-task loss. A sketch of the weighting is below.
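A minimal sketch of homoscedastic-uncertainty loss weighting, assuming regression-style task losses; learning log-variances is a common numerical-stability choice rather than something stated here:

```python
# L = sum_i L_i / (2 * sigma_i^2) + log(sigma_i), with sigma_i learned per task.
import torch
import torch.nn as nn

class UncertaintyWeighting(nn.Module):
    def __init__(self, num_tasks):
        super().__init__()
        # learn log(sigma^2) per task for numerical stability
        self.log_vars = nn.Parameter(torch.zeros(num_tasks))

    def forward(self, task_losses):
        losses = torch.stack(task_losses)
        precision = torch.exp(-self.log_vars)          # 1 / sigma^2
        return (0.5 * precision * losses + 0.5 * self.log_vars).sum()

weighting = UncertaintyWeighting(num_tasks=2)
loss = weighting([torch.tensor(1.3), torch.tensor(0.7)])
```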
-
2019 - CVPR - Striking the Right Balance with Uncertainty
-
2019 - Thesis - Uncertainty Quantification in Deep Learning
-
2017 - NIPS - What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?
-
2016 - ICML - Dropout as a Bayesian Approximation Representing Model Uncertainty in Deep Learning
-
2019 - ICCV - Probabilistic Face Embeddings
-
2019 - ICCV - Robust Person Re-identification by Modelling Feature Uncertainty
- Tianyuan Yu, Da Li, Yongxin Yang,Timothy Hospedales,Tao Xiang
-
2017 - ICML - On Calibration of Modern Neural Networks
-
2019 - NIPS - Variational Graph Convolutional Networks
-
2019 - ICML - Are Graph Neural Networks Miscalibrated?
- Leonardo Teixeira, Brian Jalaian, Bruno Ribeiro
- A model is calibrated when, in multi-class classification, the softmax probabilities match the frequency of correct predictions: if the model assigns 0.8 to a batch of predictions, then about 8 out of 10 of those predictions should be correct. A random classifier can be perfectly calibrated, so calibration is a metric orthogonal to accuracy. The paper examines established calibration methods, including MC dropout and temperature scaling, and shows that they are not always effective for GNNs, but it does not propose any new calibration technique for GNNs. A sketch of temperature scaling is below.
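A minimal sketch of temperature scaling (a standard post-hoc calibration method, not something proposed by this paper); the optimizer and step count are assumptions:

```python
# Fit a single temperature T on held-out validation logits, then rescale test logits.
import torch
import torch.nn.functional as F

def fit_temperature(logits, labels, steps=200, lr=0.01):
    # logits: [N, C] validation logits, labels: [N]
    log_t = torch.zeros(1, requires_grad=True)
    opt = torch.optim.Adam([log_t], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(logits / log_t.exp(), labels)
        loss.backward()
        opt.step()
    return log_t.exp().item()

logits = torch.randn(100, 5)
labels = torch.randint(0, 5, (100,))
T = fit_temperature(logits, labels)
calibrated_probs = F.softmax(logits / T, dim=-1)
```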
-
2019 - NIPS - Using Self-Supervised Learning Can Improve Model Robustness and Uncertainty
- Dan Hendrycks, Mantas Mazeika, Saurav Kadavath, Dawn Song
- On CV classification tasks, the paper adds a rotation-based self-supervised loss on top of the usual cross-entropy loss to improve robustness to noisy labels, corrupted inputs, adversarial examples, and out-of-distribution samples, gaining roughly 5 points on each task. The method is simple and effective and can be plugged into many CV classification pipelines. A minimal sketch is below.
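A minimal sketch of adding a rotation-prediction auxiliary loss to a classifier; the backbone, heads, and loss weight are assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RotAuxModel(nn.Module):
    def __init__(self, feat_dim=64, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(nn.Conv2d(3, feat_dim, 3, padding=1),
                                      nn.ReLU(), nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.cls_head = nn.Linear(feat_dim, num_classes)
        self.rot_head = nn.Linear(feat_dim, 4)        # predict rotation in {0, 90, 180, 270}

    def forward(self, x):
        z = self.backbone(x)
        return self.cls_head(z), self.rot_head(z)

def total_loss(model, x, y, lam=0.5):
    # build the 4 rotated copies and their rotation labels
    rots = torch.cat([torch.rot90(x, k, dims=(2, 3)) for k in range(4)], dim=0)
    rot_labels = torch.arange(4).repeat_interleave(x.size(0))
    logits, _ = model(x)
    _, rot_logits = model(rots)
    return F.cross_entropy(logits, y) + lam * F.cross_entropy(rot_logits, rot_labels)

model = RotAuxModel()
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
loss = total_loss(model, x, y)
```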
-
2019 - NIPS - Uncertainty posters
-
2019 - ICLR - Modeling Uncertainty with Hedged Instance Embedding
-
2019 - NIPS - Practical Deep Learning with Bayesian Principles
-
2018 - NIPS - Multimodal Generative Models for Scalable Weakly-Supervised Learning
-
2014 - NIPS - Generalized Product of Experts for Automatic and Principled Fusion of Gaussian Process Predictions
-
2018 - ECCV - CBAM Convolutional Block Attention Module
-
2020 - ICLR - CONTRASTIVE REPRESENTATION DISTILLATION
-
2017 - NIPS - Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles
-
2019 - NIPS - A Simple Baseline for Bayesian Uncertainty in Deep Learning
-
2020 - AISTATS - Confident Learning Estimating Uncertainty in Dataset Labels
-
PubMed Diabetes
- The Pubmed Diabetes dataset consists of 19717 scientific publications from the PubMed database, pertaining to diabetes, classified into one of three classes. The citation network consists of 44338 links. Each publication in the dataset is described by a TF/IDF weighted word vector from a dictionary of 500 unique words. The README file in the dataset provides more details.
- Download Link:
- Related Papers:
- Galileo Namata, et al. "Query-driven Active Surveying for Collective Classification." MLG. 2012.
-
Cora
- The Cora dataset consists of 2708 scientific publications classified into one of seven classes. The citation network consists of 5429 links. Each publication in the dataset is described by a 0/1-valued word vector indicating the absence/presence of the corresponding word from the dictionary, which consists of 1433 unique words. The README file in the dataset provides more details; a minimal loading sketch follows the related papers.
- Download Link:
- Related Papers:
- Qing Lu, and Lise Getoor. "Link-based classification." ICML, 2003.
- Prithviraj Sen, et al. "Collective classification in network data." AI Magazine, 2008.
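A minimal loading sketch, assuming the pytorch_geometric package listed above is installed; its built-in Planetoid loader downloads Cora automatically (the root path is an arbitrary choice):

```python
from torch_geometric.datasets import Planetoid

dataset = Planetoid(root="/tmp/Cora", name="Cora")
data = dataset[0]                                           # a single citation graph
print(data.num_nodes, data.num_edges, dataset.num_classes)  # 2708 nodes, 7 classes
```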
other useful datasets link:
- citation dataset
- IMDB Datasets
- MovieLens Latest Dataset, which consists of 33,000 movies. It contains four types of nodes: movie, director, actor, and actress, connected by two types of relations/links: a directed-by link and an actor/actress-starring link. Each movie is assigned a set of class labels indicating its genres. For each movie, a bag-of-words vector over its plot summary (1000 words) is extracted as local features.
- Download Link:
- Related Papers:
- T. Pham, et al. "Column networks for collective classification." In AAAI, 2017.
- Zhang, Yizhou et al. "Deep Collective Classification in Heterogeneous Information Networks" In WWW, 2018.
other useful dataset links