Archive: 2018/10
Inductive Representation Learning on Large Graphs
This is a NIPS 2017 paper that proposes a model called GraphSAGE for graph representation learning. It starts from a problem with prior work: "However, most existing approaches require that all nodes in the graph are present during training of the embeddings." The authors call such models transductive, whereas the model they propose is inductive: rather than learning a fixed embedding per node, it learns aggregation functions that produce a node's embedding from its features and a sampled neighborhood, so it can embed nodes unseen during training.
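A minimal sketch of the inductive idea with a mean aggregator (variable names, shapes, and the toy graph are my own assumptions, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_aggregate(h, neighbors, W_self, W_neigh):
    """One GraphSAGE-style layer: combine each node's own features
    with the mean of its (sampled) neighbors' features, then apply ReLU."""
    out = np.zeros((h.shape[0], W_self.shape[1]))
    for v, nbrs in enumerate(neighbors):
        neigh_mean = h[nbrs].mean(axis=0) if nbrs else np.zeros(h.shape[1])
        out[v] = np.maximum(h[v] @ W_self + neigh_mean @ W_neigh, 0)
    return out

# toy graph: 3 nodes, input feature dim 4, hidden dim 2
h = rng.normal(size=(3, 4))
neighbors = [[1, 2], [0], [0]]          # adjacency as neighbor lists
W_self = rng.normal(size=(4, 2))
W_neigh = rng.normal(size=(4, 2))
z = mean_aggregate(h, neighbors, W_self, W_neigh)
print(z.shape)  # (3, 2)
```

Because the layer only needs a node's features and neighbor lists at inference time, the same learned weights apply to nodes that were never seen during training.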
Incorporating Loose-Structured Knowledge into LSTM with Recall Gate for Conversation Modeling
Introduction: the paper uses a Recall gate to integrate knowledge into an LSTM. The model consists of sentence modeling, knowledge triggering, and conversation modeling components. The idea looks fairly simple: first map each knowledge pair into a dense vector, then concatenate it directly into the LSTM cell.
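A sketch of the "concatenate a knowledge vector into the cell" idea, simplified to feeding a dense knowledge vector k alongside the usual gate inputs (this is my own simplification for illustration, not the paper's exact Recall gate formulation; parameter names are assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step_with_knowledge(x, h_prev, c_prev, k, params):
    """One LSTM step where a dense knowledge vector k is concatenated
    onto the usual [h_prev, x] input seen by every gate."""
    z = np.concatenate([h_prev, x, k])      # knowledge enters the cell here
    i = sigmoid(params["Wi"] @ z + params["bi"])
    f = sigmoid(params["Wf"] @ z + params["bf"])
    o = sigmoid(params["Wo"] @ z + params["bo"])
    g = np.tanh(params["Wg"] @ z + params["bg"])
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

d_h, d_x, d_k = 3, 4, 2
rng = np.random.default_rng(1)
params = {f"W{g}": rng.normal(size=(d_h, d_h + d_x + d_k)) for g in "ifog"}
params.update({f"b{g}": np.zeros(d_h) for g in "ifog"})
h, c = lstm_step_with_knowledge(rng.normal(size=d_x), np.zeros(d_h),
                                np.zeros(d_h), rng.normal(size=d_k), params)
print(h.shape)  # (3,)
```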
Monte Carlo Methods
Sampling and Monte Carlo Methods. Why Sampling: Monte Carlo methods approximate integrals or other target quantities via random sampling; even when a quantity could be computed exactly, sampling can speed up the computation, e.g. estimating the training loss with a minibatch. Beyond that, we sometimes need to estimate quantities that are intractable to compute directly. Basics of Monte Carlo Sampling: $$s = \sum_x p(x) f(x) = \mathbb{E}_p[f(x)] \approx \hat{s}_n = \frac{1}{n}\sum_{i=1}^{n} f(x^{(i)}), \quad x^{(i)} \sim p$$
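The minibatch loss example above is exactly this estimator: average f over n random draws instead of over the whole population. A small sketch (the data here is synthetic, just to show the error shrinking behavior):

```python
import numpy as np

rng = np.random.default_rng(0)

# "population": per-example losses over the full training set
losses = rng.exponential(scale=1.0, size=100_000)
true_mean = losses.mean()

# Monte Carlo estimate: average over a random minibatch of 256 examples
batch = rng.choice(losses, size=256, replace=False)
estimate = batch.mean()

# the estimator is unbiased and its standard error shrinks like 1/sqrt(n)
print(abs(estimate - true_mean))
```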
FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling
Introduction: "Borrowing the concept of a convolution filter for image pixels or a linear array of signals, GCN uses the connectivity structure of the graph as the filter to perform neighborhood mixing."
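The importance-sampling idea in the title can be sketched as follows: instead of mixing over all nodes in a layer, sample a few nodes u with probability q(u) and reweight their contributions by 1/(n·q(u)), giving an unbiased estimate of the pre-activation. This is a simplified sketch of the idea with a uniform proposal, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def fastgcn_layer_estimate(A_hat, H, W, q, n_samples):
    """Importance-sampling estimate of one GCN layer ReLU(A_hat @ H @ W):
    sample n_samples nodes u with probability q[u] and reweight each
    contribution A_hat[:, u] * H[u] by 1 / (n_samples * q[u])."""
    N = A_hat.shape[0]
    idx = rng.choice(N, size=n_samples, p=q)
    scale = 1.0 / (n_samples * q[idx])
    AH_est = (A_hat[:, idx] * scale) @ H[idx]
    return np.maximum(AH_est @ W, 0)

N, d_in, d_out = 50, 8, 4
A_hat = rng.random((N, N))
A_hat /= A_hat.sum(axis=1, keepdims=True)   # row-normalized "filter"
H = rng.normal(size=(N, d_in))
W = rng.normal(size=(d_in, d_out))
q = np.full(N, 1.0 / N)                     # uniform proposal for simplicity

exact = np.maximum(A_hat @ H @ W, 0)
approx = fastgcn_layer_estimate(A_hat, H, W, q, n_samples=2000)
rel_err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
print(rel_err)
```

The cost per layer now depends on the number of sampled nodes rather than the full neighborhood, which is the source of FastGCN's speedup.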