furukawa-ai/deeplearning_papers

Neural Message Passing for Quantum Chemistry


msrks commented


Neural networks work well when prior knowledge about the data is built into the architecture. For example:

  • CNN : translation invariance of objects within an image
  • RNN : shared structure across time steps

The idea: graphs, too, should work well if we build their invariances into the network.
How did they do it?

=> Using message passing from graphical models, they extract a stationary graph representation before feeding it into a DNN. This builds graph invariance into the network architecture as prior knowledge, and achieves state of the art.
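As a rough illustration of the message-passing idea described above, here is a minimal NumPy sketch of an MPNN-style forward pass (function and weight names are hypothetical, not from the paper; real implementations learn the weights and use richer message/update functions):

```python
import numpy as np

def mpnn_forward(h, edges, W_msg, W_upd, T=3):
    """One toy MPNN forward pass.

    h:      (N, d) initial node features
    edges:  list of undirected (src, dst) node-index pairs
    W_msg:  (d, d) message weight matrix (hypothetical)
    W_upd:  (2d, d) update weight matrix (hypothetical)
    T:      number of message-passing steps
    """
    for _ in range(T):
        # Message phase: each node sums transformed features of its neighbors.
        m = np.zeros_like(h)
        for s, t in edges:
            m[t] += h[s] @ W_msg  # message from s to t
            m[s] += h[t] @ W_msg  # undirected edge: message back from t to s
        # Update phase: combine each node's state with its aggregated message.
        h = np.maximum(0.0, np.concatenate([h, m], axis=1) @ W_upd)
    # Readout phase: a sum over nodes, so the graph-level vector is
    # invariant to the ordering of the nodes.
    return h.sum(axis=0)
```

Because both the neighbor sum and the final readout are symmetric aggregations, relabeling the nodes (and remapping the edge list accordingly) leaves the output unchanged, which is the invariance the post refers to.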

(I'm not confident I've read this paper correctly, so if anything is wrong, please leave a correction comment on this issue.)

https://arxiv.org/abs/1704.01212

Supervised learning on molecules has incredible potential to be useful in chemistry, drug discovery, and materials science. Luckily, several promising and closely related neural network models invariant to molecular symmetries have already been described in the literature. These models learn a message passing algorithm and aggregation procedure to compute a function of their entire input graph. At this point, the next step is to find a particularly effective variant of this general approach and apply it to chemical prediction benchmarks until we either solve them or reach the limits of the approach. In this paper, we reformulate existing models into a single common framework we call Message Passing Neural Networks (MPNNs) and explore additional novel variations within this framework. Using MPNNs we demonstrate state of the art results on an important molecular property prediction benchmark; these results are strong enough that we believe future work should focus on datasets with larger molecules or more accurate ground truth labels.