PyTorch code for our AAAI 2020 paper "Path Ranking with Attention to Type Hierarchies"

Path Ranking with Attention to Type Hierarchies

This repo contains code for training and testing the models proposed in Path Ranking with Attention to Type Hierarchies. Because of its size, the data must be downloaded separately from Dropbox.

Notes

  1. Code for the baseline models in the paper can be found here (PRA and SFE) and here (Path-RNN).
  2. We provide tokenized data for WN18RR and FB15k-237. Our data format follows ChainsofReasoning. The vocabularies used for tokenizing the data are also provided for reference.
  3. Raw data for WN18RR and FB15k-237 can be found here. Types for WN18RR entities can be obtained from WordNet. Types for FB15k-237 entities can be found here.

Tested platform

  • Hardware: 64 GB RAM, 12 GB GPU memory
  • Software: Ubuntu 16.04, Python 3.5, CUDA 8

Setup

  1. Install CUDA.
  2. (Optional) Set up a Python virtual environment by running `virtualenv -p python3 .`
  3. (Optional) Activate the virtual environment by running `source bin/activate`
  4. Install PyTorch with CUDA support.
  5. Install the requirements by running `pip3 install -r requirements.txt` (a quick sanity check follows this list).
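
After step 4, it is worth confirming that PyTorch can actually see the GPU before training. This is a generic PyTorch check, not part of this repo:

```python
# Generic PyTorch/CUDA sanity check (not part of this repo).
import torch

print(torch.__version__)                  # installed PyTorch version
print(torch.cuda.is_available())          # True if the CUDA install is usable
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # name of the first visible GPU
```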

Instructions for running the code

Data

  1. Download the compressed data file from Dropbox.
  2. Unzip the file in the root directory of this repo.

Run the model

  1. Use run.py to train and test the model on WN18RR or FB15k-237.
  2. Use main/playground/model2/CompositionalVectorSpaceAlgorithm.py to modify the training settings and hyperparameters.
  3. Use main/playground/model2/CompositionalVectorSpaceModel.py to modify the network design. Different attention methods for types and paths can be selected here (a toy sketch of type attention follows this list).
  4. Training progress can be monitored with tensorboardX by running `tensorboard --logdir runs` (see the logging snippet below). Tutorials and details can be found here.
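
For orientation, below is a minimal, self-contained sketch of attention over an entity's type embeddings, in the spirit of the type attention the model exposes. The module name, dimensions, and usage are illustrative assumptions, not the repo's actual API:

```python
# Minimal sketch of attention pooling over an entity's type embeddings.
# Illustrative only: names and dimensions are hypothetical, not this repo's API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TypeAttention(nn.Module):
    """Pools a set of type embeddings into one vector with learned attention."""
    def __init__(self, num_types, dim):
        super().__init__()
        self.type_emb = nn.Embedding(num_types, dim)
        self.query = nn.Linear(dim, 1, bias=False)  # scores each type embedding

    def forward(self, type_ids):
        # type_ids: (batch, types_per_entity) indices into the type vocabulary
        embs = self.type_emb(type_ids)                # (batch, T, dim)
        scores = self.query(embs).squeeze(-1)         # (batch, T)
        weights = F.softmax(scores, dim=-1)           # attention over the entity's types
        return (weights.unsqueeze(-1) * embs).sum(1)  # (batch, dim) pooled type vector

# Toy usage: 4 entities, 3 type ids each
att = TypeAttention(num_types=50, dim=16)
pooled = att(torch.randint(0, 50, (4, 3)))
print(pooled.shape)  # torch.Size([4, 16])
```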
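
tensorboardX writes event files under runs/ by default, which is what `tensorboard --logdir runs` picks up. Here is a minimal logging sketch; the tag name and values are placeholders, not what the training code actually logs:

```python
# Minimal tensorboardX logging example (tag and values are placeholders).
from tensorboardX import SummaryWriter

writer = SummaryWriter()          # writes event files under ./runs by default
for step in range(100):
    loss = 1.0 / (step + 1)       # placeholder standing in for the training loss
    writer.add_scalar('train/loss', loss, step)
writer.close()
```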