Merge branch 'master' of github.com:ixaxaar/pytorch-dnc into sparse
commit 9a6b97ad6b
@@ -2,6 +2,8 @@
[![Build Status](https://travis-ci.org/ixaxaar/pytorch-dnc.svg?branch=master)](https://travis-ci.org/ixaxaar/pytorch-dnc) [![PyPI version](https://badge.fury.io/py/dnc.svg)](https://badge.fury.io/py/dnc)
This is an implementation of [Differentiable Neural Computers](http://people.idsia.ch/~rupesh/rnnsymposium2016/slides/graves.pdf), described in the paper [Hybrid computing using a neural network with dynamic external memory, Graves et al.](https://www.nature.com/articles/nature20101)
and the Sparse version of the DNC (the SDNC) described in [Scaling Memory-Augmented Neural Networks with Sparse Reads and Writes](http://papers.nips.cc/paper/6298-scaling-memory-augmented-neural-networks-with-sparse-reads-and-writes.pdf).
@@ -36,6 +38,8 @@ Tested on `ubuntu 16.04`, `Arch / Manjaro` and `Fedora 27`.
Following are the constructor parameters:
| Argument | Default | Description |
| --- | --- | --- |
| input_size | `None` | Size of the input vectors |
@@ -207,8 +207,13 @@ if __name__ == '__main__':
last_save_losses.append(loss_value)
-if summarize:
+if summarize and rnn.debug:
    loss = np.mean(last_save_losses)
    # print(input_data)
    # print("1111111111111111111111111111111111111111111111")
    # print(target_output)
    # print('2222222222222222222222222222222222222222222222')
    # print(F.relu6(output))
    llprint("\n\tAvg. Logistic Loss: %.4f\n" % (loss))
    if np.isnan(loss):
        raise Exception('nan Loss')
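The hunk above averages the recently saved losses and aborts the run if the result has diverged to NaN. A minimal standalone sketch of that guard, with made-up loss values and a plain `print` standing in for the script's `llprint` helper:

```python
import numpy as np

# Hypothetical loss values standing in for the training loop's history.
last_save_losses = [0.31, 0.29, 0.28]

loss = np.mean(last_save_losses)
print("\tAvg. Logistic Loss: %.4f" % loss)

# Abort the run if the loss has diverged to NaN, as the script does.
if np.isnan(loss):
    raise Exception('nan Loss')
```

Raising on NaN early is cheaper than letting a diverged model keep training; the script applies the check only on summarization steps, so the mean is taken over the losses accumulated since the last summary.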