Incremental Training of a Recurrent Neural Network Exploiting a Multi-Scale Dynamic Memory

This repository contains the MSLMN code for the IAM-OnDB experiments and for incremental training. The codebase implements our hierarchical recurrent neural network, which is trained incrementally: the architecture is dynamically expanded during training to capture longer dependencies, and each new module is pretrained to maximize its memory capacity. A rough sketch of this expansion scheme is given below.
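As a minimal illustration of the idea (not the paper's actual implementation), the sketch below assumes a stack of PyTorch `RNNCell` modules updated at exponentially slower rates, plus a hypothetical `add_scale` helper that appends a new, slower module during training. All class and method names here are illustrative.

```python
import torch
import torch.nn as nn

class MultiScaleMemorySketch(nn.Module):
    """Illustrative multi-scale dynamic memory: a stack of recurrent
    modules, each updated at a slower rate, that can be expanded
    during training. A sketch only, not the paper's implementation."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Start with a single fast module; more are added incrementally.
        self.cells = nn.ModuleList([nn.RNNCell(input_size, hidden_size)])

    def add_scale(self):
        # Dynamically expand the memory: the new module reads the state
        # of the previous (faster) one, so it can capture longer
        # dependencies. In the paper, each new module is pretrained to
        # maximize its memory capacity before joint training resumes.
        self.cells.append(nn.RNNCell(self.hidden_size, self.hidden_size))

    def forward(self, x):
        # x: (seq_len, batch, input_size)
        seq_len, batch, _ = x.shape
        states = [x.new_zeros(batch, self.hidden_size) for _ in self.cells]
        for t in range(seq_len):
            inp = x[t]
            for i, cell in enumerate(self.cells):
                # Module i updates every 2**i steps: higher modules
                # operate at coarser time scales.
                if t % (2 ** i) == 0:
                    states[i] = cell(inp, states[i])
                inp = states[i]
        # Concatenate all scales into a single representation.
        return torch.cat(states, dim=-1)

if __name__ == "__main__":
    mem = MultiScaleMemorySketch(input_size=8, hidden_size=16)
    x = torch.randn(50, 4, 8)   # (seq_len, batch, features)
    print(mem(x).shape)         # torch.Size([4, 16])
    mem.add_scale()             # expand with a slower module
    print(mem(x).shape)         # torch.Size([4, 32])
```

The expansion doubling (`2 ** i`) is one simple choice of time-scale hierarchy; the actual update schedule and pretraining procedure are described in the paper.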

References

This code accompanies our paper published at ECML/PKDD 2020: https://arxiv.org/abs/2006.16800

If you find this work useful, consider citing:

@inproceedings{carta2020incremental,
  title={Incremental Training of a Recurrent Neural Network Exploiting a Multi-Scale Dynamic Memory},
  author={Antonio Carta and Alessandro Sperduti and Davide Bacciu},
  booktitle={ECML/PKDD},
  year={2020}
}
