Integration of progressive learning (for incremental learning) with a multi-headed self-attentive LSTM in an encoder-decoder architecture. The LSTM can be replaced with a GRU, or the architecture can be used with multi-head attention and progressive learning alone. The LSTM layer is available open source in Keras, and the proposed architecture is built on the Keras framework (an open-source deep learning framework).
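A minimal sketch of such an encoder-decoder in Keras: an LSTM encoder whose states pass through multi-head self-attention, and an LSTM decoder that cross-attends over the attended encoder context. All dimensions (vocabulary size, model width, head count, sequence lengths) are illustrative assumptions, not values from this repository, and progressive learning (freezing trained columns before adding new ones) is omitted for brevity.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical hyperparameters chosen only for illustration.
VOCAB, D_MODEL, HEADS, T_IN, T_OUT = 100, 64, 4, 10, 10

# Encoder: embedding -> LSTM (returning sequences and states) ->
# multi-head self-attention over the encoder hidden states.
enc_in = layers.Input(shape=(T_IN,), name="encoder_tokens")
enc_emb = layers.Embedding(VOCAB, D_MODEL)(enc_in)
enc_seq, state_h, state_c = layers.LSTM(
    D_MODEL, return_sequences=True, return_state=True)(enc_emb)
enc_ctx = layers.MultiHeadAttention(
    num_heads=HEADS, key_dim=D_MODEL // HEADS)(
    enc_seq, enc_seq)  # self-attention: query = value = encoder states

# Decoder: LSTM initialised with the encoder's final states,
# then multi-head cross-attention over the attended encoder context.
dec_in = layers.Input(shape=(T_OUT,), name="decoder_tokens")
dec_emb = layers.Embedding(VOCAB, D_MODEL)(dec_in)
dec_seq = layers.LSTM(D_MODEL, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c])
dec_ctx = layers.MultiHeadAttention(
    num_heads=HEADS, key_dim=D_MODEL // HEADS)(
    dec_seq, enc_ctx)  # cross-attention: query = decoder, value = encoder
out = layers.Dense(VOCAB, activation="softmax")(dec_ctx)

model = Model([enc_in, dec_in], out, name="mha_lstm_encoder_decoder")
```

Swapping the LSTM for a GRU only changes the state handling (`layers.GRU` returns a single state instead of `[state_h, state_c]`); a progressive-learning step would freeze these layers (`layer.trainable = False`) and attach a new column alongside them.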
- Saran Karthikeyan, Andreas Wendemuth, Ingo Siegert, Marcus Petersen: "Building of Conversational Agents Using Goal-Oriented Dialog Managers," Otto-von-Guericke-Universität Magdeburg / regiocom SE, Magdeburg, Germany, March 2020 (https://github.com/Naras-KS/Konversa)
- Ben Saunders, Necati Cihan Camgoz, Richard Bowden: "Progressive Transformers for End-to-End Sign Language Production," arXiv preprint arXiv:2004.14874, July 2020