Linear Attention Recurrent Neural Network (LARNN)

A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN cell.
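
The sketch below illustrates the idea under stated assumptions; it is not the repository's exact implementation. It wraps a standard PyTorch `LSTMCell` so that, at each step, the current hidden state queries a sliding window of the cell's own past cell states through multi-head attention, and the readout is concatenated to the input. All names here (`WindowedAttentionLSTMCell`, `window_size`, `num_heads`) are illustrative, and the wiring of the attention readout is one plausible choice among several.

```python
# Hypothetical LARNN-style cell: an LSTMCell augmented with multi-head
# attention over a window of its own past cell states. A minimal sketch,
# not the LARNN repository's actual code.
import torch
import torch.nn as nn


class WindowedAttentionLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size, num_heads=4, window_size=16):
        super().__init__()
        self.window_size = window_size
        # Multi-head attention: the current hidden state is the query,
        # the windowed past cell states are the keys and values.
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        # The LSTM cell consumes the input concatenated with the attention readout.
        self.cell = nn.LSTMCell(input_size + hidden_size, hidden_size)

    def forward(self, x, state, past_cells):
        h, c = state
        if past_cells:
            # Stack the most recent cell states into (batch, window, hidden).
            memory = torch.stack(past_cells[-self.window_size:], dim=1)
            query = h.unsqueeze(1)  # (batch, 1, hidden)
            attended, _ = self.attn(query, memory, memory)
            attended = attended.squeeze(1)
        else:
            # No history yet at the first step: use a zero readout.
            attended = torch.zeros_like(h)
        h, c = self.cell(torch.cat([x, attended], dim=-1), (h, c))
        past_cells.append(c)
        return h, (h, c), past_cells
```

As with any other RNN cell, it can be driven by an explicit loop over time steps that carries the state along:

```python
batch, steps, input_size, hidden_size = 8, 20, 32, 64
cell = WindowedAttentionLSTMCell(input_size, hidden_size)
h = torch.zeros(batch, hidden_size)
c = torch.zeros(batch, hidden_size)
past_cells = []  # the attention window over past cell states
for t in range(steps):
    x_t = torch.randn(batch, input_size)  # placeholder input at step t
    out, (h, c), past_cells = cell(x_t, (h, c), past_cells)
```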
