RWKV: Reinventing RNNs for the transformer era

If you stepped into language modeling and Natural Language Processing (NLP) in the last three years, you are excused for being less familiar with Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks. Why? RNNs could not keep up with the unparalleled capabilities (pun intended) of Transformers and have since fallen out of favor as the go-to architecture in the modern deep le…
Yashodeep (@yasho)
Deputy Manager at Ashok Leyland, building AI/ML solutions for the automotive domain. Interested in interdisciplinary research and building open source tools.