Synthesizer-attention

A PyTorch implementation of Dense Synthesizer attention and Random Synthesizer attention.
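Both variants replace the query-key dot product of standard self-attention with synthesized attention weights: the Dense Synthesizer predicts each token's attention logits from the token itself via a small MLP, while the Random Synthesizer learns a fixed attention matrix that does not depend on the input at all. The sketch below illustrates the two mechanisms in single-head form; the class names, the `max_len` argument, and the single-head layout are assumptions for illustration and may not match this repository's actual interfaces.

```python
# Minimal single-head sketches of the two attention variants from [1].
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenseSynthesizerAttention(nn.Module):
    """Dense Synthesizer: attention logits are synthesized from each token
    alone by a two-layer MLP; there is no query-key dot product."""

    def __init__(self, dim, max_len):
        super().__init__()
        self.w1 = nn.Linear(dim, dim)      # first layer of F(X) = W2 ReLU(W1 X + b1) + b2
        self.w2 = nn.Linear(dim, max_len)  # maps each token to a row of attention logits
        self.value = nn.Linear(dim, dim)   # value projection G(X)

    def forward(self, x):
        # x: (batch, seq_len, dim), with seq_len <= max_len
        seq_len = x.size(1)
        logits = self.w2(F.relu(self.w1(x)))[:, :, :seq_len]  # (batch, seq_len, seq_len)
        attn = F.softmax(logits, dim=-1)
        return attn @ self.value(x)


class RandomSynthesizerAttention(nn.Module):
    """Random Synthesizer: the attention logits are a learned matrix R that is
    entirely independent of the input."""

    def __init__(self, dim, max_len):
        super().__init__()
        self.rand_attn = nn.Parameter(torch.randn(max_len, max_len))
        self.value = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (batch, seq_len, dim), with seq_len <= max_len
        seq_len = x.size(1)
        attn = F.softmax(self.rand_attn[:seq_len, :seq_len], dim=-1)
        return attn.unsqueeze(0) @ self.value(x)  # broadcast over the batch dimension


if __name__ == "__main__":
    x = torch.randn(2, 16, 64)  # (batch, seq_len, dim)
    dense = DenseSynthesizerAttention(dim=64, max_len=32)
    rand = RandomSynthesizerAttention(dim=64, max_len=32)
    print(dense(x).shape, rand(x).shape)  # both torch.Size([2, 16, 64])
```

Because neither variant compares token pairs, the synthesized logits are tied to a maximum sequence length (`max_len` above), which is why both modules slice the logit matrix down to the current sequence length.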

Reference

[1] Tay, Yi, et al. "Synthesizer: Rethinking Self-Attention in Transformer Models." arXiv preprint arXiv:2005.00743 (2020).
