AISci.fans

RESEARCH AREA

Transformer Architecture

Self-attention-based sequence models that revolutionized NLP and beyond.
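The core operation behind these models is scaled dot-product self-attention, introduced in "Attention Is All You Need". A minimal NumPy sketch (illustrative only; function and variable names are my own, not from any particular library):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Computes softmax(Q K^T / sqrt(d_k)) V, the attention
    formula from "Attention Is All You Need"."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise similarities, scaled
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of value vectors

# Toy example: 3 tokens, model dimension 4.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)         # self-attention: Q = K = V = x
print(out.shape)  # (3, 4)
```

In self-attention, queries, keys, and values all come from the same sequence, so every token can attend to every other token in a single step.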

5,000+ papers · 1 researcher

Key Papers

Attention Is All You Need

NeurIPS 2017 · 100,000 citations

BERT: Pre-training of Deep Bidirectional Transformers

NAACL 2019 · 70,000 citations

An Image is Worth 16x16 Words: Transformers for Vision

ICLR 2021 · 28,000 citations
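The Vision Transformer paper's title refers to splitting an image into non-overlapping 16×16 patches, each flattened into a token the transformer can attend over. A hedged sketch of that patchification step (a minimal NumPy version written for illustration, not the paper's actual code):

```python
import numpy as np

def image_to_patches(img, patch=16):
    """Split an (H, W, C) image into flattened non-overlapping
    patch tokens, as in ViT's 16x16 patch embedding input."""
    H, W, C = img.shape
    assert H % patch == 0 and W % patch == 0, "image must divide evenly into patches"
    # Reshape into a grid of patches, then flatten each patch into one vector.
    grid = img.reshape(H // patch, patch, W // patch, patch, C)
    tokens = grid.transpose(0, 2, 1, 3, 4).reshape(-1, patch * patch * C)
    return tokens  # shape: (num_patches, patch * patch * C)

img = np.zeros((224, 224, 3))      # standard 224x224 RGB input
tokens = image_to_patches(img)
print(tokens.shape)  # (196, 768): 14x14 patches, each 16*16*3 values
```

In the full model these patch vectors are linearly projected, given position embeddings, and fed to a standard transformer encoder.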