The Time Detectives

Transformer Architecture Published

June 12, 2017 · 21st Century
Technology · Mathematics

On June 12, 2017, researchers at Google Brain and Google Research published "Attention Is All You Need," introducing the transformer architecture. The paper replaced recurrent neural networks with a self-attention mechanism that lets a model attend to every position in a sequence at once, removing the sequential bottleneck of recurrence and dramatically improving training speed and scalability. Within five years the transformer underpinned GPT, BERT, Claude, Gemini, and nearly every other large language model, making it arguably the most consequential AI paper since backpropagation.
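The self-attention operation the paper introduces can be sketched in a few lines. This is a minimal illustration of scaled dot-product attention, softmax(QKᵀ/√d_k)·V; the sequence length and dimensions below are made-up toy values, not figures from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core operation from "Attention Is All You Need":
    softmax(Q K^T / sqrt(d_k)) V, computed for the whole sequence at once."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise similarities, shape (seq, seq)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax: each row sums to 1
    return weights @ V                               # each output is a weighted mix of all values

# Toy self-attention: 4 tokens, dimension 8 (illustrative sizes).
# In self-attention, queries, keys, and values all come from the same sequence.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

Because every token's output is computed from all positions in one matrix product, the whole sequence is processed in parallel; a recurrent network would instead step through the tokens one at a time.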

Key Figures

Ashish Vaswani · Noam Shazeer

Locations

Google Brain

Topics

Artificial Intelligence · Technology · Machine Learning · Innovation · Deep Learning · Neural Networks · Natural Language Processing

Connected Events — 3 Connections

Fulfilled core language intelligence ambitions of Dartmouth Conference — Artificial Intelligence Named as a Discipline
Summer 1956 · Technology · 20th Century
Extended and superseded deep learning approach of AlexNet Wins ImageNet Challenge
September 30, 2012 · Technology · 21st Century
Provided the architecture that directly underpins ChatGPT Launched
November 30, 2022 · Technology · 21st Century