📚 Essential · 75 min total

The 7 Papers Every AI PM Must Know

If you only read 7 papers in your life, make it these. The foundations of modern AI — architecture, scale, alignment, and efficiency.

7 papers
1
Attention Is All You Need

Ashish Vaswani et al.

Pro · ~11 min

Transformers revolutionize AI by replacing recurrence and convolutions with attention, unlocking massively parallel training.

Why this paper

The paper that changed everything. No other architecture paper has had more commercial impact.

Architecture · Scaling · Read paper
2
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Jacob Devlin et al.

BERT revolutionizes NLP by learning context from both directions at once, improving accuracy across key benchmarks.

Why this paper

Bidirectional understanding — the approach behind Google Search's biggest upgrade in decades.

Architecture · Training · Read paper
3

GPT-3 — coming soon

The commercial turning point. Evidence that scale alone produces emergent capabilities.

4
Scaling Laws for Neural Language Models

Pro · ~11 min

Larger language models are more sample-efficient, reaching better results from less data at a fixed compute budget.

Why this paper

The scientific foundation for why OpenAI, Google, and Anthropic keep building bigger models.

Scaling · Training · Read paper
5

InstructGPT — coming soon

RLHF in practice — the exact technique that turned GPT-3 into ChatGPT.

6
LoRA: Low-Rank Adaptation of Large Language Models

LoRA cuts trainable parameters by up to 10,000x and GPU memory requirements by 3x while preserving quality when fine-tuning large language models.

Why this paper

The efficiency breakthrough that made custom AI models accessible to any team.

Efficiency · Training · Read paper
7
Llama 2: Open Foundation and Fine-Tuned Chat Models

Llama 2's chat models outperform other open-source models and rival closed-source systems on helpfulness and safety.

Why this paper

Open source changed the competitive landscape. Llama 2 forced every lab to rethink its strategy.

Open Source · Safety · Read paper

Unlock the full analysis for each paper

Deep-dive articles, expert annotations, PM action plans, and interactive experiments — all for $6/mo.

Go Pro — $6/mo