GPT-3 — Language Models Scale Up
OpenAI released GPT-3, a language model with 175 billion parameters that could write essays, code, and poetry — demonstrating that scale alone could produce emergent capabilities.
In June 2020, OpenAI released GPT-3 (Generative Pre-trained Transformer 3), a language model with 175 billion parameters — 100 times larger than its predecessor GPT-2. Trained on a vast corpus of internet text, GPT-3 demonstrated remarkable "few-shot" abilities: given just a few examples in a prompt, it could write essays, generate code, translate languages, answer questions, and even compose poetry. The model revealed that scaling neural networks produced emergent capabilities — behaviors not explicitly programmed but arising from the interaction of billions of learned parameters. GPT-3 was initially available only through an API, and its outputs amazed and unsettled the tech community in equal measure. It proved that the path to more capable AI might simply be "more compute, more data, more parameters."
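The "few-shot" behavior described above amounts to placing a handful of worked examples in the prompt and letting the model continue the pattern. The sketch below shows what such a prompt looks like; the translation task, the example pairs, and the `build_few_shot_prompt` helper are illustrative assumptions for this article, not actual GPT-3 inputs or outputs.

```python
# A minimal sketch of a few-shot prompt: a handful of worked examples
# followed by a new query. The model is expected to infer the task
# (here, English -> French translation) from the pattern alone.
# Task and examples are illustrative, not real GPT-3 output.

def build_few_shot_prompt(examples, query):
    """Format (input, output) example pairs plus a new query
    into a single prompt string for a language model."""
    lines = []
    for source, target in examples:
        lines.append(f"English: {source}")
        lines.append(f"French: {target}")
        lines.append("")  # blank line between examples
    lines.append(f"English: {query}")
    lines.append("French:")  # the model continues from here
    return "\n".join(lines)

examples = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
]
prompt = build_few_shot_prompt(examples, "peppermint")
print(prompt)
```

No weights are updated here; the "learning" happens entirely within the forward pass over the prompt, which is what made the behavior surprising at the time.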
More in Science & Technology
Cambrian Period
The Cambrian explosion saw the rapid diversification of most major animal phyla, fundamentally transforming life on Earth.
~570 BCE: Birth of Pythagoras
Pythagoras, the Greek mathematician and philosopher, was born on the island of Samos.
~340 BCE: Aristotle's Contributions to Science and Philosophy
Aristotle made groundbreaking arguments for a spherical Earth and laid the foundations of logic, science, and Western philosophy.
c. 300 BCE: Library of Alexandria Founded
The most famous library of the ancient world was established in Egypt.