DEV Community • 2026-04-11 10:36
What an LLM Actually Does
Pretraining, Prompting, Sampling, and Alignment
By the end of this post, you'll understand what an LLM actually learns during pretraining (ontologies, math, pronoun resolution, all of it) and why all of this emerges from nothing more than predicting the next word. You'll know the three architectural families of LLMs (decoder-only, encoder-only, encoder-decoder) and when each one fits the job. ...
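To make "predicting the next word" concrete before we dig in, here is a minimal sketch of the core loop: the model assigns a score (logit) to every token in its vocabulary, a softmax turns those scores into probabilities, and one token is chosen. The vocabulary and logit values below are invented purely for illustration; a real model has tens of thousands of tokens and learned scores.

```python
import math
import random

# Hypothetical scores for completing "The cat sat on the ___".
vocab = ["mat", "moon", "dog", "sat"]
logits = [3.2, 1.1, 0.3, 2.5]

def softmax(scores, temperature=1.0):
    """Turn logits into a probability distribution; lower temperature sharpens it."""
    scaled = [s / temperature for s in scores]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)
# Greedy decoding: always pick the highest-probability token.
greedy = vocab[probs.index(max(probs))]
# Sampling: draw a token in proportion to its probability.
sampled = random.choices(vocab, weights=probs)[0]
print(greedy, dict(zip(vocab, [round(p, 3) for p in probs])))
```

Everything an LLM produces comes from repeating this one step, token by token; the rest of the post unpacks how the scores themselves are learned.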