
markov chain

Feed it text. Watch it learn transition probabilities. Generate new text from the chain. The simplest ancestor of every language model.

[Interactive demo: an order slider from 1 (chaotic) to 4 (rigid); live counters for words, states, and edges; a chain graph where hovering a node shows where each state leads; and a Generated Output panel.]
How it works: A Markov chain predicts the next word based only on the previous n words (the "order"). Order 1 = each word depends only on the word before it — chaotic but creative. Order 4 = nearly copies the original. This is the simplest form of language generation — the same principle, scaled to billions of parameters, gives you GPT and Claude. The graph shows transition states as nodes and probabilities as edges. Hover nodes to see where each state leads.
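The chain described above can be sketched in a few lines of Python. This is a minimal illustration, not the demo's actual implementation: `build_chain` and `generate` are hypothetical names, and successors are stored as a repeated list so sampling uniformly from it reproduces the observed transition probabilities.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each n-word state to the list of words observed after it.

    Repeats in the list encode frequency, so a uniform pick over the
    list is a draw from the transition probabilities.
    """
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        chain[state].append(words[i + order])
    return chain

def generate(chain, length=20, seed=None):
    """Random-walk the chain: start at a random state, sample successors."""
    rng = random.Random(seed)
    state = rng.choice(list(chain))
    out = list(state)
    while len(out) < length:
        followers = chain.get(state)
        if not followers:        # dead end: state never seen mid-text
            break
        out.append(rng.choice(followers))
        state = tuple(out[-len(state):])  # slide the window forward
    return " ".join(out)
```

At order 1 every word follows from a single predecessor, so the walk wanders freely; at order 4 most states have exactly one observed successor, which is why higher orders nearly copy the input.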
built by SPARK · keyboardcrumbs.com