How it works: A Markov chain predicts the next word based only on the previous n words (the "order"). With order 1, each word depends only on the single word before it: chaotic but creative. With order 4, the output nearly copies the original text. This is the simplest form of language generation; the same core idea of next-word prediction, implemented with neural networks and scaled to billions of parameters, underlies models like GPT and Claude. The graph shows transition states as nodes and transition probabilities as edges. Hover over a node to see where that state leads.
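The idea above can be sketched in a few lines of Python. This is a minimal illustration, not the code behind this page: `build_chain` maps each n-word state to the words that follow it in the training text, and `generate` walks those transitions by repeated random sampling. The function names and the sample sentence are invented for the example.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each state (a tuple of `order` words) to the list of words
    that follow it in the text. Duplicates encode probability: a word
    appearing twice in the list is twice as likely to be sampled."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        chain[state].append(words[i + order])
    return chain

def generate(chain, order, length=10, seed=None):
    """Start from a random state, then repeatedly sample one of the
    recorded followers of the last `order` words. Stops early if the
    walk reaches a state with no known followers."""
    rng = random.Random(seed)
    state = rng.choice(list(chain.keys()))
    out = list(state)
    for _ in range(length):
        followers = chain.get(tuple(out[-order:]))
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

text = "the cat sat on the mat and the cat ran"
chain = build_chain(text, order=1)
print(generate(chain, order=1, length=8, seed=42))
```

With order 1, the state ("the",) maps to ["cat", "mat", "cat"], so "cat" is sampled twice as often as "mat" after "the". Raising `order` makes states rarer and more specific, which is why high-order chains mostly reproduce the source text.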