Markov chain

Named after the Russian mathematician Andrey Markov (Андрей Марков, 1856–1922).

Noun

Markov chain (plural Markov chains)

  1. (mathematics, probability theory) A stochastic process in which the probability of transitioning to a future state depends only on the current state, not on the sequence of events that preceded it.

    In a simple weather Markov chain, if it's sunny today, there might be:

    • 70% chance of sun tomorrow
    • 20% chance of clouds
    • 10% chance of rain

    regardless of the weather on previous days.
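
    A minimal Python sketch of such a chain, assuming hypothetical probabilities for the cloudy and rainy rows (only the sunny row comes from the example above):

      import random

      # Transition table for the example above; only the "sunny" row comes from
      # the entry itself, the "cloudy" and "rainy" rows are made-up placeholders.
      TRANSITIONS = {
          "sunny":  {"sunny": 0.7, "cloudy": 0.2, "rainy": 0.1},
          "cloudy": {"sunny": 0.3, "cloudy": 0.4, "rainy": 0.3},
          "rainy":  {"sunny": 0.2, "cloudy": 0.4, "rainy": 0.4},
      }

      def next_state(current):
          # Tomorrow's state depends only on today's state (the Markov property).
          states = list(TRANSITIONS[current])
          weights = [TRANSITIONS[current][s] for s in states]
          return random.choices(states, weights=weights)[0]

      def simulate(start="sunny", days=7):
          # Walk the chain for `days` steps and return the visited states.
          state, history = start, [start]
          for _ in range(days):
              state = next_state(state)
              history.append(state)
          return history

      print(simulate())  # e.g. ['sunny', 'sunny', 'cloudy', 'rainy', ...]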

  2. (computing) A statistical model used in machine learning and artificial intelligence to generate sequences of words or other tokens, in which each new token is chosen according to the likelihood that it follows the current one.

    Text generated by a Markov chain trained on Shakespeare:

    "To sleep: perchance to dream, ay, there's the rub that makes calamity of noble mind."

Related terms