Markov chain

Noun

 * 1)  A discrete-time stochastic process with the Markov property.
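
The Markov property means the next state depends only on the current state, not on the full history. As an illustrative sketch (not part of the dictionary entry; the states and transition probabilities below are invented assumptions), a two-state discrete-time Markov chain can be simulated like this:

```python
import random

# Hypothetical transition probabilities for a two-state weather chain.
# Each row sums to 1: given the current state, these are the
# probabilities of each possible next state.
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Pick the next state using only the current state (the Markov property)."""
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def walk(start, n, seed=0):
    """Return a trajectory of n transitions starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    state = start
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return path
```

Because the chain is memoryless, `step` never inspects the path so far; the current state alone determines the distribution of the next one.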

Translations

 * Arabic: سِلْسِلَةُ مَارْكُوف
 * Chinese (Mandarin): 馬爾可夫鏈
 * Czech: Markovův řetězec
 * Estonian: Markovi ahel
 * Finnish: Markovin ketju
 * German: Markow-Kette
 * Icelandic: Markov-keðja
 * Japanese: マルコフ連鎖
 * Polish: łańcuch Markowa
 * Russian: цепь Ма́ркова
 * Spanish: cadena de Márkov
 * Swedish: Markovkedja
 * Turkish: Markov zinciri