discrete-time Markov chain

Noun

 * 1)  A sequence of random variables, known as a stochastic process, in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past.
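
The Markov property in the definition can be illustrated with a small simulation sketch; the two-state "weather" chain and its transition probabilities below are hypothetical examples, not part of the definition:

```python
import random

# Hypothetical two-state chain: each row gives the distribution
# over the next state, conditioned only on the current state.
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Draw the next state using only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    nxt = state
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Generate a path of n transitions starting from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(simulate("sunny", 5))
```

Note that `step` receives only the current state; the rest of the path is never consulted, which is exactly the "depends only on the current variable" condition.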