markoff chain n : a Markov process whose parameter takes discrete time values. Synonym(s): Markov chain, Markoff chain
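The defining property of such a chain is that the next state depends only on the current state, with transitions occurring at discrete time steps. A minimal Python sketch (the weather states and transition probabilities are purely illustrative assumptions, not from the entry):

```python
import random

def simulate_markov_chain(transition, start, steps, rng=None):
    """Simulate a discrete-time Markov chain.

    transition: dict mapping each state to a list of
                (next_state, probability) pairs summing to 1.
    Returns the list of visited states, including the start.
    """
    rng = rng or random.Random()
    state = start
    path = [state]
    for _ in range(steps):
        r = rng.random()
        cumulative = 0.0
        # Pick the next state by inverse-CDF sampling over the row.
        for nxt, p in transition[state]:
            cumulative += p
            if r < cumulative:
                state = nxt
                break
        path.append(state)
    return path

# Hypothetical two-state model for illustration only.
weather = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}
path = simulate_markov_chain(weather, "sunny", 10, random.Random(0))
```

Because the parameter (time) is discrete, the process advances in integer steps; a Markov process with a continuous time parameter would instead be governed by transition rates.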