Mar'kov chain
Pronunciation: (mär'kôf)
Statistics. a Markov process restricted to discrete random events or to discontinuous time sequences.
Random House Unabridged Dictionary, Copyright © 1997, by Random House, Inc., on Infoplease.
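For illustration only (not part of the dictionary entry): the "discrete" restriction is usually expressed through the Markov property for a chain observed at discrete time steps. In the standard notation assumed below, X_0, X_1, X_2, ... are the successive states and p_{ij} is the probability of moving from state i to state j:

    % Markov property: the next state depends only on the present state,
    % not on the earlier history of the chain.
    P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0)
        = P(X_{n+1} = j \mid X_n = i) = p_{ij}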