Collins Dictionary
Markov chain /ˈmɑːkɒf/
1. N a sequence of events, the probability of each of which is dependent only on the event immediately preceding it [statistics]
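This defining property (the chain's "memorylessness") has a standard formal statement; a minimal sketch in conventional notation, assuming a discrete-time chain with states X_1, X_2, ...:

% Markov property: the next state depends only on the current state,
% not on the earlier history of the chain.
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_1 = x_1) = P(X_{n+1} = x \mid X_n = x_n)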