Markoff chain n : a Markov process in which the parameter takes discrete time values [syn: {Markov chain}]
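A minimal sketch of the idea in the definition: a two-state chain stepped at discrete times, where the next state depends only on the current one. The state names and transition probabilities here are hypothetical, chosen only for illustration.

```python
import random

random.seed(0)  # fixed seed so the walk is reproducible

# Hypothetical transition table for states "A" and "B";
# each row's probabilities sum to 1.
transitions = {
    "A": [("A", 0.9), ("B", 0.1)],
    "B": [("A", 0.5), ("B", 0.5)],
}

def step(state):
    # The next state depends only on the current state (the Markov property).
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return transitions[state][-1][0]

state = "A"
path = [state]
for _ in range(5):  # discrete time steps t = 1..5
    state = step(state)
    path.append(state)
print(path)
```

Because the parameter (time) advances in discrete steps, `path` records one state per step, which is exactly the "discrete time values" the definition refers to.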
Copyright © 2024 3Dict.net