+1 vote
in Data Science
Describe Markov chains.

1 Answer

0 votes

Brilliant provides a great definition of Markov chains:

“A Markov chain is a mathematical system that experiences transitions from one state to another according to certain probabilistic rules. The defining characteristic of a Markov chain is that no matter how the process arrived at its present state, the possible future states are fixed. In other words, the probability of transitioning to any particular state is dependent solely on the current state and time elapsed.”

The actual math behind Markov chains requires knowledge of linear algebra and matrices, so I’ll leave some links below in case you want to explore this topic further on your own.
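To make the definition concrete, here is a minimal sketch of a Markov chain in Python. The two-state "weather" model, the state names, and the transition probabilities are all hypothetical choices for illustration; the key point is that the next state is sampled using only the current state (the Markov property), with no memory of earlier history.

```python
import random

# Hypothetical two-state weather model. Each row of the transition table
# gives the probability of moving from the current state to each next
# state; the probabilities in a row sum to 1.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    # Sample the next state using only the current state -- the chain
    # has no memory of how it arrived here (the Markov property).
    r = rng.random()
    cumulative = 0.0
    for state, p in transition[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # fallback for floating-point rounding at the boundary

def simulate(start, steps, seed=0):
    # Walk the chain for a fixed number of steps and return the path.
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", 5))  # a length-6 path of "sunny"/"rainy" states
```

Running many steps of such a simulation lets you estimate the chain's long-run (stationary) behavior empirically, which is the same quantity the linear-algebra treatment computes exactly from the transition matrix.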
