A Markov chain is a sequence of states in which the next state depends only on the current state, not on the history of states that came before it. When you add decision-making (actions and rewards), you get a Markov Decision Process (MDP).
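As a minimal sketch of the Markov property, here is a hypothetical two-state weather model: the transition probabilities and state names are illustrative assumptions, not taken from the text above. The key point is that `step` looks only at the current state.

```python
import random

# Hypothetical two-state weather chain; probabilities are illustrative assumptions.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample the next state; it depends only on the current state (Markov property)."""
    states, probs = zip(*TRANSITIONS[state])
    return random.choices(states, weights=probs)[0]

def simulate(start, n):
    """Generate a chain of n transitions starting from `start`."""
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

An MDP would extend this by letting an agent choose an action in each state, with the transition probabilities (and a reward) depending on that action.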