- Keras Reinforcement Learning Projects
- Giuseppe Ciaburro
Transition diagram
A very intuitive alternative to describing a Markov chain through a transition matrix is to associate a directed graph (transition diagram) with the Markov chain, to which the following two statements apply:
- Vertices are labeled by the states S1, S2, …, Sn (or, briefly, by the indices 1, 2, …, n of the states)
- There is a directed edge connecting vertex Si to vertex Sj if and only if the probability of transition from Si to Sj is positive (this probability is in turn used as the label of the edge itself)
It is clear that the transition matrix and transition diagram provide the same information regarding the same Markov chain. To understand this duality, we can look at a simple example. Say that we have a Markov chain with three possible states—1, 2, and 3—and the following transition matrix:
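The matrix itself appears as a figure in the original text. As a stand-in, the following minimal sketch shows what such a transition matrix might look like in Python; the specific probability values are hypothetical, chosen only so that each row sums to 1:

```python
import numpy as np

# Hypothetical 3x3 transition matrix for states 1, 2, and 3.
# Entry P[i, j] is the probability of moving from state i+1 to state j+1.
# The values below are illustrative only, not the ones from the book's figure.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.0, 0.6, 0.4],
    [0.7, 0.0, 0.3],
])

# Every row is a probability distribution over the next state.
assert np.allclose(P.sum(axis=1), 1.0)
```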
The following diagram shows the transitions for the preceding Markov chain. In this diagram, there are three possible states (1, 2, and 3), and each directed edge between states is labeled with the corresponding transition probability pij. When there is no arrow from state i to state j, it means that pij = 0:
In the previous diagram, we can see that the probabilities on the arrows leaving a state always sum to exactly 1, just as the values in every row of the transition matrix must add up to exactly 1, since each row represents a probability distribution. From the comparison between the transition matrix and the transition diagram, it is possible to appreciate the duality between the two representations. As always, a diagram is much more illustrative.
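To make the row-sum property concrete, the short sketch below checks that every row of a transition matrix sums to 1 and then samples a few steps of the chain. The matrix values are the same hypothetical ones used above, not those from the figure:

```python
import numpy as np

# Hypothetical transition matrix (illustrative values only).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.0, 0.6, 0.4],
    [0.7, 0.0, 0.3],
])

# Each row is a probability distribution: its entries must sum to 1,
# just as the probabilities on the arrows leaving each state do.
print(P.sum(axis=1))  # -> [1. 1. 1.]

# Simulate a few transitions starting from state 1 (index 0).
rng = np.random.default_rng(0)
state = 0
path = [state + 1]
for _ in range(5):
    state = rng.choice(3, p=P[state])  # next state drawn from the current row
    path.append(state + 1)
print(path)  # a random walk through the states, e.g. starting at 1
```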