In fact, irrespective of the initial state of the process, the state distribution always converges to [0.5714, 0.4286]. You can test this with other initial distributions, such as [0.2, 0.8] or [1, 0]: after 10 steps the distribution is again approximately [0.5714, 0.4286].
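As a quick check, here is a minimal sketch of this experiment. The transition matrix is not restated in this passage, so the matrix below is an assumption: a two-state matrix such as [[0.7, 0.3], [0.4, 0.6]], whose stationary distribution rounds to [0.5714, 0.4286]. Substitute the matrix from your own example.

```python
import numpy as np

# Assumed two-state transition matrix (illustrative); its stationary
# distribution rounds to [0.5714, 0.4286]. Replace with the matrix
# used in the surrounding example.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Propagate several initial distributions for 10 steps: pi_{t+1} = pi_t @ P
for pi0 in ([0.5714, 0.4286], [0.2, 0.8], [1.0, 0.0]):
    pi = np.array(pi0)
    for _ in range(10):
        pi = pi @ P
    print(pi0, "->", np.round(pi, 4))
```

With this assumed matrix, every starting distribution ends up at (approximately) the same vector after 10 steps, which is exactly the behavior described above.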
A Markov chain does not necessarily converge: a periodic chain oscillates forever, and a chain with several recurrent classes (for example, multiple absorbing states) can settle into different limits depending on where it starts. When the chain is irreducible and aperiodic, however, it converges to the same equilibrium distribution regardless of the starting distribution.
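To see a case that does not converge, consider the following illustrative (assumed, not from the example above) periodic two-state chain, where the distribution keeps flipping instead of settling:

```python
import numpy as np

# A periodic two-state chain: state 1 always moves to state 2 and vice versa.
# [0.5, 0.5] is a stationary distribution, but a distribution started
# elsewhere never converges to it -- it oscillates.
P_periodic = np.array([[0.0, 1.0],
                       [1.0, 0.0]])

pi = np.array([1.0, 0.0])
for step in range(4):
    pi = pi @ P_periodic
    print(step + 1, pi)   # alternates between [0, 1] and [1, 0]
```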