
Talk:Markov chain


Aperiodic

Did anyone notice that the definition of aperiodic in the text is plain wrong? The condition mentioned implies that the chain is aperiodic, but it is not necessary. — Preceding unsigned comment added by 136.142.151.43 (talk) 21:52, 13 February 2005‎ (UTC)[reply]
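
The standard definition takes the period of state i to be gcd{ n ≥ 1 : (P^n)_ii > 0 }, and the chain is aperiodic when every state has period 1. The 2005 wording isn't visible here, but the objection presumably concerns a sufficient-but-not-necessary condition such as p_ii > 0 (a self-loop). Below is a minimal sketch of the gcd definition, with an example chain of my own choosing (not from the article) that has no self-loops yet is aperiodic:

# Period of state i: gcd of all n >= 1 with (P^n)[i][i] > 0.
# The chain below has no self-loop, so the stronger sufficient condition
# p_ii > 0 fails, yet it can return to state 0 in both 2 and 3 steps,
# and gcd(2, 3) = 1, so it is aperiodic.
from math import gcd
import numpy as np

P = np.array([[0.0, 1.0, 0.0],   # 0 -> 1
              [0.5, 0.0, 0.5],   # 1 -> 0 or 1 -> 2
              [1.0, 0.0, 0.0]])  # 2 -> 0

def period(P, i, max_power=50):
    """gcd of the step counts n (up to max_power) at which return to state i is possible."""
    g = 0
    Pn = np.eye(len(P))
    for n in range(1, max_power + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            g = gcd(g, n)
    return g

print([period(P, i) for i in range(3)])  # [1, 1, 1]: aperiodic despite having no self-loops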

Plagiarism in Harris Chains Section

The third edition of Rick Durrett's "Probability Theory and Examples", published in 2004, has the following to say about Harris chains (see page 322):

The developments here are motivated by three ideas. First, the proofs in the last two sections work if there is one point in the state space that the chain hits with probability one. (Think, for example, about the construction of the stationary measure in (4.3).) Second, a recurrent Harris chain can be modified to contain such a point. Third, the collection of Harris chains is a comfortable level of generality; broad enough to contain a large number of interesting examples, yet restrictive enough to allow for a rich theory.

The section of this article on Harris Chains was added in 2007: https://en.wikipedia.org/w/index.php?title=Markov_chain&diff=119170716&oldid=119072272. At the time, it read:

Many results for Markov chains with finite state space can be generated into uncountable state space through Harris chains. The main idea is to see if there is a point in the state space that the chain hits with probability one. Generally, it is not true for continuous state space, however, we can define sets A and B along with a positive number ε and a probability measure ρ, such that

  1. If τ_A = inf{n ≥ 0 : X_n ∈ A}, then P_z(τ_A < ∞) > 0 for all z.
  2. If x ∈ A and C ⊂ B, then p(x, C) ≥ ε ρ(C).

Then we could collapse the sets into an auxiliary point α, and a recurrent Harris chain can be modified to contain α. Lastly, the collection of Harris chains is a comfortable level of generality, which is broad enough to contain a large number of interesting examples, yet restrictive enough to allow for a rich theory.

The phrasing hasn't changed much since then. Personally, this looks to me like a clear case of plagiarism, especially that last sentence. Furthermore, the section doesn't cite any sources. I intend to Be Bold and remove the section entirely (I lack enough understanding of Harris chains to write a suitable replacement). Hopefully someone knowledgeable can rewrite the section appropriately and add it back in. WallAdhesion (talk) 23:00, 3 April 2021 (UTC)[reply]

No knowledge about Harris chains from me, but can this be solved just by rephrasing? For instance, "The main idea is to see if there is a point in the state space that the chain hits with probability one" could be changed to "The principle applies to chains with some fixed state that is hit almost surely". As for the lack of sources... well, Probability Theory and Examples would be the source to cite, I guess. If this isn't possible, removing the section would be right because this is plagiarism as it stands. — Bilorv (talk) 19:00, 4 April 2021 (UTC)[reply]
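
For anyone wanting to see the quoted conditions in a concrete setting, here is a minimal numerical sketch with a Gaussian random-walk kernel and the illustrative choices A = B = [-1, 1] and ρ = Uniform[-1, 1]. These choices are mine, for illustration only; they are not taken from the article or from Durrett.

# Sketch of the two Harris-chain conditions for the random walk
# X_{n+1} = X_n + N(0, 1), with A = B = [-1, 1] and rho = Uniform[-1, 1]
# (illustrative choices, not from the article).
import math
import random

def phi(u):
    """One-step transition density: p(x, y) = phi(y - x), standard normal."""
    return math.exp(-u * u / 2) / math.sqrt(2 * math.pi)

# Condition 2 (minorization): for x in A and C a subset of B, p(x, C) >= eps * rho(C).
# Since rho(C) = Leb(C) / 2 on B, it suffices that phi(y - x) >= eps / 2 for all
# x in A and y in B.  The worst case is |y - x| = 2, so eps = 2 * phi(2) works.
eps = 2 * phi(2)
grid = [i / 50 - 1 for i in range(101)]        # grid over [-1, 1]
worst = min(phi(y - x) for x in grid for y in grid)
assert worst >= eps / 2
print(f"minorization constant eps ~= {eps:.3f}")

# Condition 1: from any starting point z, the chain hits A with positive probability.
# Crude Monte Carlo check from a few starting points.
def hits_A(z, steps=200):
    x = z
    for _ in range(steps):
        if -1 <= x <= 1:
            return True
        x += random.gauss(0, 1)
    return False

for z in (-5.0, 0.0, 7.5):
    freq = sum(hits_A(z) for _ in range(200)) / 200
    print(f"start {z:+.1f}: empirical frequency of hitting A = {freq:.2f}")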

Where is the reversible Markov chain section?

There used to be a reversible Markov chain section here. Where is it? If anyone thinks this is out of scope for the main Markov article, then there should be at least a subarticle about this subject.

https://en.wikipedia.org/w/index.php?title=Markov_chain&oldid=689230061#Reversible_Markov_chain

Elenktik (talk) 14:26, 4 September 2021 (UTC)[reply]

@Elenktik: after a split, because the article was getting long and unstructured, the material was moved to Discrete-time Markov chain#Reversible Markov chain. There's also relevant content at Detailed balance#Reversible Markov chains. — Bilorv (talk) 16:42, 4 September 2021 (UTC)[reply]
@Bilorv: thank you for mentioning it. Strange that it was not carried over to the new discrete Markov chain article, but just removed from the original one. Anyway, I added it to the discrete Markov chain article. Elenktik (talk) 08:34, 8 February 2022 (UTC)[reply]