Talk:Markov chain: Difference between revisions

Revision as of 14:26, 4 September 2021

Aperiodic

Did anyone notice that the definition of aperiodic in the text is plain wrong? The condition mentioned implies that the chain is aperiodic, but it is not necessary. — Preceding unsigned comment added by 136.142.151.43 (talk) 21:52, 13 February 2005 (UTC)
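For reference: the period of a state i is the gcd of all n ≥ 1 with a positive n-step return probability, and a chain is aperiodic when every state has period 1. A condition such as "some p_ii > 0" is sufficient but not necessary; which condition the 2005 article actually used is not recorded here, so that reading is only an assumption. A minimal sketch, assuming NumPy, of a chain with no self-loops that is nevertheless aperiodic:

    from math import gcd
    import numpy as np

    # 4-state chain with no self-loops: returns to state 0 take either
    # 2 steps (0 -> 1 -> 0) or 3 steps (0 -> 2 -> 3 -> 0), so the period
    # of state 0 is gcd(2, 3) = 1 even though every p_ii is 0.
    P = np.array([
        [0.0, 0.5, 0.5, 0.0],
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
        [1.0, 0.0, 0.0, 0.0],
    ])

    def period(P, i, n_max=50):
        # gcd of all n <= n_max with a positive n-step return probability to i
        g, Pn = 0, np.eye(len(P))
        for n in range(1, n_max + 1):
            Pn = Pn @ P
            if Pn[i, i] > 1e-12:
                g = gcd(g, n)
        return g

    print([period(P, i) for i in range(4)])  # [1, 1, 1, 1]: aperiodic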

Potential split

It's remarkable to me that we don't have separate articles for discrete-time Markov chains and continuous-time Markov chains, instead just having this article for both—a long article where one has to get a fair way into the body to just get a DTMC/CTMC definition. I believe it would be clearer to have three articles: an article introducing DTMCs and giving basic theorems, properties and applications (lead understandable to high school students, most of the body to early undergraduates); an article on CTMCs of a similar scope; and this article, focused on history and applications with only a brief focus on the mathematical groundwork and maybe the similarities and differences between DTMCs and CTMCs (e.g. periodicity only applies to DTMCs, which this article currently does not make clear).

Is there opposition to this view? Are there pages somewhere that I've missed that cover the basics of DTMCs or CTMCs? — Bilorv (talk) 15:05, 14 July 2020 (UTC)

This sounds like a good idea to me. (I do not volunteer to help with the labor, though, just to give a vote of confidence.) The article Stochastic matrix is something like an article on the basics of DTMCs. Examples of Markov chains includes mostly discrete examples but isn't a proper intro article. --JBL (talk) 00:04, 21 July 2020 (UTC)
I've no objection in principle to a split, and your plan for what should go where sounds reasonable. XOR'easter (talk) 05:28, 21 July 2020 (UTC)
Okay, thanks for the comments. Since there's been no objection, I've gone ahead and done this. There is still a lot of cleanup needed at all three articles, but the length of this article has now been somewhat addressed. — Bilorv (talk) 21:07, 1 August 2020 (UTC)

Plagiarism in Harris Chains Section

The third edition of Rick Durrett's "Probability Theory and Examples", published in 2004, has the following to say about Harris chains (see page 322):

The developments here are motivated by three ideas. First, the proofs in the last two sections work if there is one point in the state space that the chain hits with probability one. (Think, for example, about the construction of the stationary measure in (4.3).) Second, a recurrent Harris chain can be modified to contain such a point. Third, the collection of Harris chains is a comfortable level of generality; broad enough to contain a large number of interesting examples, yet restrictive enough to allow for a rich theory.

The section of this article on Harris Chains was added in 2007: https://en.wikipedia.org/w/index.php?title=Markov_chain&diff=119170716&oldid=119072272. At the time, it read:

Many results for Markov chains with finite state space can be generated into uncountable state space through Harris chains. The main idea is to see if there is a point in the state space that the chain hits with probability one. Generally, it is not true for continuous state space, however, we can define sets A and B along with a positive number ε and a probability measure ρ, such that

  1. If τ_A = inf{n ≥ 0 : X_n ∈ A}, then P(τ_A < ∞ | X_0 = z) > 0 for all z.
  2. If x ∈ A and C ⊆ B, then P(X_1 ∈ C | X_0 = x) ≥ ε ρ(C).

Then we could collapse the sets into an auxiliary point α, and a recurrent Harris chain can be modified to contain α. Lastly, the collection of Harris chains is a comfortable level of generality, which is broad enough to contain a large number of interesting examples, yet restrictive enough to allow for a rich theory.

The phrasing hasn't changed much since then. Personally, this looks to me like a clear case of plagiarism, especially that last sentence. Furthermore, the section doesn't cite any sources. I intend to Be Bold and remove the section entirely (I lack enough understanding of Harris chains to write a suitable replacement). Hopefully someone knowledgeable can rewrite the section appropriately and add it back in. WallAdhesion (talk) 23:00, 3 April 2021 (UTC)

No knowledge about Harris chains from me, but can this be solved just by rephrasing? For instance, "The main idea is to see if there is a point in the state space that the chain hits with probability one" could be changed to "The principle applies to chains with some fixed state that is hit almost surely". As for the lack of sources... well, Probability Theory and Examples would be the source to cite, I guess. If this isn't possible, removing the section would be right because this is plagiarism as it stands. — Bilorv (talk) 19:00, 4 April 2021 (UTC)
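If it helps whoever rewrites the section: the two conditions quoted above are a reachability condition and a minorization condition, and they can be checked concretely for a simple chain. A minimal numerical sketch, assuming NumPy/SciPy and the AR(1) chain X_{n+1} = X_n/2 + N(0, 1) with A = B = [-1, 1] and ρ uniform on B (my own illustration, not taken from Durrett or the old section):

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    # Condition 2 (minorization): for x in A = [-1, 1], the transition density
    # of X_1 = x/2 + N(0, 1) at y in B = [-1, 1] is phi(y - x/2) >= phi(1.5),
    # so P(X_1 in C | X_0 = x) >= eps * rho(C) with eps = 2 * phi(1.5)
    # (the factor 2 accounts for the uniform density 1/2 on B).
    eps = 2 * norm.pdf(1.5)
    print(f"eps = {eps:.3f} > 0")

    # Condition 1 (reachability): from any start z the chain hits A with
    # positive probability; simulation suggests it does so almost surely here.
    def hits_A(z, n_steps=200):
        x = z
        for _ in range(n_steps):
            if -1.0 <= x <= 1.0:
                return True
            x = x / 2 + rng.standard_normal()
        return False

    for z in (-50.0, 0.0, 50.0):
        print(z, np.mean([hits_A(z) for _ in range(1000)]))  # all close to 1.0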

Where is the reversible Markov chain section?

There used to be a reversible Markov chain section here. Where is it? If anyone thinks this is out of scope for the main Markov article, then there should be at least a subarticle about this subject.

https://en.wikipedia.org/w/index.php?title=Markov_chain&oldid=689230061#Reversible_Markov_chain

Elenktik (talk) 14:26, 4 September 2021 (UTC)
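For anyone looking for the material in the meantime: the removed section concerned chains whose stationary distribution π satisfies detailed balance, π_i P(i, j) = π_j P(j, i) for all i, j. A minimal sketch of that check, assuming NumPy and a small birth-death chain of my own choosing (not content from the removed section):

    import numpy as np

    # Small birth-death chain; birth-death chains always satisfy detailed balance.
    P = np.array([
        [0.5, 0.5, 0.0],
        [0.3, 0.4, 0.3],
        [0.0, 0.6, 0.4],
    ])

    # Stationary distribution: normalized left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    pi = pi / pi.sum()

    # Detailed balance: the probability flow i -> j equals the flow j -> i.
    flows = pi[:, None] * P             # flows[i, j] = pi_i * P[i, j]
    print(np.allclose(flows, flows.T))  # True, so the chain is reversible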