Talk:Markov property/Archive 1

I suppose that some description of the formulas would be welcome, as is done for Poisson process.

I believe that the Markov property is only the left-hand side of the first formula. The right-hand side states that this is a first-order Markov chain. Can somebody confirm this?
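For reference, here is a sketch of the distinction presumably at issue, written for a discrete-time process $(X_n)$ on a countable state space; the notation is assumed here rather than taken verbatim from the article's formula. The Markov property is the whole equality, not either side on its own:

\[
\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
  = \Pr(X_{n+1} = x \mid X_n = x_n).
\]

A higher-order (say, second-order) chain would instead allow the right-hand side to depend on the last two states, $\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1})$; collapsing the conditioning to the single most recent state is what makes the chain first-order.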

Brownian Motion

The most famous Markov processes are Markov chains, but many other processes, including Brownian motion, are Markovian.

What disqualifies Brownian motion from being a continuous-time Markov chain? Should the above say "are discrete-time Markov chains"? Josh Cherry 02:56, 15 Nov 2004 (UTC)

Upon further reading of continuous-time Markov chain, I suspect that I know the answer. But then isn't the first sentence of continuous-time Markov chain too loose a definition? Josh Cherry 03:03, 15 Nov 2004 (UTC)

I agree; the 'chain' part is poorly specified. In a Markov chain there is a sequence of states visited by the process (such as the count of a Poisson process), rather than a continuous path (as in Brownian motion). I'm planning some further edits to that page, so (if no one else does in the meantime) I'll incorporate this too. Ben Cairns 22:52, 27 Jan 2005 (UTC).
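A brief sketch of why Brownian motion is nevertheless Markovian, assuming $(W_t)$ is a standard Wiener process with natural filtration $(\mathcal{F}_t)$ and $f$ is bounded and measurable:

\[
\mathbb{E}\bigl[f(W_{t+s}) \mid \mathcal{F}_t\bigr]
  = \mathbb{E}\bigl[f\bigl((W_{t+s} - W_t) + W_t\bigr) \mid \mathcal{F}_t\bigr]
  = g(W_t),
  \qquad g(x) := \mathbb{E}\bigl[f(W_s + x)\bigr],
\]

because the increment $W_{t+s} - W_t \sim N(0, s)$ is independent of $\mathcal{F}_t$. The conditional law of the future therefore depends on the past only through the current value $W_t$, so the Markov property holds; what fails is only the 'chain' picture of a discrete sequence of visited states, since the paths are continuous and the state space is all of $\mathbb{R}$.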

redirect

I've redirected this to Markov property because it was worthless. There is an article titled Markov chain that treats discrete-time Markov processes. Markov property does not assume discrete time. This article assumed (incorrectly) not only discrete time but also a state space that was not merely discrete but actually finite. In effect, this denies that the standard Wiener process (and many others) is a Markov process! Michael Hardy 14:51, 23 June 2006 (UTC)

continuous and discrete time

Hmm, this article gives the Markov property only for continuous-time systems; it would be nice if it included the definition for discrete-time systems as well, or at least recapped sufficiently before telling the reader to go read about Markov chains. linas 23:59, 28 August 2006 (UTC)
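For what it's worth, a compact sketch of the two versions being asked for (notation assumed): for a process $(X_t)_{t \ge 0}$ adapted to a filtration $(\mathcal{F}_s)$, the Markov property reads

\[
\Pr(X_t \in A \mid \mathcal{F}_s) = \Pr(X_t \in A \mid X_s), \qquad 0 \le s \le t,
\]

and the discrete-time version is the same statement with index set $\{0, 1, 2, \dots\}$, where it reduces to

\[
\Pr(X_{n+1} \in A \mid X_0, \dots, X_n) = \Pr(X_{n+1} \in A \mid X_n).
\]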

Gobbledygook

Agree 100% with the previous. Without any description of the formulas, this article looks like science classroom blackboard scribblings in a bad made-for-TV sci-fi movie. —The preceding unsigned comment was added by Bigjimslade (talk · contribs) 14:16, 17 April 2007 (UTC).

Totally agree... It's full of variables that aren't defined, and very difficult to understand for a novice in this field. --Mighty Jay 14:19, 15 June 2007 (UTC)
Second. Jm546 (talk) 16:39, 11 January 2008 (UTC)