
Math Help - Linear Programming/ Markov Chain

  1. #1
    Member
    Joined
    Aug 2007
    Posts
    239

    Linear Programming/ Markov Chain

    We want to minimize  \bold{c} \bold{x} subject to  \bold{A} \bold{x} = \bold{b}, \ \bold{x} \geq \bold{0} .

     \bold{A} is an  m \times n matrix of fixed constants,  \bold{c} = (c_1, \ldots, c_n) and  \bold{b} = (b_1, \ldots, b_m) are vectors of fixed constants, and  \bold{x} = (x_1, \ldots, x_n) is the  n -vector of nonnegative values to be chosen to minimize  \bold{c} \bold{x} \equiv \sum_{i=1}^{n} c_{i} x_{i} . If  n > m , then the optimal  \bold{x} can always be chosen to have at least  n-m components equal to  0 .
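    For concreteness (my own illustrative example, not from the book): take  m = 1 ,  n = 3 , and minimize  x_1 + 2x_2 + 3x_3 subject to  x_1 + x_2 + x_3 = 1, \ x_i \geq 0 . The optimum is  \bold{x} = (1, 0, 0) with value  1 , and it has  n - m = 2 components equal to  0 , as the result says it can.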

    Now model the algorithm as a Markov chain: if the algorithm is currently at the  j th best extreme point, then after the next pivot the resulting extreme point is equally likely to be any of the  j-1 best. We want  E[T_i] , the expected number of transitions needed to go from state  i to state  1 .

    Then why does  E[T_i] = 1 + \frac{1}{i-1} \sum_{j=1}^{i-1} E[T_j] ? This accounts only for the initial transition, right? They probably conditioned on some variable, but I am not seeing it.

    Ultimately  E[T_i] = \sum_{j=1}^{i-1} \frac{1}{j} .
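    For what it's worth, here is one way to see how the recursion gives that sum (my own derivation sketch, not from the book). Multiply the recursion through by  i-1 , and write it also with  i-1 in place of  i :

     (i-1) E[T_i] = (i-1) + \sum_{j=1}^{i-1} E[T_j]

     (i-2) E[T_{i-1}] = (i-2) + \sum_{j=1}^{i-2} E[T_j]

    Subtracting the second from the first gives  (i-1) E[T_i] - (i-2) E[T_{i-1}] = 1 + E[T_{i-1}] , i.e.  E[T_i] = E[T_{i-1}] + \frac{1}{i-1} for  i \geq 3 . Since  E[T_1] = 0 and  E[T_2] = 1 directly from the recursion, this telescopes to  E[T_i] = \sum_{j=1}^{i-1} \frac{1}{j} .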

    Source: An Introduction to Probability Models by Sheldon Ross

  2. #2
    Member
    Joined
    Aug 2007
    Posts
    239
    I guess what I am asking is: why did they set it up that way in the first place?

  3. #3
    Grand Panjandrum
    Joined
    Nov 2005
    From
    someplace
    Posts
    14,972
    Thanks
    4
    Quote Originally Posted by shilz222 View Post
    Now model the algorithm as a Markov chain: if the algorithm is currently at the  j th best extreme point, then after the next pivot the resulting extreme point is equally likely to be any of the  j-1 best. We want  E[T_i] , the expected number of transitions needed to go from state  i to state  1 .

    Then why does  E[T_i] = 1 + \frac{1}{i-1} \sum_{j=1}^{i-1} E[T_j] ? This accounts only for the initial transition, right? They probably conditioned on some variable, but I am not seeing it.
    If  i > 1 , then after one transition the chain is in any of states  1, \ldots, i-1 , each with probability  \frac{1}{i-1} .

    So:

     E(T_i) = 1 + \frac{1}{i-1} E(T_{1}) + \frac{1}{i-1} E(T_{2}) + \cdots + \frac{1}{i-1} E(T_{i-1}) = 1 + \frac{1}{i-1} \sum_{j=1}^{i-1} E[T_j]
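    As a quick sanity check (my own script, not part of the original reply), the recursion can be evaluated exactly with rational arithmetic and compared against the claimed harmonic sum:

    ```python
    from fractions import Fraction

    def expected_transitions(n):
        """E[T_i] for i = 1..n via E[T_1] = 0 and
        E[T_i] = 1 + (1/(i-1)) * sum_{j<i} E[T_j]."""
        E = [None, Fraction(0)]  # index 0 unused, so E[i] matches state i
        for i in range(2, n + 1):
            E.append(1 + sum(E[1:i]) / (i - 1))
        return E

    def harmonic_sum(i):
        """sum_{j=1}^{i-1} 1/j, the claimed closed form for E[T_i]."""
        return sum((Fraction(1, j) for j in range(1, i)), Fraction(0))

    E = expected_transitions(10)
    for i in range(1, 11):
        assert E[i] == harmonic_sum(i)
    print(E[10])  # 7129/2520, the 9th harmonic number
    ```

    The two agree exactly for every  i , which is a nice confirmation that the conditioning on the first transition is all that is going on.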

    RonL
