Simple proof of a theorem on convergence of series
Let $p_j\ge0,\ j=1,2,3,\dots,$ and suppose $\sum_j p_j=1.$ Is there a simple proof that $$\sum_{j=1}^\infty jp_j\tag{1}$$ converges? My question arises from the answer to this question. Consider a Markov chain with state space $\{1,2,3,\dots\}.$ If the chain is in state $1,$ it transitions to state $j$ with probability $p_j.$ If it is in state $j>1$ then it always transitions to state $j-1$. The chain is irreducible and aperiodic, so it has a unique stationary distribution. The sum $(1)$ arises in computing the stationary probabilities, so it must converge.
I've been trying unsuccessfully to find a more direct proof. None of the standard tests (root test, ratio test, Gauss's test) applies, and I haven't any other ideas. (It's equivalent to the statement that if $N$ is a random variable that takes positive integer values, then $E(N)$ exists, but I don't see how that helps. In fact, my intuition would be that this statement is false.)
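(To spell out the equivalence in that parenthesis: taking $P(N=j)=p_j$, the definition of expectation gives $$E(N)=\sum_{j=1}^\infty j\,P(N=j)=\sum_{j=1}^\infty jp_j,$$ so $(1)$ converges exactly when $E(N)$ is finite.)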
EDIT
It has been amply shown that the statement is false. I would like to know the error in the linked question.
probability sequences-and-series markov-chains
asked Dec 5 at 16:21 by saulspatz (score 5), edited Dec 5 at 16:33
Isn't this math.stackexchange.com/a/520619/42969 a counter-example?
– Martin R
Dec 5 at 16:24
You are right. What is the error in the linked question?
– saulspatz
Dec 5 at 16:27
I don't see an argument in the linked question for why this statement should be true, just a claim in a comment that "this problem shows" it to be true. Maybe if you provided more detail as to why you think the problem shows it to be true then someone could find your error.
– MartianInvader
Dec 5 at 17:52
@MartianInvader The stationary distribution exists by standard theorems on Markov chains, and the OP of the original question has shown how to calculate the stationary probabilities. If my transformation of the series is correct, then it arises in the formula for those probabilities, and it must converge. So, it must be that the transformation is wrong, or there is an error in the original problem.
– saulspatz
Dec 5 at 17:56
5 Answers
Answer by Mike Earnest (score 7, accepted), answered Dec 5 at 17:55, edited Dec 5 at 17:58
Not all aperiodic, irreducible Markov processes have a stationary distribution; that is only guaranteed for finite state spaces. For an infinite state space you need the process to be positive recurrent, meaning the expected time to return to a state is finite. Here, starting from $1$, the expected time to return to $1$ is $\sum jp_j$. Therefore, your proof goes in circles: in order for the process to have a stationary distribution, you need $\sum jp_j<\infty$, and in order to prove that, you use that the process has a stationary distribution.
When the list $(p_1,p_2,\dots)$ has too fat a tail, the process will never settle, and instead becomes more diffuse as time goes on.
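To make the non-settling behavior concrete, here is a minimal simulation sketch (Python; the sampler and variable names are illustrative, and it assumes the counterexample $p_j=\frac{6}{\pi^2 j^2}$ used in the other answers). For a positive-recurrent chain, the fraction of time spent in state $1$ would converge to a positive stationary probability; here it drifts toward $0$.

```python
import math
import random

# Minimal simulation sketch (illustrative, not from the thread), using the
# counterexample p_j = (6/pi^2) / j^2 from the other answers.

def sample_jump(rng):
    """Sample j with P(j) = (6/pi^2)/j^2 via inverse-transform sampling."""
    u = rng.random() * (math.pi ** 2 / 6)  # uniform over the total mass of sum 1/j^2
    j, cum = 1, 0.0
    while cum + 1.0 / j ** 2 < u:  # heavy tail: rare draws can loop for a long time
        cum += 1.0 / j ** 2
        j += 1
    return j

rng = random.Random(0)
state, visits_to_1, steps = 1, 0, 200_000
for _ in range(steps):
    if state == 1:
        visits_to_1 += 1
        state = sample_jump(rng)  # from state 1, jump to j with probability p_j
    else:
        state -= 1                # from state j > 1, step down to j - 1

# For a positive-recurrent chain this fraction would converge to pi(1) > 0;
# here it drifts toward 0 as the horizon grows (null recurrence).
print(visits_to_1 / steps)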
Thank you. That is what I had forgotten.
– saulspatz
Dec 5 at 17:57
Answer by Foobaz John (score 8), answered Dec 5 at 16:26
The statement is false. Put
$$
p_j=\frac{6}{\pi^2}\,\frac{1}{j^2}\quad (j\geq 1),
$$
where the constant is for normalization. Let $N$ be distributed according to this pmf. Then
$$
\sum_{j=1}^\infty jp_j=E(N)=\frac{6}{\pi^2}\sum_{j=1}^\infty\frac{1}{j}=\infty.
$$
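As a sanity check, the partial sums can be computed directly (a small Python sketch; output values are approximate):

```python
import math

# Illustrative numerical check of the counterexample p_j = (6/pi^2)/j^2.
c = 6 / math.pi ** 2  # normalizing constant
for n in (10 ** 2, 10 ** 4, 10 ** 6):
    mass = sum(c / j ** 2 for j in range(1, n + 1))  # partial sum of p_j
    mean = sum(c / j for j in range(1, n + 1))       # partial sum of j * p_j
    print(f"n={n}: sum p_j = {mass:.6f}, sum j*p_j = {mean:.3f}")
# The mass column approaches 1 while the mean column grows like (6/pi^2) * ln n.
```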
Can you find the error in the linked question?
– saulspatz
Dec 5 at 16:28
Answer by jjagmath (score 6), answered Dec 5 at 16:28
The series $\displaystyle\sum_{j=1}^\infty \frac{1}{j(j+1)}$ telescopes to $1$, since $\frac{1}{j(j+1)}=\frac1j-\frac1{j+1}$, but $\displaystyle\sum_{j=1}^\infty \frac{1}{j+1}$ diverges, so your claim is false.
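Concretely, the partial sums of the first series are exactly $1-\frac{1}{n+1}$, which a quick check confirms (Python, purely illustrative):

```python
# Illustrative check: partial sums of 1/(j(j+1)) telescope to 1 - 1/(n+1),
# while the companion series behaves like the (divergent) harmonic series.
n = 10 ** 5
s = sum(1 / (j * (j + 1)) for j in range(1, n + 1))
print(s, 1 - 1 / (n + 1))  # the two agree up to floating-point error
h = sum(1 / (j + 1) for j in range(1, n + 1))
print(h)                   # roughly ln(n): grows without bound
```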
Can you find the error in the linked question?
– saulspatz
Dec 5 at 16:29
Answer by Scientifica (score 3), answered Dec 5 at 16:25
It is known that $$\sum_{j=1}^\infty \dfrac{1}{j^2}=\dfrac{\pi^2}{6}.$$
So if you take $p_j=\frac{6}{(\pi j)^2}$, you have $\sum_{j=1}^\infty p_j=1$ yet $\sum_{j=1}^\infty jp_j=\frac{6}{\pi^2}\sum_{j=1}^\infty\frac{1}{j}=+\infty.$
Can you find the error in the linked question?
– saulspatz
Dec 5 at 16:28
@saulspatz Looked it up. Didn't find any mistake. Rather, for your answer to hold, you must show that $$\lim_{k\to +\infty}\sum_{j=k}^\infty(j-k)p_j=0,$$ which is not obvious to me.
– Scientifica
Dec 5 at 17:10
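For what it's worth, that condition does fail for the counterexamples above: with $p_j=\frac{6}{(\pi j)^2}$, the inner sum is infinite for every $k$, as a truncated computation suggests (Python sketch, $k$ chosen arbitrarily):

```python
import math

# Illustrative check that the displayed condition fails for p_j = 6/((pi*j)^2):
# the truncated tail sum below grows like (6/pi^2) * ln n for every fixed k,
# so the inner sum is infinite and the limit cannot be 0.
c = 6 / math.pi ** 2
k = 10  # any fixed k behaves the same way
for n in (10 ** 3, 10 ** 5, 10 ** 6):
    tail = sum((j - k) * c / j ** 2 for j in range(k, n + 1))
    print(n, round(tail, 3))
```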
Answer by Acccumulation (score 3), answered Dec 5 at 18:24
Another counterexample can be derived from the St. Petersburg paradox. Suppose that $p_j=\frac1j$ if $j$ is a power of $2$ (that is, $j=2^k$ with $k\ge1$), and $0$ otherwise. Then $\sum p_j = \sum_{k\ge1} 2^{-k}=1$, but $jp_j=1$ whenever $j$ is such a power of $2$, and thus $\sum jp_j$ diverges.
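A quick truncated check of this construction (Python; $K$ caps the powers of $2$ considered):

```python
# Illustrative check of the St. Petersburg-style construction, truncated at j = 2^K.
K = 40
mass = sum(2.0 ** -k for k in range(1, K + 1))                  # sum of p_{2^k} = 2^{-k}
partial = sum((2 ** k) * (2.0 ** -k) for k in range(1, K + 1))  # sum of j * p_j
print(mass, partial)  # mass = 1 - 2**-K -> 1, while partial = K exactly
```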