What is the meaning, both intuitively and mathematically, behind the probability of a set?
I'm currently taking a course on applied probability and I keep running into notation that I can't quite understand.
Let's say we are working in a probability space $(\Omega,\mathcal{F},P)$ and we have a sequence of independent, identically distributed random variables $\{X_{i} : i \in \mathbb{N}\}$, with each $X_{i}$ a martingale. We have a stopping time $\tau$. I realise that in order to make full sense of this setting I would need to give additional information about the problem, but I'd like to focus just on the meaning of the following statements:
- $\mathbb{P}(\{\tau = \infty\}) < 1$
- $\forall n \in \mathbb{N} : \mathbb{P}(\{\tau \geq n\}) \geq \mathbb{P}(\{\tau = \infty\})$
In words, I guess the first one means: "The probability that our stopping time equals infinity is smaller than $1$." But what do these statements mean with regard to our probability space? Does the first statement mean that for an arbitrary $\omega$, the probability that $\tau(\omega)$ is not equal to $\infty$ is larger than $0$? The second statement is considerably harder for me to wrap my head around than the first.
I'm struggling quite a bit with this relatively elementary notation, so I hope I don't get laughed off this board; any help is appreciated. The notation becomes especially hard to deal with once random walks and stopping times for those kinds of processes are introduced.
probability probability-theory conditional-probability expected-value
Yes, the first statement means that $P[\{\omega \in \Omega : \tau(\omega) = \infty\}] < 1$, so $P[\{\omega \in \Omega : \tau(\omega) < \infty\}] > 0$. The second one may make more sense if you consider that for any natural number $n$ we have $$\{\omega \in \Omega : \tau(\omega) = \infty\} \subseteq \{\omega \in \Omega : \tau(\omega) \geq n\}$$ and recall that $A \subseteq B \implies P[A] \leq P[B]$.
– Michael
8 hours ago
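For reference, the two observations in this comment can be written out explicitly (nothing new here, just the complement rule and monotonicity applied to the sets above):
$$P[\{\tau < \infty\}] = 1 - P[\{\tau = \infty\}] > 0, \qquad P[\{\tau = \infty\}] \leq P[\{\tau \geq n\}] \quad \text{for every } n \in \mathbb{N}.$$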
Could you explain in words what the two statements mean though? Let's say we select an arbitrary $\omega \in \Omega$, then what do the statements mean in regard to this $\omega$? Does it mean for the first statement that there is a positive probability that for this $\omega$ I will have $\tau(\omega) = \infty$? Please explain it in the most basic way possible by sketching a scenario. Thanks in advance.
– S. Crim
8 hours ago
asked 9 hours ago by S. Crim
1 Answer
1) The statement $P[\tau = \infty] < 1$ means that if we randomly select an outcome $\omega$ to produce $\tau(\omega)$, the probability that $\tau(\omega)$ equals infinity is less than 1. Indeed, it implies that the probability that $\tau(\omega)$ is less than infinity is larger than 0.
2) The second statement means that if we compare the probability that $\tau(\omega) = \infty$ to the probability that $\tau(\omega) \geq n$, the first is less than or equal to the second. This is because if $\tau(\omega) = \infty$ then $\tau(\omega) \geq n$ must also be true. In particular:
$$\{\tau(\omega) = \infty\} \subseteq \{\tau(\omega) \geq n\}$$
and recall that $A \subseteq B \implies P[A] \leq P[B]$.
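To make the two statements concrete, here is a small simulation sketch (a hypothetical example, not part of the question's setup): a random walk with downward drift, stopped the first time it reaches level $+3$. Runs that have not stopped within the step budget stand in for $\{\tau = \infty\}$, so the printed numbers are rough estimates that merely illustrate the inequalities.

```python
import random

# Hypothetical illustration (not from the question): a random walk with
# downward drift, stopped the first time it reaches level +3.  Because of the
# drift, the walk may never reach +3, so P(tau = infinity) lies strictly
# between 0 and 1.

def sample_tau(p_up=0.4, target=3, max_steps=2_000):
    """Return the stopping time tau, or None if the walk has not stopped
    within max_steps (a stand-in for 'tau = infinity' in this simulation)."""
    position = 0
    for step in range(1, max_steps + 1):
        position += 1 if random.random() < p_up else -1
        if position >= target:
            return step
    return None

random.seed(0)
samples = [sample_tau() for _ in range(5_000)]

# Statement 1: the fraction of runs that never stop within the step budget
# approximates P(tau = infinity), and it comes out strictly less than 1.
p_never = sum(t is None for t in samples) / len(samples)
print(f"estimate of P(tau = infinity): {p_never:.3f}")

# Statement 2: the tail probabilities P(tau >= n) decrease in n, but they can
# never fall below P(tau = infinity), because {tau = infinity} is a subset of
# {tau >= n} for every n.
for n in (1, 5, 25, 100, 500):
    p_tail = sum(t is None or t >= n for t in samples) / len(samples)
    print(f"estimate of P(tau >= {n:3d}): {p_tail:.3f}  (>= {p_never:.3f})")
```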
You can draw a Venn diagram with a blob $A$ inside a larger blob $B$ to understand the statement $A \subseteq B \implies P[A] \leq P[B]$.
– Michael
8 hours ago
Why do you call the selected $\omega$ an "outcome"? Is there a specific reason for that, as in it is the outcome of some process, or is it just vocabulary?
– S. Crim
8 hours ago
This is standard probability vocabulary: a particular element of the sample space $\Omega$ is called an outcome, so $\Omega$ is just the set of all possible outcomes. Intuitively, you can think of running a probability experiment, and the particular outcome of the experiment is the particular $\omega$ value that arises. If we consider the experiment of rolling a die, then $\Omega = \{1, 2, 3, 4, 5, 6\}$ and a particular outcome is, for example, "2". An event is a set of outcomes, for example $A = \{2, 4, 6\} = \{\text{roll an even number}\}$.
– Michael
8 hours ago
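For what it's worth, the fair-die example above translates directly into a few lines of code (an illustration only; the names are made up): the sample space is the set of outcomes, an event is a subset of it, and under the uniform measure the probability of an event is the fraction of outcomes it contains.

```python
from fractions import Fraction

# Fair-die illustration (hypothetical example): outcomes, events, and the
# uniform probability measure on a finite sample space.
omega = {1, 2, 3, 4, 5, 6}          # sample space: the set of all outcomes

def prob(event):
    """Probability of an event (a subset of omega) under the uniform measure."""
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}                # event "roll an even number"
B = {2, 3, 4, 5, 6}          # event "roll at least 2"

print(prob(A))                       # 1/2
print(prob(B))                       # 5/6
print(A <= B, prob(A) <= prob(B))    # A is a subset of B, so P(A) <= P(B): True True
```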
Thanks a lot, you have been very helpful!
– S. Crim
8 hours ago
answered 8 hours ago by Michael