What is the meaning, both intuitively and mathematically, behind the probability of a set?

I'm currently studying a course on applied probability and I keep running into notation that I can't quite understand.

Let's say we are working in a probability space $(\Omega,\mathcal{F},P)$ and we have a sequence of independent, identically distributed random variables $\{X_{i} : i \in \mathbb{N}\}$, with each $X_{i}$ a martingale. We have a stopping time $\tau$. I realise that in order to make any sense of this setting I would need to give additional information about the problem, but I'd like to focus only on the meaning of the following statements:

  1. $\mathbb{P}(\{\tau = \infty\}) < 1$

  2. $\forall n \in \mathbb{N}: \mathbb{P}(\{\tau \geq n\}) \geq \mathbb{P}(\{\tau = \infty\})$

I guess in words the first one means: "The probability that our stopping time equals infinity is smaller than $1$." But what do these statements mean with regard to our probability space? Does the first statement mean that for an arbitrary $\omega$, the probability of $\tau(\omega)$ not being equal to $\infty$ is larger than $0$? The second statement is considerably harder to wrap my head around than the first one.

I'm struggling quite a bit with this relatively elementary notation, so I hope I don't get laughed off this board, but any help is appreciated. The notation becomes especially hard to deal with when random walks and stopping times for those kinds of processes get introduced.

  • Yes, the first statement means that $P[\{\omega \in \Omega : \tau(\omega) = \infty\}] < 1$, so $P[\{\omega \in \Omega : \tau(\omega) < \infty\}] > 0$. The second one may make more sense if you consider that for any natural number $n$ we have $$\{\omega \in \Omega : \tau(\omega) = \infty\} \subseteq \{\omega \in \Omega : \tau(\omega) \geq n\}$$ and recall that $A \subseteq B \implies P[A] \leq P[B]$.
    – Michael
    8 hours ago

  • Could you explain in words what the two statements mean, though? Let's say we select an arbitrary $\omega \in \Omega$; what do the statements then mean with regard to this $\omega$? Does the first statement mean that there is a positive probability that for this $\omega$ I will have $\tau(\omega) = \infty$? Please explain it in the most basic way possible by sketching a scenario. Thanks in advance.
    – S. Crim
    8 hours ago

1 Answer

1) The statement $P[\tau = \infty] < 1$ means that if we randomly select an outcome $\omega$ to produce $\tau(\omega)$, the probability that our $\tau(\omega)$ is equal to infinity is less than 1. Indeed, it implies that the probability that our $\tau(\omega)$ is less than infinity is larger than 0.

2) The second statement means that if we compare the probability that $\tau(\omega) = \infty$ to the probability that $\tau(\omega) \geq n$, the first is less than or equal to the second. This is because if $\tau(\omega) = \infty$, then $\tau(\omega) \geq n$ must also be true. In particular,
$$\{\tau(\omega) = \infty\} \subseteq \{\tau(\omega) \geq n\},$$
and recall that $A \subseteq B \implies P[A] \leq P[B]$.
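
(For completeness, here is why the subset relation forces the inequality; it uses nothing beyond additivity of $P$. If $A \subseteq B$, then $B = A \cup (B \setminus A)$ is a disjoint union, so
$$P[B] = P[A] + P[B \setminus A] \geq P[A].$$
In the setting above one can say a bit more: the events $\{\tau \geq n\}$ shrink as $n$ grows and their intersection over all $n \in \mathbb{N}$ is exactly $\{\tau = \infty\}$, so by continuity of probability from above
$$P[\tau \geq n] \downarrow P[\tau = \infty] \quad \text{as } n \to \infty,$$
which is a slightly stronger reading of statement 2.)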

  • You can draw a Venn diagram with a blob $A$ inside a larger blob $B$ to understand the statement $A \subseteq B \implies P[A] \leq P[B]$.
    – Michael
    8 hours ago

  • Why do you call the selected $\omega$ an "outcome"? Is there a specific reason for that, as in it is the outcome of some process, or is it just vocabulary?
    – S. Crim
    8 hours ago

  • This is standard probability vocabulary: a particular element of the sample space $\Omega$ is called an outcome, so $\Omega$ is just the set of all possible outcomes. Intuitively, you can think of running a probability experiment, and the particular outcome of the experiment is the particular $\omega$ value that arises. If we consider the experiment of rolling a die, then $\Omega = \{1, 2, 3, 4, 5, 6\}$ and a particular outcome is, for example, "2". An event is a subset of outcomes, for example $A = \{2, 4, 6\} = \{\text{roll an even number}\}$. (A short runnable sketch of this vocabulary, together with a toy stopping time, appears just after these comments.)
    – Michael
    8 hours ago

  • Thanks a lot, you have been very helpful!
    – S. Crim
    8 hours ago
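
Since the question mentions random walks and the comments ask for a concrete scenario, here is a minimal, purely illustrative Python sketch (a toy example, not taken from the question or the answer). It spells out the die vocabulary from the comment above, and then simulates a random walk with downward drift together with the stopping time $\tau$ = "first time the walk reaches level $+1$", estimating $P[\tau \geq n]$ and $P[\tau = \infty]$ by Monte Carlo. The step probabilities, the cap on path length, and the function name sample_tau are all assumptions made for the illustration; since we cannot simulate forever, "$\tau = \infty$" here really means "not stopped within the cap".

    import random

    # --- The die example from the comment above ---
    # The sample space Omega is the set of all outcomes; an event is a subset of it.
    omega_die = {1, 2, 3, 4, 5, 6}
    event_even = {2, 4, 6}
    # For a fair die, P[A] = |A| / |Omega|.
    print("P[roll an even number] =", len(event_even) / len(omega_die))  # 0.5

    # --- A toy stopping-time scenario (step probabilities and cap are assumptions) ---
    # Random walk starting at 0: step +1 with probability 0.4, -1 with probability 0.6.
    # tau = first time the walk reaches level +1.  Because of the downward drift,
    # some sample paths never reach +1, so 0 < P[tau = infinity] < 1.
    def sample_tau(p_up=0.4, cap=2000):
        """One simulated outcome omega -> tau(omega); float('inf') if +1 is not reached within cap steps."""
        position = 0
        for step in range(1, cap + 1):
            position += 1 if random.random() < p_up else -1
            if position == 1:
                return step
        return float("inf")

    random.seed(0)
    taus = [sample_tau() for _ in range(5000)]

    # Statement 1: P[tau = infinity] < 1 -- a positive fraction of paths does stop.
    p_inf = sum(t == float("inf") for t in taus) / len(taus)
    print("estimated P[tau = infinity] ~", p_inf)

    # Statement 2: for every n, P[tau >= n] >= P[tau = infinity],
    # because every omega with tau(omega) = infinity also satisfies tau(omega) >= n.
    for n in (1, 2, 5, 10, 100):
        p_ge_n = sum(t >= n for t in taus) / len(taus)
        print(f"estimated P[tau >= {n}] ~ {p_ge_n:.3f}  (never below {p_inf:.3f})")

Running this, the estimate of $P[\tau = \infty]$ should land near $1/3$ (for these step probabilities the probability of ever reaching $+1$ is $0.4/0.6 = 2/3$), and the estimates of $P[\tau \geq n]$ decrease as $n$ grows but never drop below it, which is exactly the content of the two statements in the question.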