The expected value of the second pivot in Gauss-Jordan elimination
Say I have a matrix
$$\begin{pmatrix} x_1 & x_2 \\ x_3 & x_4 \end{pmatrix}$$
with $x_1, x_2, x_3, x_4$ drawn independently and uniformly from the interval $[0,1]$.
After I do Gauss-Jordan elimination, what is the expected value of the absolute value of the second pivot (the one in the $x_4$ position)?
So far I seem to be getting a surprising answer: infinity.
The second pivot can be calculated as $x_4 - (x_3/x_1)\,x_2$. Therefore, since the four entries are independent, the sought expected value is $E\big(x_4 - (x_3/x_1)\,x_2\big) = E(x_4) - E(x_2)\,E(x_3)\,E(1/x_1) = 0.5 - 0.5 \cdot 0.5 \cdot E(1/x_1)$.
But $E(1/x_1) = \infty$ (see "Expectation of $1/x$, $x$ uniform from 0 to 1").
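The divergence of $E(1/x_1)$ can be seen directly by truncating the defining integral: a small Python sketch (my own illustration, not part of the original post) evaluating $\int_\varepsilon^1 \frac{dx}{x} = \ln(1/\varepsilon)$, which grows without bound as $\varepsilon \to 0$:

```python
import math

# E[1/X] for X ~ Uniform(0,1), truncated at eps:
#   integral of 1/x from eps to 1 equals ln(1/eps),
# which grows without bound as eps -> 0.
def truncated_mean_inverse(eps):
    return math.log(1.0 / eps)

for eps in (1e-2, 1e-4, 1e-8):
    print(eps, truncated_mean_inverse(eps))
```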
Is there an error somewhere in my thinking? Where can I read a bit more about this?
linear-algebra numerical-linear-algebra expected-value
Is the answer just $1$? The columns will be linearly independent with probability equal to $1$.
– Ekesh Kumar, Dec 8 '18 at 22:04
I don't think so. Is my calculation of the second pivot wrong? I am following the steps of Gauss-Jordan elimination. Also, the fact that the columns are (probably) linearly independent does not seem to imply anything about the value of the second pivot (except that the rows are also probably linearly independent, and then the second pivot is probably not zero).
– josinalvo, Dec 9 '18 at 13:20
asked Dec 8 '18 at 20:36 by josinalvo (edited Dec 9 '18 at 13:20)
1 Answer
Your computation appears correct to me. It's also supported by the following little bit of MATLAB code:
function gjtest(n)
mats = rand(4, n);               % each column holds the four entries of one random 2x2 matrix
cells = zeros(n, 1);
for i = 1:n
    s = reshape(mats(:,i), 2, 2);
    s(2,:) = s(2,:) - (s(2,1)/s(1,1)) * s(1,:);   % eliminate below the first pivot
    cells(i) = abs(s(2,2));      % absolute value of the second pivot
end
mean(cells)
clf;
plot(cells);
figure(gcf);
which produces, as mean values:

    n           mean
    1000        2.17...
    10000       2.14...
    100000      2.94...
    1000000     3.100...
    10000000    4.441...
which is not growing very fast, I admit, but certainly suggests that the mean is not $1$.
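The same slow growth shows up in a rough Python analogue of the experiment (a hypothetical port of my own, not the original script): estimate the sample mean of the absolute value of the second pivot for $2\times 2$ matrices with Uniform(0,1) entries.

```python
import random

# Monte Carlo estimate of E|x4 - (x3/x1)*x2| for independent
# Uniform(0,1) entries; the sample mean creeps upward with n.
def mean_abs_pivot(n, rng):
    total = 0.0
    for _ in range(n):
        # 1 - random() lies in (0, 1], which avoids division by zero
        x1, x2, x3, x4 = (1.0 - rng.random() for _ in range(4))
        total += abs(x4 - (x3 / x1) * x2)
    return total / n

rng = random.Random(0)
for n in (10_000, 100_000, 1_000_000):
    print(n, mean_abs_pivot(n, rng))
```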
So you have a random variable with an infinite population mean, but the odd characteristic that its sample mean tends to be a great deal smaller. This indicates a great deal of skew in the data. The plots from running the code above certainly suggest a very skewed distribution; if you replace "plot" with "histogram" you can see what I mean.
In fact, let me add a little more. The pivot $X$ is (up to a small perturbation) roughly $1/x_1$. How is this distributed? For $a - u \ge 1$ (so that both endpoints below lie in $[0,1]$),
\begin{align}
P(a-u < X < a+u)
&= P\left(a-u < \frac{1}{x_1} < a + u\right)\\
&= P\left(\frac{1}{a+u} < x_1 < \frac{1}{a - u}\right)\\
&= \frac{1}{a-u} - \frac{1}{a + u}\\
&= \frac{(a+u) - (a-u)}{(a-u)(a+u)}\\
&= \frac{2u}{a^2-u^2}\\
&\approx \frac{2u}{a^2}
\end{align}
for small values of $u$. Dividing by the interval length $2u$ and taking a limit of difference quotients gives the pdf
$$
p(x) = \frac{1}{x^2}, \qquad x \ge 1.
$$
This explains why there are so few large values of $X$ -- the probability falls off quadratically with the target value. But it also explains why the expected value is infinite: if you look at $x\,p(x)$, whose integral is the mean of $X$, you find you're integrating $\frac{1}{x}$, which gives you a $\log$, and the integral diverges. Then again, it diverges about as slowly as anything possibly can [I'm speaking informally here!], so it's no surprise that the values I was getting were in the $2$ to $5$ range. :)
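The derived density can be sanity-checked numerically (a sketch of my own, assuming $X = 1/x_1$ with $x_1$ uniform on $(0,1]$): $p(x) = 1/x^2$ for $x \ge 1$ implies the tail probability $P(X > t) = \int_t^\infty \frac{dx}{x^2} = \frac{1}{t}$, which a Monte Carlo tail frequency should match.

```python
import random

# Empirical check that P(X > t) = 1/t for X = 1/x1, x1 ~ Uniform(0,1].
rng = random.Random(1)
n = 200_000
# 1 - random() lies in (0, 1], avoiding division by zero
samples = [1.0 / (1.0 - rng.random()) for _ in range(n)]
for t in (2.0, 5.0, 10.0):
    freq = sum(s > t for s in samples) / n
    print(t, freq, 1.0 / t)  # empirical tail vs. predicted 1/t
```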
That is a very interesting discussion. Thanks!
– josinalvo, Dec 10 '18 at 13:07
answered Dec 9 '18 at 14:49 by John Hughes (edited Dec 9 '18 at 16:58)