Two labelled sets of vectors in $\mathbb{R}^n$ with the same pairwise dot products


























Given $\{x_i\}_{1 \leq i \leq m}$ and $\{y_i\}_{1 \leq i \leq m}$ such that $x_i \cdot x_j = y_i \cdot y_j$ for all $1 \leq i,j \leq m$, what can I conclude about the two sets of vectors?



Clearly they need not be identical, as $y_i = Ax_i$ for any orthogonal matrix $A$ is a valid solution. Intuitively it seems that all valid solutions may be of this form, but I am not certain that is the case, and would ideally like an algebraic proof if it is.
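For concreteness, a minimal numerical sketch of this observation (assuming numpy; the sizes and the QR-based construction of the orthogonal matrix are arbitrary choices): applying an orthogonal $Q$ to the $x_i$ leaves every pairwise dot product unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3                                       # ambient dimension and number of vectors (arbitrary)

X = rng.standard_normal((n, m))                   # columns are x_1, ..., x_m
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # a random orthogonal matrix
Y = Q @ X                                         # y_i = Q x_i

G = X.T @ X                                       # Gram matrix: G[i, j] = x_i . x_j
H = Y.T @ Y                                       # H[i, j] = y_i . y_j
print(np.allclose(G, H))                          # True: same pairwise dot products
```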



As a follow-on question, I am also curious whether the answer would change if instead we only required $x_i \cdot x_j = y_i \cdot y_j$ for all $1 \leq i \neq j \leq m$, so that we no longer had (explicitly, anyway) that $\lVert x_i \rVert = \lVert y_i \rVert$.



Any help would be much appreciated, thanks.










linear-algebra linear-transformations

asked Dec 6 '18 at 17:58 – ludog





















  • Can you try by induction?
    – Federico, Dec 6 '18 at 18:03










  • Another hint. Let's say that $c_1x_1+\dots+c_mx_m=0$. Can you prove that $c_1y_1+\dots+c_my_m=0$? Try taking the squared norm of both vectors.
    – Federico, Dec 6 '18 at 18:05












  • This should tell you that there exists a linear transformation such that $Ax_i=y_i$. Then what can you say about $A$? Does it necessarily have to be orthogonal?
    – Federico, Dec 6 '18 at 18:06










  • Ok, so $c_1x_1 + \dots + c_mx_m = 0 \Rightarrow \lVert c_1x_1 + \dots + c_mx_m \rVert^2 = 0 = \sum_{1 \leq i,j \leq m} c_ic_j\, x_i \cdot x_j = \sum_{1 \leq i,j \leq m} c_ic_j\, y_i \cdot y_j = \lVert c_1y_1 + \dots + c_my_m \rVert^2 \Rightarrow c_1y_1 + \dots + c_my_m = 0$, but why does this tell me the transformation is linear?
    – ludog, Dec 6 '18 at 20:20












  • The inverse statement is clear, but I can't see why $A(c_1x_1 + \dots + c_kx_k) = c_1A(x_1) + \dots + c_kA(x_k)$.
    – ludog, Dec 6 '18 at 20:51
















1 Answer

The following holds under the assumption of the problem: $x_i \cdot x_j = y_i \cdot y_j$ for all $1 \leq i,j \leq m$.



Lemma. If $c_1x_1+\dots+c_mx_m=0$, then also $c_1y_1+\dots+c_my_m=0$.



Proof.
$$
\begin{split}
\|c_1y_1+\dots+c_my_m\|^2
&= \sum_{1\leq i,j\leq m} c_ic_j\, y_i\cdot y_j \\
&= \sum_{1\leq i,j\leq m} c_ic_j\, x_i\cdot x_j
= \|c_1x_1+\dots+c_mx_m\|^2 = 0 .
\end{split}
$$
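A quick numerical illustration of the lemma (a sketch assuming numpy; the rank-deficient construction of the $x_i$ below is an arbitrary choice): with equal Gram matrices, any coefficient vector annihilating the $x_i$ also annihilates the $y_i$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, r = 5, 4, 3                                                # four vectors of rank 3 in R^5 (arbitrary)

X = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))    # linearly dependent columns x_i
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
Y = Q @ X                                                        # same Gram matrix: Y.T @ Y == X.T @ X

c = np.linalg.svd(X)[2][-1]                                      # right singular vector for the zero singular value
print(np.linalg.norm(X @ c))                                     # ~ 0: c_1 x_1 + ... + c_m x_m = 0
print(np.linalg.norm(Y @ c))                                     # ~ 0: the same relation holds for the y_i
```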



Now I define a map $A:\mathrm{span}\{x_1,\dots,x_m\}\to\mathrm{span}\{y_1,\dots,y_m\}$.



For $x=a_1x_1+\dots+a_mx_m\in\mathrm{span}\{x_1,\dots,x_m\}$, define
$$
Ax = a_1y_1+\dots+a_my_m.
$$

The map is well defined because if
$$
x=a_1x_1+\dots+a_mx_m=b_1x_1+\dots+b_mx_m,
$$

then letting $c_i=a_i-b_i$ we have $c_1x_1+\dots+c_mx_m = 0$, which by the previous lemma implies $c_1y_1+\dots+c_my_m=0$, which is equivalent to
$$
a_1y_1+\dots+a_my_m = b_1y_1+\dots+b_my_m.
$$



The map is clearly linear.
Moreover, $A$ preserves the norm because, as we have already computed, if $x=a_1x_1+\dots+a_mx_m$, then
$$
\begin{split}
\|Ax\|^2 &= \|a_1y_1+\dots+a_my_m\|^2
= \sum_{1\leq i,j\leq m} a_ia_j\, y_i\cdot y_j \\
&= \sum_{1\leq i,j\leq m} a_ia_j\, x_i\cdot x_j
= \|a_1x_1+\dots+a_mx_m\|^2 = \|x\|^2 .
\end{split}
$$



This means that $A$ is orthogonal on its domain of definition, which is $\mathrm{span}\{x_1,\dots,x_m\}$, and can therefore be completed to an orthogonal linear transformation $A:\mathbb{R}^n\to\mathbb{R}^n$.



Notice that $A$ is uniquely determined only on $\mathrm{span}\{x_1,\dots,x_m\}$: its extension to the complement can be arbitrary, but in particular can be orthogonal.



Edit: a proof that preserving the norm and being orthogonal are equivalent is Theorem 2.1 of math.lsa.umich.edu/~rauch/555/Conformal_Matrices.pdf.
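A computational sketch of this construction (assuming numpy; the helper name orthogonal_map is made up for the example, and using SVDs for the orthonormal bases is just one possible choice): build $A$ on $\mathrm{span}\{x_1,\dots,x_m\}$ by matching orthonormal bases of the two spans, then extend by pairing arbitrary orthonormal bases of the orthogonal complements.

```python
import numpy as np

def orthogonal_map(X, Y, tol=1e-10):
    """Given n x m arrays X, Y whose columns have equal pairwise dot products
    (X.T @ X == Y.T @ Y), return an orthogonal A with A @ X == Y."""
    U, s, Vt = np.linalg.svd(X)          # X = U @ diag(s) @ Vt (full matrices)
    r = int(np.sum(s > tol))             # r = dim span{x_i}
    Ux = U[:, :r]                        # orthonormal basis of span{x_i}
    Uy = Y @ Vt[:r].T / s[:r]            # matching orthonormal basis of span{y_i}
    Wx = U[:, r:]                        # arbitrary orthonormal basis of span{x_i}^perp
    Wy = np.linalg.svd(Y)[0][:, r:]      # arbitrary orthonormal basis of span{y_i}^perp
    return Uy @ Ux.T + Wy @ Wx.T

# example: hide an orthogonal map, then recover one from the data alone
rng = np.random.default_rng(2)
n, m = 5, 3
X = rng.standard_normal((n, m))
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
Y = Q @ X

A = orthogonal_map(X, Y)
print(np.allclose(A @ X, Y))             # True: A x_i = y_i
print(np.allclose(A.T @ A, np.eye(n)))   # True: A is orthogonal
```

Since the complement bases Wx and Wy above are arbitrary, this also makes concrete the remark in the comments that the extension off $\mathrm{span}\{x_i\}$ is not unique.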






answered Dec 7 '18 at 14:23, edited Dec 7 '18 at 16:08 – Federico
























  • Excellent, thank you. I know that orthogonal matrices preserve norms, but wasn't sure that the converse is necessarily true. After a bit of digging I found a proof that norm-preserving linear transformations are always orthogonal in Thm 2.1 of math.lsa.umich.edu/~rauch/555/Conformal_Matrices.pdf (worth editing your answer to include the link?). So does this mean there may exist multiple square matrices $A$ with $Ax_i = y_i$, but that at least one such matrix will be orthogonal?
    – ludog, Dec 7 '18 at 16:02










  • Regarding orthogonality, I thought it was a pretty well-known fact that the two notions are equivalent. I'll add the link nevertheless.
    – Federico, Dec 7 '18 at 16:07










  • "So does this mean there may exist multiple square matrices $A$ with $Ax_i=y_i$, but that at least one such matrix will be orthogonal?" Exactly. The matrix is uniquely determined on $\mathrm{span}\{x_i\}$. On the complement you can do whatever you want, but you can extend it to an orthogonal matrix. Notice that this extension is also not unique: you can extend it to two different orthogonal matrices, because you just take arbitrary bases of $\mathrm{span}\{x_i\}^\perp$ and $\mathrm{span}\{y_i\}^\perp$.
    – Federico, Dec 7 '18 at 16:11












  • Hmm ok interesting. Thanks very much for your help :)
    – ludog, Dec 7 '18 at 16:16










