Minimize distance between two lists











Writing:



expectedresults = {4, 8, 5, 1, 4, 6, 4, 1, 9, 3};
achievedresults = {3, 6, 4, 2, 10, 7, 2, 4, 8, 4};
p1 = BarChart[expectedresults, ChartStyle -> Directive[Opacity[0.1], Blue]];
p2 = BarChart[achievedresults, ChartStyle -> Directive[Opacity[0.1], Red]];
Show[p1, p2]


I get:



[bar chart: expectedresults in blue overlaid with achievedresults in red]



On the other hand, if I write:



k = 0.83;
expectedresults = {4, 8, 5, 1, 4, 6, 4, 1, 9, 3};
achievedresults = {3, 6, 4, 2, 10, 7, 2, 4, 8, 4} k;
p1 = BarChart[expectedresults, ChartStyle -> Directive[Opacity[0.1], Blue]];
p2 = BarChart[achievedresults, ChartStyle -> Directive[Opacity[0.1], Red]];
Show[p1, p2]


I get:



[bar chart: expectedresults in blue overlaid with k achievedresults in red]



Compared to the previous case, it is clear that the gap has decreased for some bars and increased for others.



Question: How can I determine the best value of k to get the smallest possible gap?





Writing:



h = -0.35;
k = 0.83;
expectedresults = {4, 8, 5, 1, 4, 6, 4, 1, 9, 3};
achievedresults = h + k {3, 6, 4, 2, 10, 7, 2, 4, 8, 4};
p1 = BarChart[expectedresults, ChartStyle -> Directive[Opacity[0.1], Blue]];
p2 = BarChart[achievedresults, ChartStyle -> Directive[Opacity[0.1], Red]];
Show[p1, p2]


I get:



[bar chart: expectedresults in blue overlaid with h + k achievedresults in red]



Question 2: Is it possible to determine the pair of values h and k that minimizes the gap?
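(Here I take the "gap" to mean the total squared difference between corresponding bars, which is also the criterion used in the answers below: with expected values $e_i$ and achieved values $a_i$, the quantity to minimize is

$$\sum_{i=1}^{10}\bigl(e_i-(h+k\,a_i)\bigr)^2,$$

with $h=0$ in the first question.)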










  • This is a related question: How to find the distance of two lists?
    – Artes
    Nov 29 at 16:03















mathematical-optimization charts






asked Nov 28 at 20:19 by TeM; edited Nov 28 at 21:27



































2 Answers

















Accepted answer, by kglr (answered Nov 28 at 20:29, edited Nov 28 at 21:52):
Update: Using two parameters:



lmf2 = LinearModelFit[data, t, t];
Normal@lmf2



1.76563 + 0.546875 t




lmf2["BestFitParameters"]



{1.76563, 0.546875}




Fit[data, {1, t}, t]



1.76563 + 0.546875 t




ClearAll[h, k]
NMinimize[Total[Subtract[expectedresults, h + k achievedresults]^2], {h, k}]



{43.3594, {h -> 1.76562, k -> 0.546875}}




N @ LeastSquares[Thread[{1, achievedresults}], expectedresults]



{1.76563, 0.546875}
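The same pair also follows from the textbook regression formulas (slope = covariance/variance, intercept from the means). A quick sketch, with kFit and hFit as purely illustrative names and the two lists assumed to be defined as in the original answer below:

kFit = Covariance[achievedresults, expectedresults]/Variance[achievedresults]  (* 35/64, i.e. 0.546875 *)
hFit = Mean[expectedresults] - kFit Mean[achievedresults]                      (* 113/64, i.e. 1.76563 *)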




Original answer:



expectedresults = {4, 8, 5, 1, 4, 6, 4, 1, 9, 3};
achievedresults = {3, 6, 4, 2, 10, 7, 2, 4, 8, 4};
data = Transpose[{achievedresults, expectedresults}];


You can use LinearModelFit or Fit or NMinimize or LeastSquares to get the value of k that minimizes the sum of squared distances between expectedresults and k achievedresults:



lmf = LinearModelFit[data, t, t, IncludeConstantBasis -> False]

Normal@lmf



0.828025 t




Normal @ LinearModelFit[{Transpose[{achievedresults}], expectedresults}]



0.828025 #1




Fit[data, {t}, t]



0.828025 t




ClearAll[k]
NMinimize[Total[Subtract[expectedresults, k achievedresults]^2], k]



{49.7134, {k -> 0.828025}}




N@LeastSquares[Thread[{achievedresults}], expectedresults]



{0.828025}
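All four approaches agree because, for the sum-of-squares criterion, the best single scale factor has the closed form $k=\dfrac{\sum_i e_i a_i}{\sum_i a_i^2}$. A quick check with a dot product (a sketch reusing the lists defined above; the exact value is 130/157):

expectedresults.achievedresults/achievedresults.achievedresults // N


0.828025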




k = lmf["BestFitParameters"][[1]]



0.828025




p1 = BarChart[expectedresults, ChartStyle -> Directive[Opacity[0.1], Blue]];
p2 = BarChart[k achievedresults, ChartStyle -> Directive[Opacity[0.1], Red]];
Show[p1, p2]


[bar chart: expectedresults in blue overlaid with k achievedresults in red]



BarChart[Transpose@{expectedresults, achievedresults,  k achievedresults}, 
ChartStyle -> {Blue, Red, Green}, ChartLayout -> "Grouped",
ChartLegends -> {"expectedresults", "achievedresults", "k achievedresults"}]


[grouped bar chart comparing expectedresults, achievedresults, and k achievedresults]






  • For general data you can always find several values of $k$ that eliminate the difference between whichever bars you like.
    – David G. Stork
    Nov 28 at 20:42










  • @TeM, please see the update.
    – kglr
    Nov 28 at 21:47










  • Perfect, mathematically it is clear to me! But I wonder whether this improves the minimization more or less than before.
    – TeM
    Nov 28 at 21:48






  • @TeM, if you compare the NMinimize results, adding the intercept parameter improves the squared loss from 49.7134 to 43.3594.
    – kglr
    Nov 28 at 21:54


















Answer by Chris (answered Nov 28 at 20:36, edited Nov 28 at 21:45):
{k, h} = PseudoInverse[{#, 1} & /@ achievedresults].expectedresults



{35/64, 113/64}
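In decimals this is the same {k, h} as the two-parameter fit in the other answer, about {0.546875, 1.76563}. Dropping the constant column, as suggested in the comment below, gives the single scale factor instead (a quick check):

PseudoInverse[Transpose[{achievedresults}]].expectedresults


{130/157}

which is about 0.828025.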







  • Why do you add a zero column? I think PseudoInverse[Transpose[{achievedresults}]].expectedresults will do.
    – MeMyselfI
    Nov 28 at 20:56






  • @MeMyselfI Nice! Even better.
    – Chris
    Nov 28 at 21:05










  • Really great!!!
    – TeM
    Nov 28 at 21:50










