# Minimize distance between two lists

Writing:

expectedresults = {4, 8, 5, 1, 4, 6, 4, 1, 9, 3};
achievedresults = {3, 6, 4, 2, 10, 7, 2, 4, 8, 4};
p1 = BarChart[expectedresults, ChartStyle -> Directive[Opacity[0.1], Blue]];
p2 = BarChart[achievedresults, ChartStyle -> Directive[Opacity[0.1], Red]];
Show[p1, p2]


I get:

On the other hand, if I write:

k = 0.83;
expectedresults = {4, 8, 5, 1, 4, 6, 4, 1, 9, 3};
achievedresults = {3, 6, 4, 2, 10, 7, 2, 4, 8, 4} k;
p1 = BarChart[expectedresults, ChartStyle -> Directive[Opacity[0.1], Blue]];
p2 = BarChart[achievedresults, ChartStyle -> Directive[Opacity[0.1], Red]];
Show[p1, p2]


I get:

where it is clear that, compared with the previous case, the gap has decreased for some bars and increased for others.

Question: How can I determine the best value of k to get the smallest possible gap?
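If the gap is measured as the sum of squared differences (one natural choice, and the one the NMinimize-based answers below use), this is a one-parameter least-squares problem with a closed-form solution. A sketch, writing $e_i$ for expectedresults and $a_i$ for achievedresults:

```latex
\min_{k}\ \sum_{i=1}^{n}\left(e_i - k\,a_i\right)^2
\quad\Longrightarrow\quad
0 = \frac{d}{dk}\sum_{i}\left(e_i - k\,a_i\right)^2
  = -2\sum_{i} a_i\left(e_i - k\,a_i\right)
\quad\Longrightarrow\quad
k^{*} = \frac{\sum_{i} a_i e_i}{\sum_{i} a_i^{2}}
```

For the lists above, $\sum_i a_i e_i = 260$ and $\sum_i a_i^2 = 314$, so $k^{*} = 130/157 \approx 0.828$.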

Writing:

h = -0.35;
k = 0.83;
expectedresults = {4, 8, 5, 1, 4, 6, 4, 1, 9, 3};
achievedresults = h + k {3, 6, 4, 2, 10, 7, 2, 4, 8, 4};
p1 = BarChart[expectedresults, ChartStyle -> Directive[Opacity[0.1], Blue]];
p2 = BarChart[achievedresults, ChartStyle -> Directive[Opacity[0.1], Red]];
Show[p1, p2]


I get:

Question 2: Is it possible to determine the pair of values h, k that minimizes the gap?
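If the gap is again measured as the sum of squared differences, the two-parameter case is ordinary simple linear regression of expectedresults against achievedresults; setting both partial derivatives to zero gives the standard normal equations (a sketch, with $\bar a$, $\bar e$ the means of the two lists):

```latex
\min_{h,k}\ \sum_{i=1}^{n}\bigl(e_i - h - k\,a_i\bigr)^2
\quad\Longrightarrow\quad
k^{*} = \frac{n\sum_i a_i e_i - \sum_i a_i \sum_i e_i}{n\sum_i a_i^{2} - \left(\sum_i a_i\right)^{2}},
\qquad
h^{*} = \bar{e} - k^{*}\,\bar{a}
```

For these lists: $k^{*} = (10 \cdot 260 - 50 \cdot 45)/(10 \cdot 314 - 50^2) = 350/640 = 35/64 \approx 0.547$ and $h^{*} = 4.5 - (35/64) \cdot 5 = 113/64 \approx 1.766$.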

• This is a related question: How to find the distance of two lists?
– Artes
Nov 29 at 16:03

mathematical-optimization charts

edited Nov 28 at 21:27

TeM


expectedresults = {4, 8, 5, 1, 4, 6, 4, 1, 9, 3};
achievedresults = {3, 6, 4, 2, 10, 7, 2, 4, 8, 4};
data = Transpose[{achievedresults, expectedresults}];


You can use LinearModelFit, Fit, NMinimize, or LeastSquares to get the value of k that minimizes the sum of squared distances between expectedresults and k achievedresults:

lmf = LinearModelFit[data, t, t, IncludeConstantBasis -> False]

Normal@lmf


0.828025 t

Normal @ LinearModelFit[{Transpose[{achievedresults}], expectedresults}]


0.828025 #1

Fit[data, {t}, t]


0.828025 t

ClearAll[k]
NMinimize[Total[Subtract[expectedresults, k achievedresults]^2], k]


{49.7134, {k -> 0.828025}}

N @ LeastSquares[Thread[{achievedresults}], expectedresults]


{0.828025}

k = lmf["BestFitParameters"][[1]]


0.828025

p1 = BarChart[expectedresults, ChartStyle -> Directive[Opacity[0.1], Blue]];
p2 = BarChart[k achievedresults, ChartStyle -> Directive[Opacity[0.1], Red]];
Show[p1, p2]


BarChart[Transpose@{expectedresults, achievedresults, k achievedresults},
 ChartStyle -> {Blue, Red, Green}, ChartLayout -> "Grouped",
 ChartLegends -> {"expectedresults", "achievedresults", "k achievedresults"}]


Update: Using two parameters:

lmf2 = LinearModelFit[data, t, t];
Normal@lmf2


1.76563 + 0.546875 t

lmf2["BestFitParameters"]


{1.76563, 0.546875}

Fit[data, {1, t}, t]


1.76563 + 0.546875 t

ClearAll[h, k]
NMinimize[Total[Subtract[expectedresults, h + k achievedresults]^2], {h, k}]


{43.3594, {h -> 1.76562, k -> 0.546875}}

N @ LeastSquares[Thread[{1, achievedresults}], expectedresults]


{1.76563, 0.546875}

• For general data you can always find several values of $k$ that eliminate the difference between whichever bars you like.
– David G. Stork
Nov 28 at 20:42

• @TeM, please see the update.
– kglr
Nov 28 at 21:47

• Perfect, mathematically it is clear to me! But I wonder whether this actually improves the minimization compared with before.
– TeM
Nov 28 at 21:48

• @TeM, if you compare the NMinimize results, adding the intercept parameter improves the squared loss from 49.7134 to 43.3594.
– kglr
Nov 28 at 21:54

{k, h} = PseudoInverse[{#, 1} & /@ achievedresults].expectedresults


{35/64, 113/64}
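This works because, for the design matrix $X$ whose rows are $\{a_i, 1\}$ (with $a_i$ the entries of achievedresults and $e$ the vector expectedresults), the pseudoinverse yields exactly the least-squares solution when $X$ has full column rank:

```latex
\begin{pmatrix} k \\ h \end{pmatrix}
= X^{+}\,e
= \left(X^{\top}X\right)^{-1} X^{\top} e ,
\qquad
X = \begin{pmatrix} a_1 & 1 \\ \vdots & \vdots \\ a_n & 1 \end{pmatrix}
```

i.e. the usual normal-equation solution of the least-squares problem; since the inputs are integers, the result comes out as exact rationals, $35/64 = 0.546875$ and $113/64 = 1.765625$.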

• Why do you add a zero column? I think PseudoInverse[Transpose[{achievedresults}]].expectedresults will do
– MeMyselfI
Nov 28 at 20:56

• @MeMyselfI Nice! Even better
– Chris
Nov 28 at 21:05

• Really great!!!
– TeM
Nov 28 at 21:50

edited Nov 28 at 21:52

kglr

edited Nov 28 at 21:45

Chris

