I am trying to apply the Pythagorean theorem to find the distances between points in an array as I iterate through it, and then output the n closest points. I am confused about how to get d, the distance between the current point and the next point, so I can compare it to the distance between the current point and the point after that, and so forth. I start with my array of points:
var points = [
  { id: 1, x: 0.0, y: 0.0 },
  { id: 2, x: 10.1, y: -10.1 },
  { id: 3, x: -12.2, y: 12.2 },
  { id: 4, x: 38.3, y: 38.3 },
  { id: 5, x: 79.0, y: 179.0 },
];
I then want to iterate over them and, for each point, build a new array of the distances between it and the other points using the Pythagorean theorem:
points.forEach((item) => {
  var newArray = [item];
  var pt = null;
  var d = null;
  for (var i = 0; i < points.length; i = i + 1) {
    // compare this point with all of the other points
    for (var j = i + 1; j < points.length; j = j + 1) {
      // compute distance
      var curr = Math.sqrt(Math.pow(points[i][0] - points[j][0], 2) + Math.pow(points[i][1] - points[j][1], 2));
      // get the distance between each point and push to a new array
      if (d === null || curr < d) {
        o = points.id[i];
        pt = points.id[j];
        d = curr;
      }
    }
  }
  newArray.push = {
    "id": o,
    "pt": pt,
    "d": d
  };
  console.log(newArray);
});
It seems like I have some of the logic incorrect here, and I keep getting random Cannot read property '0' of undefined errors whenever I try a variation. Any suggestions on what I am doing incorrectly?
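For reference, this is my understanding of the Pythagorean distance for a single pair of these point objects (a minimal sketch; the helper name `distance` is just something I made up, and it assumes the coordinates are read via the x and y properties, since the points are objects rather than arrays):

```javascript
// Pythagorean distance between two point objects like { id, x, y }.
// The coordinates are object properties, so they are read as a.x / a.y
// (or a["x"]), not a[0] / a[1].
function distance(a, b) {
  var dx = a.x - b.x;
  var dy = a.y - b.y;
  return Math.sqrt(dx * dx + dy * dy);
}

// Example with the first two points from the array above:
var d12 = distance({ id: 1, x: 0.0, y: 0.0 }, { id: 2, x: 10.1, y: -10.1 });
console.log(d12);
```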