# Manhattan distance of a point and a line

Xavier Décoret

The distance between a point $P$ and a line $L$ is defined as the smallest distance between $P$ and any point on the line:

$$d(P, L) = \min_{M \in L} d(P, M)$$

The Manhattan distance between two points $P = (x_P, y_P)$ and $Q = (x_Q, y_Q)$ is defined as:

$$d_1(P, Q) = |x_P - x_Q| + |y_P - y_Q|$$
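As a minimal illustration of this definition (the point values below are arbitrary examples, not from the note):

```python
def manhattan(p, q):
    # d1(P, Q) = |xP - xQ| + |yP - yQ|
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

print(manhattan((0, 0), (3, 4)))  # 7
```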

The question is then: what is the formula that gives the Manhattan distance between a point and a line?

Proposition 1   The Manhattan distance between a point $P$ of coordinates $(x_0, y_0)$ and a line $L$ of equation $ax + by + c = 0$ is given by:

$$d_1(P, L) = \frac{|a x_0 + b y_0 + c|}{\max(|a|, |b|)}$$

Since $a$ and $b$ cannot both be 0, the formula is well defined.
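The proposition can be sketched in Python and sanity-checked against a brute-force minimum over sampled points of the line (the coefficients below are arbitrary example values, not from the note):

```python
def manhattan_point_line(x0, y0, a, b, c):
    # Proposition 1: d1(P, L) = |a*x0 + b*y0 + c| / max(|a|, |b|)
    return abs(a * x0 + b * y0 + c) / max(abs(a), abs(b))

# Example line 2x + y - 4 = 0 and point P = (0, 0)
a, b, c = 2.0, 1.0, -4.0
x0, y0 = 0.0, 0.0

# Brute force: minimize |x - x0| + |y - y0| over sampled line points (b != 0 here)
brute = min(abs(x - x0) + abs(-(a * x + c) / b - y0)
            for x in [i / 1000.0 for i in range(-10000, 10000)])
exact = manhattan_point_line(x0, y0, a, b, c)
print(exact)                       # 2.0
print(abs(brute - exact) < 1e-3)   # True
```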

Proof. The proof is in two steps. First we prove that the minimum distance is attained at the vertical or horizontal projection of the point onto the line. Let $H$ (resp. $V$) be the horizontal (resp. vertical) projection of $P = (x_0, y_0)$ onto the line, and assume for now that $a \neq 0$ and $b \neq 0$. For a point $M = (x, y)$ on the line, since $y_H = y_0$ and the line has slope $-a/b$, we have (see fig. 1):

$$d_1(P, M) = |x - x_0| + |y - y_0| = |x - x_0| + \left|\tfrac{a}{b}\right| \, |x - x_H|$$

This is a convex, piecewise linear function of $x$ with breakpoints at $x_0$ and $x_H$, so it is minimized at $M = V$ or $M = H$; write $d_v$ and $d_h$ for the corresponding distances.

Thus, since $d_v = \left|\tfrac{a}{b}\right| d_h$, if $|a| \geq |b|$ we have $d_1(P, L) = d_h$, and otherwise $d_1(P, L) = d_v$.

Next, we need to compute the two distances. For the horizontal one, we must solve:

$$a x + b y_0 + c = 0$$

which gives $x_H = -\frac{b y_0 + c}{a}$. The distance is then:

$$d_h = |x_H - x_0| = \frac{|a x_0 + b y_0 + c|}{|a|}$$

Doing the same for the vertical one yields $d_v = \frac{|a x_0 + b y_0 + c|}{|b|}$. We conclude by noting that:

$$\min(d_h, d_v) = \frac{|a x_0 + b y_0 + c|}{\max(|a|, |b|)}$$

Finally, if $a = 0$ (resp. $b = 0$) the line is horizontal (resp. vertical), only the vertical (resp. horizontal) projection exists, and the formula still holds. $\square$
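The two projection distances of the proof can be computed directly; a short sketch (the line coefficients are arbitrary example values, not from the note):

```python
def projection_distances(x0, y0, a, b, c):
    # Horizontal projection H: solve a*x + b*y0 + c = 0 for x (a != 0)
    xh = -(b * y0 + c) / a
    dh = abs(xh - x0)            # = |a*x0 + b*y0 + c| / |a|
    # Vertical projection V: solve a*x0 + b*y + c = 0 for y (b != 0)
    yv = -(a * x0 + c) / b
    dv = abs(yv - y0)            # = |a*x0 + b*y0 + c| / |b|
    return dh, dv

# Example line 2x + y - 4 = 0 and point P = (0, 0)
a, b, c = 2.0, 1.0, -4.0
x0, y0 = 0.0, 0.0
dh, dv = projection_distances(x0, y0, a, b, c)
print(dh, dv)                    # 2.0 4.0
# min(dh, dv) matches the closed-form of Proposition 1
print(min(dh, dv) == abs(a * x0 + b * y0 + c) / max(abs(a), abs(b)))  # True
```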