[python] How to implement linear interpolation?

I thought up a rather elegant solution (IMHO), so I can't resist posting it:

from bisect import bisect_left

class Interpolate(object):
    def __init__(self, x_list, y_list):
        if any(x2 - x1 <= 0 for x1, x2 in zip(x_list, x_list[1:])):
            raise ValueError("x_list must be in strictly ascending order!")
        x_list = self.x_list = [float(x) for x in x_list]
        y_list = self.y_list = [float(y) for y in y_list]
        # Precompute the slope of each interval, so a lookup costs only
        # one multiplication and one addition.
        intervals = zip(x_list, x_list[1:], y_list, y_list[1:])
        self.slopes = [(y2 - y1) / (x2 - x1) for x1, x2, y1, y2 in intervals]

    def __getitem__(self, x):
        # Find the interval containing x (see the note on bisect_left below).
        i = bisect_left(self.x_list, x) - 1
        return self.y_list[i] + self.slopes[i] * (x - self.x_list[i])

I convert everything to float so that integer division (Python <= 2.7) won't kick in and ruin things if x1, x2, y1 and y2 are all integers for some interval. I use list comprehensions rather than map so that self.x_list and self.y_list stay real lists on Python 3 too, where map returns a lazy iterator that can't be sliced or bisected.
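To make the pitfall concrete (on Python 3, / is already true division, so only the Python 2 behaviour bites):

x1, y1, x2, y2 = 0, 0, 3, 2
print((y2 - y1) / (x2 - x1))                # Python 2: 0; Python 3: 0.666...
print((float(y2) - y1) / (float(x2) - x1))  # 0.666... on both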

In __getitem__ I'm taking advantage of the fact that self.x_list is sorted in ascending order: bisect_left finds the insertion point for x in O(log n) time, and subtracting one gives the index of the largest element smaller than x in self.x_list.
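For example, with the x_list from the usage example below:

from bisect import bisect_left

x_list = [1.0, 2.5, 3.4, 5.8, 6.0]
print(bisect_left(x_list, 4))    # 3, so i = 2: x = 4 falls in the interval [3.4, 5.8]
print(bisect_left(x_list, 2.5))  # 1, so i = 0: an exact hit uses the interval ending at 2.5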

Use the class like this:

i = Interpolate([1, 2.5, 3.4, 5.8, 6], [2, 4, 5.8, 4.3, 4])
# Get the interpolated value at x = 4:
y = i[4]
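# x = 4 lies in the interval [3.4, 5.8], whose slope is
# (4.3 - 5.8) / (5.8 - 3.4) == -0.625, so y == 5.425 (modulo float rounding)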

I've not dealt with the border conditions at all here, for simplicity, and as it is they're subtly wrong: for x <= 1, bisect_left returns 0, so i becomes -1, and Python's negative indexing silently extrapolates from the last interval instead of the first; for x > 6, slopes[i] raises an IndexError. Better would be to raise an IndexError for every out-of-range x, but this is left as an exercise for the reader. :)
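If you do want to do the exercise, a minimal version of __getitem__ with the bounds check might look like this (a sketch, not the only reasonable policy):

def __getitem__(self, x):
    # Reject anything outside [x_list[0], x_list[-1]].
    if not self.x_list[0] <= x <= self.x_list[-1]:
        raise IndexError("x = {} is out of range!".format(x))
    i = bisect_left(self.x_list, x) - 1
    # An exact hit on the first point gives i == -1; clamp it to the
    # first interval instead of letting negative indexing wrap around.
    i = max(i, 0)
    return self.y_list[i] + self.slopes[i] * (x - self.x_list[i])

With that in place, i[0.5] and i[7] both fail fast instead of silently extrapolating, and i[1] returns 2.0 as expected.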