[python] How to implement linear interpolation?

As I understand your question, you want to write a function y = interpolate(x_values, y_values, x) that gives you the y value at some x? The basic idea then follows these steps:

  1. Find the indices of the values in x_values which define an interval containing x. For instance, for x=3 with your example lists, the containing interval would be [x1,x2]=[2.5,3.4], and the indices would be i1=1, i2=2.
  2. Calculate the slope on this interval as (y_values[i2]-y_values[i1])/(x_values[i2]-x_values[i1]) (i.e. dy/dx).
  3. The value at x is then the value at x1 plus the slope multiplied by the distance from x1, i.e. y_values[i1] + slope*(x - x_values[i1]). A sketch putting the steps together follows this list.
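
Putting the three steps together, here is a minimal sketch (assuming x_values is sorted in strictly increasing order; the function name, the bisect-based lookup, and the error handling are my own choices, not something from your code):

    from bisect import bisect_right

    def interpolate(x_values, y_values, x):
        """Linearly interpolate y at x from sorted x_values and matching y_values."""
        if not (x_values[0] <= x <= x_values[-1]):
            raise ValueError("x is outside the interpolation range")
        # Step 1: find the indices i1, i2 of the interval containing x.
        i2 = bisect_right(x_values, x)
        if i2 == len(x_values):  # x equals the last point exactly
            i2 -= 1
        i1 = i2 - 1
        # Step 2: slope dy/dx on this interval.
        slope = (y_values[i2] - y_values[i1]) / (x_values[i2] - x_values[i1])
        # Step 3: value at x1 plus slope times the distance from x1.
        return y_values[i1] + slope * (x - x_values[i1])

For example, with placeholder lists shaped like your example (2.5 at index 1, 3.4 at index 2), interpolate([1.0, 2.5, 3.4, 5.8], [2.0, 4.0, 5.8, 4.3], 3) finds the slope (5.8-4.0)/(3.4-2.5) = 2.0 and returns 4.0 + 2.0*(3 - 2.5) = 5.0.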

You will additionally need to decide what happens if x is outside the interval of x_values: either treat it as an error, or extrapolate, assuming the slope is the same as in the first/last interval.
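
If you choose the second option, a variant of the sketch above clamps the interval indices instead of raising, so values outside the range are extended along the first or last segment (again, the name and structure are just illustrative):

    from bisect import bisect_right

    def extrapolate(x_values, y_values, x):
        """Like interpolate() above, but extends the first/last segment beyond the range."""
        i2 = bisect_right(x_values, x)
        # Clamp so [i1, i2] is always a valid segment, even for x outside the range.
        i2 = max(1, min(i2, len(x_values) - 1))
        i1 = i2 - 1
        slope = (y_values[i2] - y_values[i1]) / (x_values[i2] - x_values[i1])
        return y_values[i1] + slope * (x - x_values[i1])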

Did this help, or did you need more specific advice?