
I frequently use Python (and occasionally Excel) to process and compare test data between multiple experiments. For example, a typical test specification would be:

1) Stabilize test temperature to a value of 20 +/- 2 degrees C
2) Hold test temperature at the stabilized value for 15-25 seconds
3) Increase temperature by 20 degrees C at a rate of 0.5 degree C/second

In some cases the data might be out of sync, which makes direct comparisons difficult. It is straightforward to normalize the data so they both start with a nominal temperature value of 20 C at time = 0 seconds, but what I really want is to synchronize the data so that the temperature ramps begin at the same time. I've tried simple algorithms that check the slope of the data to identify when the temperature increase begins, but local fluctuations in the measurements due to instrumentation result in slopes that don't reflect the overall rate of change in temperature. Are there functions in Numpy, Scipy, Pandas, etc. that can filter out these local fluctuations and identify when the temperature actually begins to increase? I do occasionally work in Excel, so if there is a more convenient way to do this in a spreadsheet I can use Excel to process the data.

The first thing that comes to mind is to numerically differentiate the data and look for the jump in the slope from 0 to 0.5.

But (as you observed) noisy data can prevent this from working well.
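For reference, a minimal sketch of that naive approach (not from the original answer; it assumes the measurements are already in NumPy arrays t and temp, and the 0.25 C/s threshold is an arbitrary choice, half the expected ramp rate):

    import numpy as np

    # Estimate the derivative of temperature with respect to time.
    dTdt = np.gradient(temp, t)

    # Index of the first sample where the slope exceeds the threshold.
    # With noisy data this can trigger on a local fluctuation rather than
    # the true start of the ramp (argmax returns 0 if nothing exceeds it).
    ramp_start_index = np.argmax(dTdt > 0.25)
    ramp_start_time = t[ramp_start_index]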

If you google for "numerical differentiation of noisy data", you'll find a lot of research on this topic, but I don't know of any off-the-shelf libraries in Python. You might be able to make some progress using a Savitzky-Golay filter (scipy.signal.savgol_filter; a short sketch appears at the end of this answer). However, that approach is probably overkill, since your signal has a very simple and specific expected structure: a constant interval followed by a ramp and then another constant. You might find that scipy.optimize.curve_fit works fine for this. Here's an example:

    from __future__ import division

    import numpy as np


    def ramp(t, temp_init, temp_final, t0, t1):
        # Constant at temp_init, linear ramp from t0 to t1, constant at temp_final.
        slope = (temp_final - temp_init) / (t1 - t0)
        y = temp_init + np.minimum(slope * np.maximum(t - t0, 0.0), temp_final - temp_init)
        return y

    y = ramp(t, temp_init, temp_final, t0, t1)
    y += 0.25*np.random.randn(*t.shape)  # Add noise.
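The lines of the original example that defined t and the parameter values aren't preserved above, but a plausible continuation (assuming t is the time array and y is the noisy synthetic measurement generated above) fits the ramp model and uses the fitted t0 to line the experiments up; the p0 values below are illustrative guesses, not values from the original answer:

    from scipy.optimize import curve_fit

    # Fit the ramp model to the noisy data.  p0 is a rough initial guess
    # for [temp_init, temp_final, t0, t1]; the numbers are illustrative.
    popt, pcov = curve_fit(ramp, t, y, p0=[15.0, 30.0, 20.0, 60.0])
    fit_temp_init, fit_temp_final, fit_t0, fit_t1 = popt

    # fit_t0 estimates when the ramp begins.  Shifting each experiment's
    # time axis by its own fit_t0 puts the ramps on a common time base.
    t_synced = t - fit_t0

For two experiments, you would fit each data set separately and then plot each against its own t - fit_t0, so that both ramps start at time zero.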

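As an aside, if you do want to try the smoothing route mentioned above rather than model fitting, scipy.signal.savgol_filter can return a smoothed derivative directly. This is a sketch continuing with the t and y arrays from the example; the window length, polynomial order, and 0.25 C/s threshold are arbitrary choices you would tune to your sampling rate and noise level:

    import numpy as np
    from scipy.signal import savgol_filter

    dt = t[1] - t[0]  # assumes uniform sampling
    # deriv=1 returns the smoothed first derivative (degrees C per second).
    dTdt_smooth = savgol_filter(y, window_length=21, polyorder=3, deriv=1, delta=dt)

    # First sample where the smoothed slope exceeds half the expected ramp rate.
    ramp_start_time = t[np.argmax(dTdt_smooth > 0.25)]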