Orthogonal regression fitting with the scipy least squares method

The leastsq method in the scipy library fits a curve to some data. This method assumes that the Y values depend on some X argument, and it minimizes the distance between the curve and each data point along the Y axis only (dy).

But what if I need to minimize the distance in both axes (dx and dy)?
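For reference, a rough sketch of the two objectives (f is the model, p its parameters): ordinary least squares minimizes only the vertical residuals, while orthogonal distance regression (what ODRPACK implements) also allows a shift delta_i along the x axis for each point and minimizes the combined distance:

\min_{p} \sum_i ( y_i - f(x_i; p) )^2                                  (vertical residuals only)
\min_{p,\delta} \sum_i [ ( y_i - f(x_i + \delta_i; p) )^2 + \delta_i^2 ]   (orthogonal distances)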

Is there a way to implement this calculation?

Here is a code sample using the one-axis calculation:

import numpy as np
from scipy.optimize import leastsq

xData = [some data...]
yData = [some data...]

def mFunc(p, x, y):
    return y - (p[0]*x**p[1])  # it takes only the y axis into account

plsq, ier = leastsq(mFunc, [1, 1], args=(xData, yData))  # ier is the integer status flag
print(plsq)

I recently tried the scipy.odr library, and it returns the proper results only for a linear function. For other functions like y = a*x^b it returns wrong results. This is how I use it:

from scipy.odr import Model, Data, ODR

def f(p, x):
    return p[0]*x**p[1]

myModel = Model(f)
myData = Data(xData, yData)
myOdr = ODR(myData, myModel, beta0=[1, 1])
myOdr.set_job(fit_type=0)  # if set fit_type=2, returns the same as leastsq
out = myOdr.run()
out.pprint()

This returns wrong, undesired results, and for some input data they are not even close to the real values. Maybe there is some special way of using it; what am I doing wrong?


Answer

I’ve found the solution. Scipy’s odrpack works normally, but it needs a good initial guess to produce correct results. So I divided the process into two steps.

First step: find the initial guess by using the ordinary least squares method.

Second step: pass this initial guess to ODR as the beta0 parameter.

And it works very well, with acceptable speed.
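Here is a minimal sketch of that two-step approach, assuming the same power-law model y = a*x^b as above; the xData/yData arrays below are synthetic, for illustration only:

import numpy as np
from scipy.optimize import leastsq
from scipy.odr import Model, Data, ODR

# Synthetic data generated from y = 2 * x**1.5, for illustration only
xData = np.linspace(1.0, 10.0, 20)
yData = 2.0 * xData**1.5

def f(p, x):
    # power-law model y = a * x**b, in the (beta, x) form scipy.odr expects
    return p[0] * x**p[1]

def residuals(p, x, y):
    # vertical residuals for the ordinary least squares step
    return y - f(p, x)

# Step 1: ordinary least squares to get a reasonable initial guess
p0, ier = leastsq(residuals, [1.0, 1.0], args=(xData, yData))

# Step 2: orthogonal distance regression, seeded with that guess via beta0
myModel = Model(f)
myData = Data(xData, yData)
myOdr = ODR(myData, myModel, beta0=p0)
myOdr.set_job(fit_type=0)  # fit_type=0 selects explicit ODR
out = myOdr.run()
out.pprint()

If you have error estimates for x and y, scipy.odr.RealData can be used instead of Data to weight the fit accordingly.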

Thank you guys, your advice directed me to the right solution.
