python - Finding roots with scipy.optimize.root


I am trying to find the root y of a function called f using Python.

Here is my code:

import numpy as np
import scipy.optimize

def f(y):
    w, p1, p2, p3, p4, p5, p6, p7 = y[:8]
    t1 = w - 0.500371726*(p1**0.92894164) - (-0.998515304)*((1-p1)**1.1376649)
    t2 = w - 8.095873128*(p2**0.92894164) - (-0.998515304)*((1-p2)**1.1376649)
    t3 = w - 220.2054377*(p3**0.92894164) - (-0.998515304)*((1-p3)**1.1376649)
    t4 = w - 12.52760758*(p4**0.92894164) - (-0.998515304)*((1-p4)**1.1376649)
    t5 = w - 8.710859537*(p5**0.92894164) - (-0.998515304)*((1-p5)**1.1376649)
    t6 = w - 36.66350261*(p6**0.92894164) - (-0.998515304)*((1-p6)**1.1376649)
    t7 = w - 3.922692207*(p7**0.92894164) - (-0.998515304)*((1-p7)**1.1376649)
    t8 = p1 + p2 + p3 + p4 + p5 + p6 + p7 - 1
    return [t1, t2, t3, t4, t5, t6, t7, t8]

x0 = np.array([-0.01, 0.3, 0.1, 0.2, 0.1, 0.1, 0.1, 0.1])
sol = scipy.optimize.root(f, x0, method='lm')
print(sol)
print('solution', sol.x)
print('success', sol.success)

Python does not find the root, whatever method I try in scipy.optimize.root.

However, there is one, which I found with the function fsolve in MATLAB.

It is:

[-0.0622, 0.5855, 0.087, 0.0028, 0.0568, 0.0811, 0.0188, 0.1679].

When I specify x0 close to the root, the Python algorithm converges. The problem is that I have no a priori idea of the root with which to specify x0. In reality I am solving many equations of this type.
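As an illustration of what I mean by "close to the root", a minimal check of that kind (reusing f and the imports from the code above; the starting point is simply the MATLAB root) looks like this:

x0_near = np.array([-0.0622, 0.5855, 0.087, 0.0028, 0.0568, 0.0811, 0.0188, 0.1679])
sol = scipy.optimize.root(f, x0_near, method='lm')
print('solution', sol.x)                     # should land on the root quoted above
print('success', sol.success)                # True when started this close, as described
print('residual', np.max(np.abs(f(sol.x))))  # how well the equations are satisfied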

I really want to use Python. Can anyone help me get it to converge in Python?

OK, after some fooling around, let me focus on another aspect of optimization/root-finding algorithms. In the comments above we went back and forth about which method in scipy.optimize.root() to use. An equally important question for near-bulletproof 'automatic' root finding is zeroing in on good initial guesses. Often, good initial guesses are, in fact, not near the real answer at all. Instead, they need to be arranged so that they naturally lead the solver in the right direction.

In this particular case, the guesses were, in fact, sending the algorithm off in strange directions.

My toy reconstruction of the problem is:

import numpy as np
import scipy as sp
import scipy.optimize

def f(y):
    w, p1, p2, p3, p4, p5, p6, p7 = y[:8]

    def t(p, w, a):
        b = -0.998515304
        e1 = 0.92894164
        e2 = 1.1376649
        return w - a*p**e1 - b*(1-p)**e2

    t1 = t(p1, w, 1.0)
    t2 = t(p2, w, 4.0)
    t3 = t(p3, w, 16.0)
    t4 = t(p4, w, 64.0)
    t5 = t(p5, w, 256.0)
    t6 = t(p6, w, 512.0)
    t7 = t(p7, w, 1024.0)
    t8 = p1 + p2 + p3 + p4 + p5 + p6 + p7 - 1.0
    return (t1, t2, t3, t4, t5, t6, t7, t8)

guess = 0.0001
x0 = np.array([-1000.0, guess, guess, guess, guess, guess, guess, guess])
sol = sp.optimize.root(f, x0, method='lm')
print('w=-1000: ', sol.x, sol.success, sol.nfev, np.sum(f(sol.x)))

Note that I did not use the specific prefactors from the question (I wanted to broaden the range explored), although I kept the particular exponents on the p terms.

The real secret is in the initial guess, which I made the same for all the p terms. Having it at 0.1 or above bombed much of the time, since some terms want to go one way and others the other. Reducing it to 0.01 worked well for this problem. (I note that the w term is very robust: varying it from -1000. to +1000. had no effect on the solution.) Reducing the initial guess further has no effect on this particular problem, but it does no harm either. I would keep it small.
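To illustrate the robustness of the w guess, here is a small sketch (my own addition, reusing the toy f and the sp import from the code above) that loops over a few starting values for w and reports whether the solver converged:

for w0 in (-1000.0, -10.0, 0.0, 10.0, 1000.0):
    x0 = np.array([w0] + [0.01]*7)             # small, equal guesses for the p terms
    sol = sp.optimize.root(f, x0, method='lm')
    print('w0 =', w0, 'success:', sol.success, 'residual:', np.sum(f(sol.x)))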

Yes, I know that at least some of the p terms must end up larger than that. But starting this way puts the solver in a position from which it can, and does, proceed directly toward the real solution.
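For the system in the original question (the f with the specific prefactors), the same strategy would look like the sketch below. This is only an outline of the idea, not something I have tuned for that system, so convergence should be confirmed via sol.success:

# f here means the original function from the question, not the toy one above.
x0 = np.array([-0.01] + [0.01]*7)              # small, equal guesses for all p terms
sol = sp.optimize.root(f, x0, method='lm')
print('solution', sol.x)
print('success', sol.success)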

Good luck.

