performance - How to speed-up python curve_fit over a 2D array?


I have to use SciPy's curve_fit function on a large set of data (5,000,000 fits). I've created a 2D array: the first dimension is the number of fits to perform, the second is the number of points used for each fit.

import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 1, 2, 3, 4])

for d in np.ndindex(data.shape[0]):
    try:
        popt, pcov = curve_fit(func, t, np.squeeze(data[d, :]), p0=[1000, 100])
    except RuntimeError:
        print("Error - curve_fit failed")

Multiprocessing can be used to speed up the full process, but it is still quite slow. Is there a way to use curve_fit in a "vectorized" manner?

One way to speed it up is by adding prior knowledge to curve_fit.

If you know the range in which you expect the parameters to be, and if you don't need precision up to the 100th significant figure, you can speed up the computations massively.

Here is an example, in which you'd be fitting param1 and param2:

import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 1, 2, 3, 4])

def func(t, param1, param2):
    return param1 * t + param2 * np.exp(t)

for d in np.ndindex(data.shape[0]):
    try:
        popt, pcov = curve_fit(func, t, np.squeeze(data[d, :]), p0=[1000, 100],
                               bounds=([min_param1, min_param2], [max_param1, max_param2]),
                               ftol=0.5, xtol=0.5)
    except RuntimeError:
        print("Error - curve_fit failed")

Note the key arguments: bounds, ftol and xtol. You can read about them here.
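A small self-contained sketch of the effect. The bounds, noise level and "true" parameter values below are made up for illustration; loosening ftol and xtol lets the solver terminate after far fewer iterations at the cost of precision:

```python
import numpy as np
from scipy.optimize import curve_fit

def func(t, param1, param2):
    return param1 * t + param2 * np.exp(t)

t = np.array([0, 1, 2, 3, 4], dtype=float)
rng = np.random.default_rng(0)
# Synthetic data from assumed "true" parameters plus noise
y = func(t, 1200.0, 80.0) + rng.normal(0, 10, t.size)

# Default (tight) tolerances
popt_tight, _ = curve_fit(func, t, y, p0=[1000, 100])

# Loose tolerances with bounds: faster, slightly less precise
popt_loose, _ = curve_fit(func, t, y, p0=[1000, 100],
                          bounds=([0, 0], [5000, 500]),
                          ftol=0.5, xtol=0.5)
```

Over 5,000,000 fits, shaving even a few solver iterations per fit adds up to a large saving.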

