python - Solve multiple independent optimizations in scipy
I need to minimize a cost function for a large number (1000s) of different inputs. Obviously, this can be implemented by looping over scipy.optimize.minimize
or some other minimization routine. Here is an example:
    import numpy as np
    import scipy as sp
    import scipy.optimize  # needed so that sp.optimize is available

    def cost(x, a, b):
        return np.sum((np.sum(a * x.reshape(a.shape), axis=1) - b)**2)

    a = np.random.randn(500, 40)
    b = np.array(np.arange(500))

    x = []
    for i in range(a.shape[0]):
        res = sp.optimize.minimize(cost, np.zeros(40), args=(a[None, i], b[None, i]))
        x.append(res.x)
It finds the x[i, :] that minimize cost for each a[i, :] and b[i], but it is slow. I guess that looping over minimize causes considerable overhead.
A partial solution is to solve for all x simultaneously:
    res = sp.optimize.minimize(cost, np.zeros_like(a), args=(a, b))
This is even slower than the loop. minimize does not know that the elements in x are group-wise independent, so it computes the full Hessian, although a block-diagonal matrix would be sufficient given the problem structure. (With 500 × 40 = 20,000 unknowns, a dense Hessian has 4 × 10^8 entries, several gigabytes in float64.) This is slow and overflows my computer's memory.
Is there a way to inform minimize, or another optimization function, about the problem structure so that it can solve multiple independent optimizations in a single function call? (Similar to certain options supported by Matlab's fsolve.)
"I guess that looping over minimize causes considerable overhead."
Wrong guess. The time required to minimize the function dwarfs the loop overhead; there is no vectorization magic for this problem.
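As a rough check, one can time the bare loop skeleton against the full run. This is my own sketch, not part of the original answer; the 500 × 40 sizes follow the question:

    import time
    import numpy as np
    from scipy.optimize import minimize

    def cost(x, a, b):
        return np.sum((np.sum(a * x.reshape(a.shape), axis=1) - b)**2)

    a = np.random.randn(500, 40)
    b = np.arange(500, dtype=float)

    # Full run: loop plus the actual minimize calls.
    t0 = time.perf_counter()
    for i in range(a.shape[0]):
        minimize(cost, np.zeros(40), args=(a[None, i], b[None, i]))
    t_full = time.perf_counter() - t0

    # Loop skeleton alone: same iteration and indexing, no optimization.
    t0 = time.perf_counter()
    for i in range(a.shape[0]):
        _ = (a[None, i], b[None, i])
    t_loop = time.perf_counter() - t0

    print(f"full run: {t_full:.2f} s, bare loop: {t_loop:.4f} s")

The bare loop should finish in a fraction of a millisecond, while the full run takes on the order of seconds, which is the point of the claim above.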
Some time can be saved by using a better starting point for each minimization. First, sort the parameters so that consecutive iterations of the loop have similar parameters. Then use the end point of the previous minimization as the starting point of the next one:
    from scipy.optimize import minimize

    a = np.sort(np.random.randn(500, 40), axis=0)  # sorted parameters
    b = np.arange(500)  # no need for np.array here, np.arange returns an ndarray

    x = []
    x0 = np.zeros(40)
    for i in range(a.shape[0]):
        res = minimize(cost, x0, args=(a[None, i], b[None, i]))
        x.append(res.x)
        x0 = res.x  # reuse the previous solution as the next starting point
This saves 30-40 percent of the execution time in my test.
Another, minor, optimization is to preallocate an ndarray of the appropriate size for the resulting x values instead of using a list and its append method. Before the loop: x = np.zeros((500, 40)); within the loop: x[i, :] = res.x.
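Combining both suggestions, the whole loop could look like this (a sketch of my own; cost is the function from the question):

    import numpy as np
    from scipy.optimize import minimize

    def cost(x, a, b):
        return np.sum((np.sum(a * x.reshape(a.shape), axis=1) - b)**2)

    a = np.sort(np.random.randn(500, 40), axis=0)  # sorted for warm starting
    b = np.arange(500)

    x = np.zeros((500, 40))  # preallocated result array
    x0 = np.zeros(40)
    for i in range(a.shape[0]):
        res = minimize(cost, x0, args=(a[None, i], b[None, i]))
        x[i, :] = res.x  # store into the preallocated array
        x0 = res.x       # warm start for the next, similar problem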