Computing the Jacobian fails for functions of shape (1,)
jeffrey-hokanson opened this issue · 0 comments
jeffrey-hokanson commented
Calling Jacobian with a function whose output has shape (1,) fails. Although the same quantity can be computed with Gradient, this case is useful when checking the derivatives of constraints in optimization problems.
Example

import numpy as np
import numdifftools as nd

x = np.ones(3)  # example input point (not defined in the original snippet)
fun = lambda x: np.array([np.sum(x**2) - 1])
J1 = nd.Jacobian(fun)(x)
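As noted above, Gradient does handle this case. A minimal sketch of that workaround, using the same assumed fun and x as in the example:

import numpy as np
import numdifftools as nd

x = np.ones(3)  # example input point
fun = lambda x: np.array([np.sum(x**2) - 1])

# Differentiate the scalar component with Gradient, then reshape the
# resulting gradient into the single row of the Jacobian.
row = nd.Gradient(lambda x: fun(x)[0])(x)
J1 = row.reshape(1, -1)  # shape (1, len(x))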
This fails with the message:
Traceback (most recent call last):
  File "/Users/jeffreyh/SVN/ExtOpt/tests/test_numdifftools.py", line 20, in <module>
    J1 = nd.Jacobian(fun)(x)
  File "/opt/homebrew/lib/python3.10/site-packages/numdifftools/core.py", line 431, in __call__
    return super(Jacobian, self).__call__(np.atleast_1d(x), *args, **kwds)
  File "/opt/homebrew/lib/python3.10/site-packages/numdifftools/core.py", line 288, in __call__
    results, f_xi = self._derivative(x_i, args, kwds)
  File "/opt/homebrew/lib/python3.10/site-packages/numdifftools/core.py", line 428, in _derivative_nonzero_order
    return self.fd_rule.apply(results, steps2, step_ratio), fxi
  File "/opt/homebrew/lib/python3.10/site-packages/numdifftools/finite_difference.py", line 583, in apply
    f_del, h, original_shape = self._vstack(sequence, steps)
  File "/opt/homebrew/lib/python3.10/site-packages/numdifftools/finite_difference.py", line 684, in _vstack
    h = np.vstack([np.atleast_1d(r).transpose(axes).ravel() for r in steps])
  File "/opt/homebrew/lib/python3.10/site-packages/numdifftools/finite_difference.py", line 684, in <listcomp>
    h = np.vstack([np.atleast_1d(r).transpose(axes).ravel() for r in steps])
ValueError: axes don't match array
In comparison, the same call works if the function's output has shape (2,):
import numpy as np
import numdifftools as nd

x = np.ones(3)  # same example input as above
fun = lambda x: np.array([np.sum(x**2) - 1, 0])
J1 = nd.Jacobian(fun)(x)
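Given that observation, a hedged workaround until the bug is fixed (my own sketch, not from the numdifftools docs) is to pad the output to shape (2,) and discard the dummy row afterwards:

import numpy as np
import numdifftools as nd

x = np.ones(3)  # example input point
fun = lambda x: np.array([np.sum(x**2) - 1])

# Pad the output with a dummy zero so the shape becomes (2,), which
# Jacobian handles, then keep only the first row.
padded = lambda x: np.concatenate([fun(x), [0.0]])
J1 = nd.Jacobian(padded)(x)[:1]  # shape (1, len(x))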