denizyuret/AutoGrad.jl

Dispatching on variables w.r.t. which gradients are taken

Closed this issue · 2 comments

Hi,

Firstly, I've found the package really helpful, so thanks for porting it over from Python.

It appears to be the case that one cannot do the following:

using AutoGrad
f(x::Float64) = x^2   # method defined only for Float64
df = grad(f)
df(5.0)               # MethodError: grad passes an AutoGrad.Rec{Float64}

I obtain the error ERROR: MethodError: no method matching f(::AutoGrad.Rec{Float64}). If I have interpreted this correctly, grad boxes the differentiated argument in an AutoGrad.Rec in order to record the computation, so a method annotated with a concrete type like Float64 no longer matches. It would appear that one cannot dispatch on the types of the arguments with respect to which gradients are taken, without making the function a primitive and defining the appropriate Jacobian computations. Is there a simple way to resolve this?
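For comparison, here is a minimal sketch of the workaround I have found: dropping the concrete annotation lets the tracker value flow through, though it gives up exactly the typed signature I wanted.

using AutoGrad

# Untyped method: accepts both Float64 and the AutoGrad.Rec{Float64}
# tracker that grad() passes in while recording the computation.
f(x) = x^2
df = grad(f)
df(5.0)   # returns 10.0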

Thanks,
Will

Hi Deniz,

Thanks for the quick response. I suspected that dispatch might only be possible on primitives; thanks for confirming.
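In case it helps anyone else who finds this issue, the primitive route looks roughly like this for my toy example. This is only a sketch based on my reading of the @primitive docs, so the exact gradient syntax may need adjusting:

using AutoGrad

f(x::Float64) = x^2   # keep the concretely typed method

# Registering f as a primitive makes AutoGrad generate a recorder
# method f(::AutoGrad.Rec) that unboxes the argument, calls the
# concrete method, and applies the supplied gradient expression:
# d(x^2)/dx = 2x, scaled by the incoming gradient dy.
@primitive f(x),dy,y (dy * 2x)

df = grad(f)
df(5.0)   # expected: 10.0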

Regards,
Will