Dispatching on variables w.r.t. which gradients are taken
willtebbutt commented
Hi,
Firstly, I've found the package really helpful, so thanks for porting it over from Python.
It appears to be the case that one cannot do the following:
using AutoGrad
f(x::Float64) = x^2
df = grad(f)
df(5.0)
I obtain the error ERROR: MethodError: no method matching f(::AutoGrad.Rec{Float64}). If I have interpreted this correctly, it would appear that one cannot dispatch on the type of the arguments with respect to which we are taking gradients without making the function a primitive and defining the appropriate Jacobian computations. Is there a simple way to resolve this?
Thanks,
Will
denizyuret commented
Hi Will,
AutoGrad works by feeding your function boxed parameters, so I recommend keeping the argument of grad a generic function. However, if you need to use typed functions, you can declare them as primitives as described in: https://github.com/denizyuret/AutoGrad.jl#extending-autograd
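For example, here is a minimal sketch of both approaches; the @primitive gradient expression below is my reading of the linked README section, so treat it as an assumption rather than tested code:

using AutoGrad

# Approach 1: leave the argument untyped so AutoGrad can pass in
# its boxed Rec values.
f(x) = x^2
df = grad(f)
df(5.0)  # 10.0

# Approach 2 (sketch): keep the typed method, but register the
# function as a primitive and supply its gradient by hand. Here dy
# is the output gradient and y the function value, following the
# README's @primitive form (assumed syntax).
g(x::Float64) = x^2
@primitive g(x),dy,y (dy * 2 * x)
dg = grad(g)
dg(5.0)  # 10.0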
hope this helps,
deniz
willtebbutt commented
Hi Deniz,
Thanks for the quick response. I figured it might only be possible to dispatch on primitives; thanks for confirming.
Regards,
Will