GauravIyer/MAML-Pytorch

What is the difference between argforward() and forward() functions in Experiment 1 SineNet(nn.Module)?

Closed this issue · 2 comments

Thanks for sharing your code. May I ask what the difference is between argforward() and forward() in Experiment_1_Sine_Regression.ipynb?

Thanks

Functionally, argforward() and forward() are the same, and forward() is actually redundant -- thanks for pointing this out! I'll remove forward() soon; I probably typed it up on auto-pilot and forgot to remove it. Essentially, there is no real difference between the two.

To answer your question and clear things up a little more: I implemented argforward() so that I could use a set of custom weights for evaluation. This is important for the "inner loop" in MAML where you temporarily update the weights of the network for a task to calculate the meta-loss and then reset them for the next meta-task.
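To illustrate the idea, here is a minimal sketch (not the exact code from the notebook) of an argforward()-style network: the forward pass takes an explicit list of weights, so the inner loop can evaluate a temporary, task-adapted set of weights without ever mutating the model's own parameters. The layer sizes and learning rate are assumptions for the example.

```python
import torch
import torch.nn.functional as F

class SineNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Assumed architecture for illustration: 1 -> 40 -> 40 -> 1
        self.hidden1 = torch.nn.Linear(1, 40)
        self.hidden2 = torch.nn.Linear(40, 40)
        self.out = torch.nn.Linear(40, 1)

    def argforward(self, x, weights):
        # weights = [w1, b1, w2, b2, w3, b3]; these may be "fast" weights
        # produced by an inner-loop update rather than self.parameters()
        x = F.relu(F.linear(x, weights[0], weights[1]))
        x = F.relu(F.linear(x, weights[2], weights[3]))
        return F.linear(x, weights[4], weights[5])

net = SineNet()
x = torch.randn(5, 1)
y = torch.sin(x)

# Inner loop: one gradient step producing temporary "fast" weights
weights = list(net.parameters())
loss = F.mse_loss(net.argforward(x, weights), y)
grads = torch.autograd.grad(loss, weights, create_graph=True)
fast_weights = [w - 0.01 * g for w, g in zip(weights, grads)]

# Meta-loss evaluated with the fast weights; the network's own
# parameters are untouched and ready for the next meta-task
meta_loss = F.mse_loss(net.argforward(x, fast_weights), y)
```

Because the update never touches net.parameters(), "resetting" for the next task is free: you simply discard fast_weights.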

Once again, thanks for pointing out the redundancy! :)

@GauravIyer No problem. Thanks a lot for sharing your code and answering my question.

I tried to understand MAML with a simple single-neuron example (no bias, weight = 1) and adapted your code to implement it. But I am still not sure whether my understanding is correct. I share my notes in this link.
Eq. 12 shows the chain rule, but I am not sure whether Eq. 14, the second-order gradient of the MSE loss, is calculated over the support set or the query set.

I did not see this second-order gradient explicitly in your code, or am I missing something?
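For context, my understanding is that in PyTorch the second-order term never appears as explicit code: it is carried automatically by autograd when the inner-loop gradient is computed with create_graph=True. A minimal sketch with a single weight (no bias, w = 1, MSE loss), using made-up support/query values and an assumed inner learning rate of 0.1:

```python
import torch

# Single-parameter toy example: model is y_hat = w * x, MSE loss
w = torch.tensor(1.0, requires_grad=True)
x_support, y_support = torch.tensor(2.0), torch.tensor(1.0)
x_query, y_query = torch.tensor(3.0), torch.tensor(2.0)
lr_inner = 0.1

# Inner step on the SUPPORT set; create_graph=True keeps the graph so
# the meta-gradient can differentiate through this update (second order)
support_loss = (w * x_support - y_support) ** 2
(g,) = torch.autograd.grad(support_loss, w, create_graph=True)
w_fast = w - lr_inner * g

# Meta-loss on the QUERY set, differentiated back to the original w;
# the chain rule here includes d(w_fast)/dw = 1 - lr * d2L_support/dw2
query_loss = (w_fast * x_query - y_query) ** 2
(meta_grad,) = torch.autograd.grad(query_loss, w)
```

Here g = 2(wx - y)x = 4, so w_fast = 0.6, and d(w_fast)/dw = 1 - 0.1 * 2x^2 = 0.2, giving meta_grad = 2(0.6 * 3 - 2) * 3 * 0.2 = -0.24; a first-order approximation (create_graph=False, detached update) would instead give -1.2.
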