ifsheldon/stannum

Proxy `torch.nn.Parameter` for PyTorch optimizers

Currently, Tin and Tube are subclasses of `torch.nn.Module`, and they can have learnable parameters stored as values in Taichi fields. However, these values cannot be optimized by PyTorch optimizers, since they are not PyTorch-compatible. One way to make them PyTorch-compatible is to use a proxy `torch.nn.Parameter` and sync its values with those in the corresponding Taichi field.
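A minimal sketch of the proxy idea, using Taichi's `from_torch`/`to_torch` interop; the class and method names below (`ProxiedField`, `sync_to_field`, `sync_grad_to_param`) are hypothetical and not part of stannum's API:

```python
import taichi as ti
import torch

ti.init(arch=ti.cpu)


class ProxiedField(torch.nn.Module):
    """Expose a Taichi field to PyTorch via a proxy torch.nn.Parameter."""

    def __init__(self, n: int):
        super().__init__()
        # Taichi field holding the actual values used inside kernels.
        self.field = ti.field(ti.f32, shape=n, needs_grad=True)
        # Proxy parameter: this is what PyTorch optimizers see and update.
        self.weight = torch.nn.Parameter(torch.zeros(n))

    def sync_to_field(self):
        # Push optimizer-updated parameter values into the Taichi field
        # before launching kernels.
        self.field.from_torch(self.weight.detach())

    def sync_grad_to_param(self):
        # Pull gradients computed by Taichi autodiff back onto the proxy
        # parameter so that optimizer.step() can use them.
        grad = self.field.grad.to_torch()
        if self.weight.grad is None:
            self.weight.grad = grad
        else:
            self.weight.grad.copy_(grad)


proxy = ProxiedField(4)
opt = torch.optim.SGD(proxy.parameters(), lr=0.1)

proxy.sync_to_field()        # field now mirrors the parameter
# ... run Taichi kernels under ti.ad.Tape to fill proxy.field.grad ...
proxy.sync_grad_to_param()   # expose Taichi gradients to PyTorch
opt.step()                   # the optimizer updates the proxy parameter
```

The cost of this approach is an extra host-side copy in each direction per step, but it keeps the optimizer loop entirely standard PyTorch.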

If anyone comes up with a better solution, discussions and PRs are always welcome.

rdesc commented

Hi! I'm hoping to try doing RL with difftaichi and stannum. Any update on this issue?

ifsheldon commented

Hi @rdesc! I'm sorry, I have little time to work on this. If you can make a PR, I will be happy to review it.