lucidrains/invariant-point-attention

Equivariance test for IPA Transformer

amrhamedp opened this issue · 1 comments

@lucidrains I would like to ask about the equivariance of the full IPA Transformer (not just the IPA blocks). Did you check that the output stays equivariant once the local points are transformed into global points using the updated quaternions and translations? I am not sure why this test fails in my case.
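For reference, here is a minimal sketch of the kind of equivariance check I mean, using NumPy and a toy stand-in model instead of the actual `IPATransformer` (the `toy_model` function and its behavior are illustrative assumptions, not the repo's code): an SE(3)-equivariant map `f` should satisfy `f(R x + t) = R f(x) + t` for any rotation `R` and translation `t`.

```python
import numpy as np

def random_rotation(rng):
    # QR decomposition of a random Gaussian matrix yields an orthogonal Q
    q, r = np.linalg.qr(rng.standard_normal((3, 3)))
    # normalize column signs, then flip one column if det(Q) = -1,
    # so Q is a proper rotation (det = +1)
    q = q * np.sign(np.diag(r))
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1
    return q

def toy_model(points):
    # hypothetical stand-in for the IPA Transformer's coordinate update:
    # pull each point halfway toward the centroid, which is SE(3)-equivariant
    centroid = points.mean(axis=0, keepdims=True)
    return 0.5 * (points + centroid)

def is_equivariant(model, points, rot, trans, atol=1e-6):
    # check f(R x + t) == R f(x) + t
    out_of_transformed = model(points @ rot.T + trans)
    transformed_out = model(points) @ rot.T + trans
    return np.allclose(out_of_transformed, transformed_out, atol=atol)

rng = np.random.default_rng(0)
points = rng.standard_normal((16, 3))
rot = random_rotation(rng)
trans = rng.standard_normal(3)
print(is_equivariant(toy_model, points, rot, trans))  # True for this toy map
```

For the real model one would replace `toy_model` with a call that returns the predicted global coordinates, and compare them under a random rigid transform of the inputs in the same way.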

As a follow-up question: what does equivariance actually mean in this context, and why does it matter?