joelberkeley/spidr

Reuse constructed `XlaOp`s rather than rebuilding them every time

When a `Tensor` is used more than once, e.g. `x` in `x = 1; y = x + x`, a new `XlaOp` is created for each usage rather than reusing the existing `XlaOp`. This wastes time constructing, and likely compiling, the graph, and may use more memory than necessary.

Refactor the internals of `Tensor` and the tensor ops so that constructed values are reused.
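The usual fix for this pattern is to memoize graph construction: cache the op built for each tensor node and return the cached op on repeated visits. A minimal illustrative sketch in Python (not spidr's actual code; `Op`, `Builder`, and the tuple-based tensor encoding are invented for illustration):

```python
# Toy graph builder that caches the op built for each tensor node,
# so shared subexpressions are constructed once, not once per usage.

class Op:
    """A node in a toy computation graph."""
    def __init__(self, kind, *inputs):
        self.kind = kind
        self.inputs = inputs

class Builder:
    def __init__(self):
        self.num_built = 0  # count constructions to show the saving
        self._cache = {}    # tensor identity -> already-built Op

    def build(self, tensor):
        # Reuse the op if this tensor was already lowered.
        key = id(tensor)
        if key in self._cache:
            return self._cache[key]
        # tensor is a tuple: (kind, *input_tensors)
        op = Op(tensor[0], *(self.build(t) for t in tensor[1:]))
        self.num_built += 1
        self._cache[key] = op
        return op

# x = 1; y = x + x  --  x appears twice in the expression tree
x = ("constant",)
y = ("add", x, x)

b = Builder()
b.build(y)
print(b.num_built)  # 2: one shared op for x, one for the add
```

Without the cache, `build` would construct three ops (two for `x`, one for the add); with it, the second usage of `x` returns the cached op.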

Completed by #247, #244, and #250.