export and load_learner (for loading for inference) is not working
Closed this issue · 1 comment
butchland commented
My initial code, which replaces self.model with self.model.state_dict(), allows export and load_learner to work, but only if the inference model (inf_model = load_learner('export.pkl')) is also running on an XLA device.
See this notebook for a work in progress solution
But restoring the model using load_learner in an environment without XLA results in the error shown below.
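As a stdlib-only analogy of the workaround (no fastai or torch_xla imports; Model, export, load_learner, and the weight payload here are simplified stand-ins, not the real fastai API), the idea is to pickle only plain weight data via state_dict rather than the device-bound model object, so the exported file can be loaded on a machine without XLA:

```python
import io
import pickle

class XLATensor:
    """Stand-in for a device-bound tensor; refuses to be pickled directly."""
    def __init__(self, value):
        self.value = value
    def __reduce__(self):
        raise pickle.PicklingError("cannot pickle an XLA-device tensor")

class Model:
    def __init__(self):
        self.weight = XLATensor(0.5)
    def state_dict(self):
        # Plain-Python payload, safe to pickle anywhere
        return {"weight": self.weight.value}
    def load_state_dict(self, sd):
        self.weight = XLATensor(sd["weight"])

def export(model):
    buf = io.BytesIO()
    # Swap the live model for its state_dict before pickling,
    # mirroring the workaround described above
    pickle.dump(model.state_dict(), buf)
    return buf.getvalue()

def load_learner(blob):
    model = Model()  # rebuild the architecture on the target machine
    model.load_state_dict(pickle.loads(blob))
    return model

restored = load_learner(export(Model()))
print(restored.weight.value)  # → 0.5
```

Pickling the raw XLATensor would raise, which is the analogue of the failure when the exported pickle still references XLA device state.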
butchland commented
Fixed in multicore TPU mode by not moving the model in the main process to the TPU (only the spawned processes move their wrapped copies of the model to the TPU).
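A minimal sketch of that fix, using only the standard library (to_device and spawned_worker are hypothetical stand-ins for the real xmp.spawn/model-moving code): the main-process model stays on the CPU, and each spawned worker moves its own copy to its XLA device.

```python
import copy

def to_device(model, device):
    # Hypothetical stand-in for moving a model to an XLA device;
    # works on a deep copy so the caller's model is untouched
    moved = copy.deepcopy(model)
    moved["device"] = device
    return moved

def spawned_worker(rank, cpu_model):
    # Each spawned process moves its own copy to its TPU core;
    # the main-process model is never moved
    local = to_device(cpu_model, f"xla:{rank}")
    return local["device"]

main_model = {"weight": 0.5, "device": "cpu"}
worker_devices = [spawned_worker(r, main_model) for r in range(2)]
print(worker_devices)        # one XLA device per worker
print(main_model["device"])  # main-process model is still on the CPU
```

Because the main-process model never leaves the CPU, the later export step serializes CPU state and load_learner works in non-XLA environments.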