tensor not found when using tf.estimator.WarmStartSettings
formath opened this issue · 3 comments
formath commented
I construct a model `pScore = pSee * pCtr`, where `pCtr` has an old checkpoint and `pSee` is newly added. The variables in `pCtr` are restored via warm start, while the `debias` name scope of `pSee` is excluded.
```python
warm_start_conf = tf.estimator.WarmStartSettings(
    ckpt_to_initialize_from='/xxx/checkpoint/',
    vars_to_warm_start=['^(?!.*debias)'])
classifier = tf.estimator.Estimator(
    model_fn=self.model_fn,
    warm_start_from=warm_start_conf,
    params={
        'feature_columns': self.feature_columns_dict,
        'lr': self.lr,
        'optimizer': self.optimizer,
        'task_type': FLAGS.task_type
    },
    config=tf.estimator.RunConfig(
        session_config=session_config,
        model_dir=self.checkpoint_path,
        tf_random_seed=2022,
        save_summary_steps=self.save_summary_steps,
        log_step_count_steps=self.every_n_steps,
        save_checkpoints_steps=self.save_checkpoint_and_eval_step,
        keep_checkpoint_max=90)
)
```
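As a side note on the exclusion pattern: `vars_to_warm_start` entries are ordinary Python regexes, so `^(?!.*debias)` uses a negative lookahead to match any variable name that does not contain `debias` anywhere. A minimal sketch of that behavior (the non-`debias` names below are illustrative, not taken from the real model):

```python
import re

# Same pattern as in the WarmStartSettings above: a negative lookahead
# that rejects any name containing the substring "debias".
pattern = re.compile(r'^(?!.*debias)')

names = [
    'input_layer/userid_embedding/embedding_weights',  # warm started
    'pctr_dnn/dnn_1/kernel',                           # warm started (hypothetical name)
    'debias/psee_dnn/dnn_1/kernel',                    # excluded (hypothetical name)
]
# A zero-width match object is truthy, so pattern.match acts as a filter.
warm_started = [n for n in names if pattern.match(n)]
```

Only the first two names pass the filter; anything under the `debias` scope is skipped, which matches the intent described above.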
However, an error occurs:

```
ValueError: Tensor input_layer/userid_embedding/embedding_weights is not found in /xxx/checkpoint
```
I am sure those embedding weights exist in the `pCtr` checkpoint:
```
input_layer/userid_embedding/embedding_weights/part_0-partition_filter_offset
input_layer/userid_embedding/embedding_weights/part_5-partition_offset
input_layer/userid_embedding/embedding_weights/part_1/Adam-freqs_filtered
input_layer/userid_embedding/embedding_weights/part_1/Adam_1-freqs
input_layer/userid_embedding/embedding_weights/part_4/Adam-partition_offset
input_layer/userid_embedding/embedding_weights/part_1/Adam_1-keys_filtered
input_layer/userid_embedding/embedding_weights/part_4-partition_filter_offset
input_layer/userid_embedding/embedding_weights/part_5-versions_filtered
input_layer/userid_embedding/embedding_weights/part_5/Adam_1-keys_filtered
input_layer/userid_embedding/embedding_weights/part_5-keys
input_layer/userid_embedding/embedding_weights/part_0/Adam-freqs_filtered
input_layer/userid_embedding/embedding_weights/part_1-freqs_filtered
input_layer/userid_embedding/embedding_weights/part_0/Adam_1-versions_filtered
input_layer/userid_embedding/embedding_weights/part_4-values
input_layer/userid_embedding/embedding_weights/part_5/Adam-freqs
input_layer/userid_embedding/embedding_weights/part_1/Adam-versions_filtered
input_layer/userid_embedding/embedding_weights/part_0-keys_filtered
input_layer/userid_embedding/embedding_weights/part_2-values
...
```
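One way to see why the error can occur even though the data is present: the checkpoint stores only partition-level keys such as `.../part_0-keys_filtered`, while warm start asks for the unpartitioned parent name. A minimal pure-Python sketch of that mismatch, using a few checkpoint names from the listing above (the lookup logic is a simplification of what `init_from_checkpoint` effectively does, not the real implementation):

```python
# A small sample of the per-partition keys actually stored in the checkpoint.
ckpt_names = {
    'input_layer/userid_embedding/embedding_weights/part_0-keys_filtered',
    'input_layer/userid_embedding/embedding_weights/part_5-keys',
    'input_layer/userid_embedding/embedding_weights/part_4-values',
}
# The unpartitioned name that warm start looks up.
requested = 'input_layer/userid_embedding/embedding_weights'

# Exact-name lookup misses the per-partition keys -> the ValueError path.
found_exact = requested in ckpt_names
# A prefix match shows the data is actually there under partition suffixes.
found_prefix = any(n.startswith(requested + '/') for n in ckpt_names)
```

Here `found_exact` is `False` while `found_prefix` is `True`, which is consistent with the embedding existing in the checkpoint yet the exact tensor name not being found.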
candyzone commented
@formath Thanks. Please share the full error message, including the call stack. It seems something went wrong when parsing the tensor names from the checkpoint.
formath commented
@candyzone `userid_embedding` is a partitioned EV (embedding variable). The other variables restore without problems, so I guess partitioned variables need special handling.
```
Traceback (most recent call last):
  File "prerank_debias.py", line 1854, in <module>
    tf.app.run()
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/platform/app.py", line 40, in run
    _run(main=main, argv=argv, flags_parser=_parse_flags_tolerate_undef)
  File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 308, in run
    _run_main(main, args)
  File "/usr/local/lib/python3.6/dist-packages/absl/app.py", line 254, in _run_main
    sys.exit(main(argv))
  File "prerank_debias.py", line 1839, in main
    model.run()
  File "prerank_debias.py", line 1690, in run
    tf.estimator.train_and_evaluate(classifier, train_spec, eval_spec)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/training.py", line 479, in train_and_evaluate
    return executor.run()
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/training.py", line 646, in run
    getattr(self, task_to_run)()
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/training.py", line 652, in run_chief
    return self._start_distributed_training()
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/training.py", line 829, in _start_distributed_training
    saving_listeners=saving_listeners)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py", line 373, in train
    loss = self._train_model(input_fn, hooks, saving_listeners)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py", line 1195, in _train_model
    return self._train_model_default(input_fn, hooks, saving_listeners)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py", line 1229, in _train_model_default
    saving_listeners)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_estimator/python/estimator/estimator.py", line 1404, in _train_with_estimator_spec
    warm_starting_util.warm_start(*self._warm_start_settings)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/training/warm_starting_util.py", line 531, in warm_start
    checkpoint_utils.init_from_checkpoint(ckpt_to_initialize_from, vocabless_vars)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/training/checkpoint_utils.py", line 293, in init_from_checkpoint
    init_from_checkpoint_fn)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/distribute/distribute_lib.py", line 1940, in merge_call
    return self._merge_call(merge_fn, args, kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/distribute/distribute_lib.py", line 1947, in _merge_call
    return merge_fn(self._strategy, *args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/training/checkpoint_utils.py", line 288, in <lambda>
    ckpt_dir_or_file, assignment_map)
  File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/training/checkpoint_utils.py", line 321, in _init_from_checkpoint
    tensor_name_in_ckpt, ckpt_dir_or_file, variable_map
ValueError: Tensor input_layer/userid_embedding/embedding_weights is not found in /xxx/checkpoint/model.ckpt-755185 checkpoint {'user_dnn/dnn_2/bias/Adam_1': [64], 'user_dnn/dnn_2/bias/Adam': [64], 'user_dnn/dnn_2/bias': [64], 'user_dnn/dnn_1/kernel/Adam': [256, 128], 'user_dnn/dnn_1/kernel': [256, 128], 'user_dnn/dnn_1/bn/moving_variance': [128], 'user_dnn/dnn_1/bn/moving_mean': [128], 'user_dnn/dnn_1/bn/gamma/Adam_1': [128], 'user_dnn/dnn_1/bn/beta': [128], 'user_dnn/dnn_1/bias/Adam_1': [128], ........
```
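For diagnosing this kind of mismatch, it can help to list the checkpoint's variable names and group the per-partition keys back under their parent name. A minimal sketch (in a real session the names would come from the standard `tf.train.list_variables(ckpt_dir)` API; here they are stubbed with a few names from the listing above so the grouping logic is self-contained, and the regex for collapsing partition suffixes is an assumption based on the naming seen in this checkpoint):

```python
import re
from collections import defaultdict

# Stubbed checkpoint names; in practice:
#   names = [name for name, shape in tf.train.list_variables(ckpt_dir)]
names = [
    'input_layer/userid_embedding/embedding_weights/part_0-keys_filtered',
    'input_layer/userid_embedding/embedding_weights/part_1/Adam-freqs_filtered',
    'input_layer/userid_embedding/embedding_weights/part_4-values',
    'user_dnn/dnn_2/bias',
]

groups = defaultdict(list)
for name in names:
    # Collapse '/part_N...' suffixes back to the parent variable name.
    parent = re.sub(r'/part_\d+.*$', '', name)
    groups[parent].append(name)
```

Here `input_layer/userid_embedding/embedding_weights` shows up as a parent key even though no checkpoint entry carries that exact name, which is consistent with the guess that partitioned variables need special handling in the warm-start lookup.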
Confucius-hui commented
Did you fix this issue?