keras-team/keras-tuner

TypeError: 'NoneType' object is not callable when calling tuner.get_best_models()

hmf opened this issue · 2 comments

hmf commented

Describe the bug

A simple example from the documentation produces:

Exception ignored in: <function _CheckpointRestoreCoordinatorDeleter.__del__ at 0x7f7201745b40>
Traceback (most recent call last):
  File "/home/vscode/.local/lib/python3.10/site-packages/tensorflow/python/checkpoint/checkpoint.py", line 194, in __del__
TypeError: 'NoneType' object is not callable

To Reproduce

https://colab.research.google.com/drive/1LNBu5Ea1TRoyJujoo9l9IQsACTZoUMUT#scrollTo=I-_CQkbqZRB_

Note: I could not get this to run in Colab, but it works locally.

Expected behavior

No error; tuner.get_best_models() should return the best model without issues.

Additional context

None

Would you like to help us fix it?

No, but I can try any changes you suggest to diagnose the code and report back.

hmf commented

Working now. It seems to be a version issue.

Which version is best for saving checkpoints? I get this error with Bayesian optimization:
NotFoundError: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for /content/drive/MyDrive/....

import datetime
from copy import deepcopy
from datetime import timedelta

import tensorflow as tf
from pytz import timezone

resultados = {}
listaValoresNdays = [7, 14, 21, 28]  # previously range(7, 28, 10) and (1, 90, 10)
for i in listaValoresNdays:
      n_past = i
      cambio = datetime.datetime.now(timezone('UTC')) - timedelta(hours=3)
      tiempoFinal = f"{cambio:%Y-%m-%d %H:%M:%S }"
      print('Time and n_past', tiempoFinal, ' n_past:-->', n_past)
      X_train, y_train = split_series(train.values, n_past, n_future)
      X_train = X_train.reshape((X_train.shape[0], X_train.shape[1], n_features))
      y_train = y_train.reshape((y_train.shape[0], y_train.shape[1], 1))  # reshape not strictly needed
      X_test, y_test = split_series(test.values, n_past, n_future)
      X_test = X_test.reshape((X_test.shape[0], X_test.shape[1], n_features))
      y_test = y_test.reshape((y_test.shape[0], y_test.shape[1], 1))  # reshape not strictly needed
      print("post1 X_train", X_train.shape, y_test.shape)
      tuner = MyTuner(
        lambda hp: build(hp, i),
        objective='val_loss',
        max_trials=50,  # also tried 10, 20, 30
        overwrite=False,
        directory=pathModelos,
        project_name=f'SMAPE-max_trials50- epochs100 {i}')
      # tuner = kt.Hyperband(model_builder, objective="val_loss", max_epochs=10, factor=3,
      #                      directory=pathModelos, project_name='SMAPE-50-7')
      # tuner.search(X_train, y_train, epochs=30, validation_data=(X_test, y_test))
      # https://pub.towardsai.net/keras-earlystopping-callback-to-train-the-neural-networks-perfectly-2a3f865148f7
      callback = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10)  # previously patience=5 (0.2857)
      tuner.search(X_train, y_train, epochs=100, callbacks=[callback, PrintTimeCallback()],
                   validation_data=(X_test, y_test))
      resultados[str(i)] = deepcopy(tuner)
      del tuner
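The snippet does not show `split_series`. A minimal sliding-window sketch of what it presumably does follows; its exact signature and behavior here are assumptions inferred from how it is called above.

```python
import numpy as np

def split_series(series, n_past, n_future):
    """Split a 2-D array (timesteps, features) into past/future windows.

    Assumed behavior: for each start index, take n_past rows as the input
    window and the following n_future rows as the target window.
    """
    X, y = [], []
    for start in range(len(series) - n_past - n_future + 1):
        X.append(series[start:start + n_past])
        y.append(series[start + n_past:start + n_past + n_future])
    return np.array(X), np.array(y)

data = np.arange(20, dtype=float).reshape(10, 2)  # 10 timesteps, 2 features
X, y = split_series(data, n_past=3, n_future=2)
print(X.shape, y.shape)  # -> (6, 3, 2) (6, 2, 2)
```

With 10 timesteps, 6 windows fit (10 - 3 - 2 + 1), each pairing a 3-step input with a 2-step target.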