ThilinaRajapakse/simpletransformers

error when loading a saved fine-tuned model

RafieeAshkan opened this issue · 2 comments

Discussed in #1507

Originally posted by RafieeAshkan March 20, 2023
Hi,
Thank you for your excellent package. I have fine-tuned a RoBERTa model for text classification and saved the fine-tuned model. I would now like to load the saved best model and use it for inference. However, I get a runtime error as follows:

RuntimeError:
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.

Here is how I am loading the saved model:

import os
from simpletransformers.classification import ClassificationModel

model_path = os.path.join(
    "nlp", "classification", "models", "roberta_base", "best_model"
)
model = ClassificationModel("roberta", model_path, use_cuda=True)
preds, outs = model.predict(test_data)

Any help is highly appreciated.

Not sure, but this might be the same as #1375, specifically in this comment. Try the workaround provided there.
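In case the link doesn't open for you: the traceback itself points at Python's multiprocessing start method. On platforms that spawn rather than fork worker processes (e.g. Windows), any code that triggers multiprocessing has to sit under an `if __name__ == '__main__':` guard so the child processes don't re-execute the module's top level. A minimal sketch of your snippet with that guard, assuming it runs as a standalone script (`test_data` here is just a placeholder, and passing `use_multiprocessing`/`use_multiprocessing_for_evaluation` in the args is an optional extra safeguard, not necessarily the exact workaround from #1375):

import os
from simpletransformers.classification import ClassificationModel

def main():
    model_path = os.path.join(
        "nlp", "classification", "models", "roberta_base", "best_model"
    )
    # Optionally turn off multiprocessing in the model args so prediction
    # never spawns worker processes in the first place.
    model = ClassificationModel(
        "roberta",
        model_path,
        use_cuda=True,
        args={
            "use_multiprocessing": False,
            "use_multiprocessing_for_evaluation": False,
        },
    )

    test_data = ["example sentence to classify"]  # placeholder for your test_data
    preds, outs = model.predict(test_data)
    print(preds)

if __name__ == "__main__":
    # With the guard in place, spawned workers re-import this module
    # without re-running main(), which avoids the bootstrapping error.
    main()

If you keep multiprocessing enabled, the `if __name__ == '__main__':` guard alone is usually enough.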

@luketudge thank you so much, it solved my issue