BoChenGroup/PyDPM

A question about training different datasets without initialization


How can I train the model on two datasets in sequence? Specifically, after training on the first dataset, how do I continue training on the second one without re-initializing the model?

Thanks

For the Bayesian generative models, there is a variable named `is_initial_local` that controls the initialization of the local parameters. Set it to `False` when you want to continue training the model on the second dataset. Thanks!

```python
def train(self, data: np.ndarray, num_epochs: int = 1, is_train: bool = True, is_initial_local: bool = True):
```
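A minimal sketch of the intended call pattern. `DemoModel` below is a stand-in with the same `train()` signature, not an actual pydpm class; with a real pydpm model you would pass `is_initial_local=False` on the second call in the same way:

```python
import numpy as np

class DemoModel:
    """Stand-in for a Bayesian generative model whose train() matches
    the pydpm signature shown above (not a real pydpm class)."""

    def __init__(self):
        self.global_params = None  # shared across datasets
        self.local_params = None   # per-dataset

    def train(self, data: np.ndarray, num_epochs: int = 1,
              is_train: bool = True, is_initial_local: bool = True):
        # Only (re)initialize the local parameters when asked to,
        # or when the model has never been trained before.
        if is_initial_local or self.local_params is None:
            self.local_params = np.zeros(data.shape[0])
        if self.global_params is None:
            self.global_params = np.zeros(data.shape[1])
        # ... parameter update steps would run here for num_epochs ...
        return self

model = DemoModel()
model.train(np.random.rand(100, 20))                         # first dataset
model.train(np.random.rand(50, 20), is_initial_local=False)  # continue, no re-init
```

The second `train()` call keeps the parameters learned on the first dataset instead of resetting them, which is the "continue training without initialization" behavior the question asks about.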