yaringal/ConcreteDropout

value to set for weight_regularizer & dropout_regularizer when dataset size is unknown

Opened this issue · 2 comments

Hi,

To calculate the values of weight_regularizer & dropout_regularizer, the dataset size N is used. What if we don't know the dataset size in advance, as in lifelong reinforcement learning, where re-training happens periodically as new data arrives?
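For context, the regularizer values depend on N roughly as in the paper: weight_regularizer = l²/(τN) and dropout_regularizer = 2/(τN). A minimal sketch (the values of l and tau below are illustrative assumptions, not taken from this repo):

```python
# Sketch of how both regularizers scale with the dataset size N.
# l: prior length-scale, tau: model precision -- both hypothetical choices here.
l = 1e-4
tau = 1.0
N = 10000  # dataset size: the quantity that is unknown in this issue

weight_regularizer = l**2 / (tau * N)
dropout_regularizer = 2.0 / (tau * N)
```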

In such cases I am not able to figure out how to use this concrete dropout implementation.

Could you please suggest how to handle such situations with Concrete Dropout? I would really appreciate it.

Thanks in advance,
Nikhil

I am facing the same issue. Did you find an answer?

You can update N as the amount of data increases (have a look at this). In effect, this will push the dropout probability p towards 0 in the limit of data.
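The suggestion above could be sketched as recomputing the regularizers from the current N at each re-training round. This is a hypothetical helper (the formulas follow the paper; `l`, `tau`, and the loop are assumptions, not part of this repo):

```python
def concrete_dropout_regularizers(N, l=1e-4, tau=1.0):
    """Recompute both regularizers for the current dataset size N.

    As N grows, both terms shrink, which in effect pushes the
    learned dropout probability p towards 0 in the limit of data.
    (l: prior length-scale, tau: model precision -- assumed values.)
    """
    weight_reg = l**2 / (tau * N)
    dropout_reg = 2.0 / (tau * N)
    return weight_reg, dropout_reg

# Schematic periodic re-training loop: grow N with the data seen
# so far, then rebuild the model with the updated regularizers.
N = 1000
for new_batch_size in (500, 2000):
    N += new_batch_size
    wr, dr = concrete_dropout_regularizers(N)
    # ...rebuild/re-train the ConcreteDropout model here with wr, dr...
```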