google-deepmind/deepmind-research

Enformer: can I increase the input sequence size in training data?

exnx opened this issue · 0 comments

exnx commented

Hi, I have some questions about the Enformer model.

I was wondering what steps are involved in increasing the context length of the model input and of the corresponding training-data sequences.

For example, I'd like to decrease the bin size while increasing the length of the sequence being processed.
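For context, in the published Enformer configuration these quantities are linked: the 196,608 bp input is reduced to 128 bp bins by a stack of seven twofold poolings (2^7 = 128), and the outer 40,960 bp on each side are cropped before prediction, leaving 896 output bins. A minimal sketch of that arithmetic (the pooling count and crop size are assumptions based on the published model, and `output_bins` is a hypothetical helper, not part of the repo):

```python
def output_bins(seq_len_bp: int, num_poolings: int = 7, crop_bp: int = 40_960):
    """Return (bin_size_bp, num_output_bins) for a given input length.

    Assumes the published Enformer setup: each of `num_poolings` stages
    halves the resolution, and `crop_bp` base pairs are dropped per side.
    """
    bin_size = 2 ** num_poolings          # 2^7 = 128 bp per bin
    cropped = seq_len_bp - 2 * crop_bp    # central region that is scored
    return bin_size, cropped // bin_size

# Published input length of 196,608 bp yields 128 bp bins and 896 bins.
bin_size, n_bins = output_bins(196_608)
```

So decreasing the bin size would mean removing pooling stages (changing the architecture), while increasing the sequence length mainly changes the number of output bins, provided the training targets are regenerated to match.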

Has anyone done that before? Thanks!