ExplainableML/czsl

Using the C-GQA dataset quickly exhausts CUDA memory

Opened this issue · 1 comment

When I use the C-GQA dataset, I get "RuntimeError: CUDA out of memory". The Nvidia GPU's memory is exhausted even on an RTX 4090 (24 GB) or a V100 (32 GB). The problem does not occur with the MIT-States or UT-Zappos datasets. Are there any preprocessing steps required before using C-GQA? I can't solve it; please help.

I've run into the same situation when testing the C-GQA dataset in an open-world setting.
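A likely cause: C-GQA has far more attribute-object pairs than MIT-States or UT-Zappos, so scoring every image against the full (open-world) pair vocabulary in one matrix product can blow past 24–32 GB. One common workaround (not from this repo, just a generic sketch with hypothetical names like `score_in_chunks`) is to compute the image-pair compatibility scores in chunks under `torch.no_grad()` so peak memory stays bounded:

```python
import torch

def score_in_chunks(img_feats, pair_embeds, chunk_size=1024):
    """Compute img_feats @ pair_embeds.T chunk by chunk.

    img_feats:   (B, D) image features
    pair_embeds: (P, D) embeddings for all attribute-object pairs
    Returns:     (B, P) score matrix, built without materializing
                 large intermediate activations all at once.
    """
    chunks = []
    with torch.no_grad():  # no autograd graph -> much lower memory at eval time
        for start in range(0, pair_embeds.size(0), chunk_size):
            chunk = pair_embeds[start:start + chunk_size]  # (<=chunk_size, D)
            chunks.append(img_feats @ chunk.t())
    return torch.cat(chunks, dim=1)
```

The result is identical to the full matrix product; only the peak memory changes. Lowering `eval_batch_size` (or the test-loader batch size in the config) has a similar effect if the model itself, rather than the scoring step, is what runs out of memory.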