No train split in `adversarial_qa_dbert_answer_the_following_q_template_0to10_no_opt_x_shot`
imoneoi opened this issue · 0 comments
imoneoi commented
I can't create the `t0_submix` mixture — it reports that no training split was found for `adversarial_qa_dbert_answer_the_following_q_template_0to10_no_opt_x_shot`:
```
ERROR:absl:Failed to load task 't0_task_adaptation:adversarial_qa_dbert_answer_the_following_q_template_0to10_no_opt_x_shot' as part of mixture 't0_submix'
Traceback (most recent call last):
  File "/home/one/anaconda3/envs/flan/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/one/anaconda3/envs/flan/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/mnt/datadrive/Datasets/LLM/FLAN/flan/v2/run_example.py", line 100, in <module>
    dataset = selected_mixture.get_dataset(
  File "/home/one/anaconda3/envs/flan/lib/python3.10/site-packages/seqio/dataset_providers.py", line 1805, in get_dataset
    ds = task.get_dataset(
  File "/home/one/anaconda3/envs/flan/lib/python3.10/site-packages/seqio/dataset_providers.py", line 1443, in get_dataset
    ds = source.get_dataset(
  File "/home/one/anaconda3/envs/flan/lib/python3.10/site-packages/seqio/experimental.py", line 370, in get_dataset
    train_ds = _get_maybe_sharded_dataset(
  File "/home/one/anaconda3/envs/flan/lib/python3.10/site-packages/seqio/experimental.py", line 330, in _get_maybe_sharded_dataset
    num_shards = len(self._original_source.list_shards(split_))
  File "/home/one/anaconda3/envs/flan/lib/python3.10/site-packages/seqio/dataset_providers.py", line 511, in list_shards
    return [_get_filename(info) for info in self.tfds_dataset.files(split)]
  File "/home/one/anaconda3/envs/flan/lib/python3.10/site-packages/seqio/utils.py", line 161, in files
    split_info = self.builder.info.splits[split]
  File "/home/one/anaconda3/envs/flan/lib/python3.10/site-packages/tensorflow_datasets/core/splits.py", line 391, in __getitem__
    raise KeyError(
KeyError: "Trying to access `splits['train']` but `splits` is empty. This likely indicate the dataset has not been generated yet."