3KG Pretraining Issue: Key 'final_dim' not in 'ECGTransformerConfig'
KadenMc opened this issue · 1 comment
When trying to perform 3KG pretraining using the command `fairseq-hydra-train task.data=./manifest/total --config-dir ./fairseq-signals/examples/3kg/config/pretraining/ecg_transformer --config-name 3kg`, I receive the error:
```
omegaconf.errors.ConfigKeyError: Key 'final_dim' not in 'ECGTransformerConfig'
    full_key: final_dim
    reference_type=Optional[ECGTransformerConfig]
    object_type=ECGTransformerConfig
```
The error disappears when the `final_dim` line of the `3kg.yaml` file is commented out:

```yaml
model:
  # final_dim: 256
```
Is this an issue with the config? Is commenting out this line the correct fix?
Yes, exactly. `final_dim` is the dimension onto which the final contextualized vectors from the transformer encoder are projected, but it is used only in wav2vec2 models and is deprecated in other models such as 3kg, simclr, or cmsc.
I will handle these errors by removing the argument `final_dim` from the relevant model configurations.
Thank you for contributing :)