kzl/decision-transformer

AttributeError: 'GPT2Config' object has no attribute 'n_ctx'

Closed this issue · 6 comments

Traceback (most recent call last):
  File "D:\Python\Test\test\T\decision-transformer-master\gym\experiment.py", line 312, in <module>
    experiment('gym-experiment', variant=vars(args))
  File "D:\Python\Test\test\T\decision-transformer-master\gym\experiment.py", line 213, in experiment
    model = DecisionTransformer(
  File "D:\Python\Test\test\T\decision-transformer-master\gym\decision_transformer\models\decision_transformer.py", line 39, in __init__
    self.transformer = GPT2Model(config)
  File "D:\Python\Test\test\T\decision-transformer-master\gym\decision_transformer\models\trajectory_gpt2.py", line 522, in __init__
    self.h = nn.ModuleList([Block(config.n_ctx, config, scale=True) for _ in range(config.n_layer)])
  File "D:\Python\Test\test\T\decision-transformer-master\gym\decision_transformer\models\trajectory_gpt2.py", line 522, in <listcomp>
    self.h = nn.ModuleList([Block(config.n_ctx, config, scale=True) for _ in range(config.n_layer)])
  File "E:\Python\python3.10.5\lib\site-packages\transformers\configuration_utils.py", line 260, in __getattribute__
    return super().__getattribute__(key)
AttributeError: 'GPT2Config' object has no attribute 'n_ctx'

Hello, I ran into the same problem. May I ask how you solved it?

May I know how you solved this problem?

@NanBorui

This issue arises from a version mismatch with huggingface/transformers. The n_ctx attribute was removed or altered in newer versions of transformers, which breaks this repository, since it was written against an older version.

As specified in conda_env.yml, installing transformers==4.5.1 should resolve this issue.

If that's not feasible, you might be able to solve it by manually altering n_ctx, referencing this commit:

huggingface/transformers@5b45422.

When I installed transformers==4.5.1, another error occurred: "packaging.version.InvalidVersion: Invalid version: '0.10.1,<0.11'". When I changed the transformers version to 4.4.2, the packaging.version problem went away, but the n_ctx error is still there.

@feifei-feifei-hub try transformers==4.12.5

If you are using a newer version of transformers, changing n_ctx to n_positions can solve this problem.
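A minimal sketch of a version-tolerant workaround (using a hypothetical stand-in class rather than the real GPT2Config, so it runs without transformers installed): reading the context length with a getattr fallback works whether or not n_ctx still exists.

```python
# Hypothetical stand-in for a newer GPT2Config, where n_ctx no longer
# exists and n_positions holds the context window size.
class NewGPT2Config:
    n_positions = 1024

config = NewGPT2Config()

# Version-tolerant read: use n_ctx if present (older transformers),
# otherwise fall back to n_positions (newer transformers).
n_ctx = getattr(config, "n_ctx", config.n_positions)
print(n_ctx)  # 1024
```

Applying the same pattern at trajectory_gpt2.py line 522 (i.e., computing n_ctx once and passing it to Block instead of accessing config.n_ctx directly) avoids pinning an old transformers version.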