BobaZooba/xllm-demo

TypeError: Registry.add() got an unexpected keyword argument 'item'

rshoemake-launch opened this issue · 5 comments

I'm trying to run this basic command directly, and getting the following error:

(xllm) rshoemake@jeremiah:~/git/xllm-demo$ PYTHONPATH=. python xllm_demo/cli/prepare.py
[2023-11-16 15:24:24,708] [INFO] [real_accelerator.py:158:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
  File "/home/rshoemake/git/xllm-demo/xllm_demo/cli/prepare.py", line 21, in <module>
    components_registry()
  File "/home/rshoemake/git/xllm-demo/xllm_demo/core/registry.py", line 28, in components_registry
    datasets_registry.add(key=DATASET_KEY, item=AntropicDataset)
TypeError: Registry.add() got an unexpected keyword argument 'item'

I've checked the xllm project, and it doesn't have that as a parameter, so there seems to be a mismatch between the two.
I'm going to try to hack a solution for myself, but I wanted to check whether I was doing something wrong, or at least make sure I 'fix' it the right way.

I've changed the keyword 'item' to 'value' and it works now, except my 12 GB GPU isn't enough for the prepare step. :-(
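For reference, the change I made in xllm_demo/core/registry.py is roughly the following; it assumes the Registry.add in the installed xllm takes key and value, and the import paths here are my guess at the demo's layout, so check them against your checkout:

from xllm.datasets import datasets_registry  # assumed import location of the registry

from xllm_demo.core.constants import DATASET_KEY      # hypothetical path to the key constant
from xllm_demo.core.dataset import AntropicDataset    # hypothetical path to the dataset class

def components_registry():
    # The demo originally passed item=AntropicDataset, which the installed
    # Registry.add no longer accepts; value= matches the current signature.
    datasets_registry.add(key=DATASET_KEY, value=AntropicDataset)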

Thank you for pointing out the error with the item keyword; I have corrected it. Regarding the insufficient memory during the prepare step: it arises because, during this step, the model is downloaded and loaded into memory. I did this to simulate that, on the following step, the model would be loaded and everything should go smoothly. I now understand that this approach is suboptimal and redundant and can lead to problems like the one you experienced. Simply downloading the model is enough. Thank you for helping me realize this; I will change this behavior next week. Meanwhile, you may use the GeneralDataset from xllm.
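Until that change lands, a sketch of the download-only idea (not the exact change that will be made in xllm; the model id below is a placeholder) is simply:

from huggingface_hub import snapshot_download

# Fetch the model files into the local Hugging Face cache without
# instantiating the model, so the prepare step needs no GPU memory.
snapshot_download(repo_id="mistralai/Mistral-7B-v0.1")  # placeholder model id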

An example Colab can be found here:
https://colab.research.google.com/drive/1CNNB_HPhQ8g7piosdehqWlgA30xoLauP?usp=sharing
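If you go the GeneralDataset route, the usage is roughly as follows (based on the xllm README at the time of writing; verify against your installed version):

from xllm.datasets import GeneralDataset

# Build a training dataset directly from a list of strings, which skips the
# demo's AntropicDataset registration and its custom prepare logic entirely.
train_data = ["Hello!"] * 100  # toy data; replace with your own texts
train_dataset = GeneralDataset.from_list(data=train_data)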

I'm closing this issue, because the bug with the 'item' argument in the registry is solved and the remaining problem is about the xllm library.

xllm related issue: BobaZooba/xllm#3

Thank you so much for your prompt response! I'm glad that I could help in some small way. Keep up the great work!

Thank you, @rshoemake-launch

Feel free to contact me about anything related.