TsinghuaDatabaseGroup/DB-GPT

The protobuf package needs to be installed

Closed this issue · 1 comment

/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/bmtrain/synchronize.py:14: UserWarning: The torch.cuda.*DtypeTensor constructors are no longer recommended. It's best to use methods such as torch.tensor(data, dtype=*, device='cuda') to create tensors. (Triggered internally at ../torch/csrc/tensor/python_tensor.cpp:83.)
barrier = torch.cuda.FloatTensor([1])
/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/bmtrain/synchronize.py:15: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
nccl.allReduce(barrier.storage(), barrier.storage(), 'sum', config['comm'])
args.load is not None, start to load checkpoints /home/hw/YYG/D-Bot/DiagLlama/DiagLlama.pt
[INFO][2023-12-12 19:39:19][jeeves-hpc-gpu00][inference.py:69:1135371] - load model in 30.93s
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This is expected, and simply means that the legacy (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set legacy=False. This should only be set if you understand what it means, and thouroughly read the reason why this was added as explained in huggingface/transformers#24565
0%| | 0/1
Traceback (most recent call last):
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/asyncio/runners.py", line 44, in run
return loop.run_until_complete(main)
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
return future.result()
File "/home/hw/YYG/D-Bot/DB-GPT/main.py", line 14, in main
report, records = await multi_agents.run(args)
File "/home/hw/YYG/D-Bot/DB-GPT/multiagents/multiagents.py", line 65, in run
report, records = await self.environment.step(args)
File "/home/hw/YYG/D-Bot/DB-GPT/multiagents/environments/dba.py", line 172, in step
self.reporter.initialize_report()
File "/home/hw/YYG/D-Bot/DB-GPT/multiagents/agents/reporter.py", line 62, in initialize_report
anomaly_desc = self.llm.parse()
File "/home/hw/YYG/D-Bot/DB-GPT/multiagents/llms/diag_llama.py", line 83, in parse
output = llama_inference.inference(new_messages, max_in_len=self.args.max_in_len, max_length=self.args.max_length, beam_size=self.args.beam_size)
File "/home/hw/YYG/D-Bot/DB-GPT/diagllama/inference.py", line 222, in inference
self.tokenizer, self.model = setup_model(self.args)
File "/home/hw/YYG/D-Bot/DB-GPT/diagllama/inference.py", line 72, in setup_model
tokenizer = get_tokenizer(args)
File "/home/hw/YYG/D-Bot/DB-GPT/diagllama/inference.py", line 35, in get_tokenizer
tokenizer = AutoTokenizer.from_pretrained(args.vocab,
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 768, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2024, in from_pretrained
return cls._from_pretrained(
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2256, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama_fast.py", line 124, in __init__
super().__init__(
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 114, in __init__
fast_tokenizer = convert_slow_tokenizer(slow_tokenizer)
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/transformers/convert_slow_tokenizer.py", line 1344, in convert_slow_tokenizer
return converter_class(transformer_tokenizer).converted()
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/transformers/convert_slow_tokenizer.py", line 464, in __init__
model_pb2 = import_protobuf()
File "/home/hw/miniconda3/envs/D-Bot/lib/python3.10/site-packages/transformers/convert_slow_tokenizer.py", line 37, in import_protobuf
if version.parse(google.protobuf.__version__) < version.parse("4.0.0"):
AttributeError: '_jpype._JPackage' object has no attribute '__version__'
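The AttributeError suggests `google.protobuf` is not actually importable here: protobuf is not installed, and another package (apparently jpype, judging by `_jpype._JPackage`) has claimed the `google` namespace, so transformers' version check hits a stub object instead of the real module. A minimal preflight check along these lines (the helper name is my own, not part of the project) can surface the problem before inference starts; installing protobuf (`pip install protobuf`) resolves it.

```python
def protobuf_available() -> bool:
    """Return True only if google.protobuf imports with a real __version__.

    When protobuf is missing but another package (e.g. jpype) has installed
    an import hook for the `google` namespace, the import itself may appear
    to succeed while the __version__ lookup raises AttributeError, exactly
    as in the traceback above.
    """
    try:
        import google.protobuf
        return isinstance(getattr(google.protobuf, "__version__"), str)
    except (ImportError, AttributeError):
        return False

if not protobuf_available():
    print("google.protobuf is unusable -- try: pip install protobuf")
```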

Thank you for your contribution to our project! We've added it to our requirements.txt.