ValueError: need at least one array to concatenate
Closed · 1 comment
Bug summary
dpdata version: 0.2.21, dpgen version: 0.12.1, deepmd-kit version: 2.2.10 (local).
Hello, I encountered this error while using DPGEN. It occurred after the single-point energy calculations had completed and the data was sent to the Bohrium platform for potential training. The error message is as follows:
```
Traceback (most recent call last):
  File "/home/hanye/miniconda3/envs/deepmd/bin/dpgen", line 8, in <module>
    sys.exit(main())
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpgen/main.py", line 255, in main
    args.func(args)
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpgen/generator/run.py", line 5394, in gen_run
    run_iter(args.PARAM, args.MACHINE)
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpgen/generator/run.py", line 4722, in run_iter
    make_train(ii, jdata, mdata)
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpgen/generator/run.py", line 389, in make_train
    init_batch_size.append(detect_batch_size(ss, single_sys))
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpgen/generator/run.py", line 685, in detect_batch_size
    s = dpdata.LabeledSystem(system, fmt=format)
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpdata/system.py", line 202, in __init__
    self.from_fmt(
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpdata/system.py", line 239, in from_fmt
    return self.from_fmt_obj(load_format(fmt), file_name, **kwargs)
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpdata/system.py", line 1226, in from_fmt_obj
    data = fmtobj.from_labeled_system(file_name, **kwargs)
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpdata/plugins/deepmd.py", line 72, in from_labeled_system
    return dpdata.deepmd.comp.to_system_data(
  File "/home/hanye/miniconda3/envs/deepmd/lib/python3.10/site-packages/dpdata/deepmd/comp.py", line 46, in to_system_data
    data["cells"] = np.concatenate(all_cells, axis=0)
ValueError: need at least one array to concatenate
```
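For context, NumPy raises this error whenever `np.concatenate` receives an empty sequence; in dpdata's deepmd/npy reader this happens when no frame data could be collected for a system. A minimal sketch reproducing the underlying exception (the variable name `all_cells` mirrors the dpdata source above):

```python
import numpy as np

# np.concatenate requires at least one input array; an empty list
# reproduces the exact error seen in dpdata/deepmd/comp.py when a
# system directory yields no frames.
all_cells = []
try:
    np.concatenate(all_cells, axis=0)
except ValueError as exc:
    print(exc)  # need at least one array to concatenate
```

So the message usually indicates that one of the training data paths dpgen is reading contains no labeled frames.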
The machine.json file is as follows:

```json
"api_version": "1.0",
"deepmd_version": "2.1.5",
"train": {
    "command": "dp",
    "machine": {
        "batch_type": "Bohrium",
        "context_type": "BohriumContext",
        "local_root": "./",
        "remote_profile": {
            "email": "",
            "password": "",
            "program_id": ,
            "keep_backup": true,
            "input_data": {
                "log_file": "00/train.log",
                "grouped": true,
                "job_name": "",
                "disk_size": 100,
                "scass_type": "c12_m92_1 * NVIDIA V100",
                "checkpoint_files": ["00/checkpoint", "00*/model.ckpt*"],
                "checkpoint_time": 30,
                "platform": "ali",
                "job_type": "container",
                "image_address": "registry.dp.tech/dptech/deepmd-kit:2.1.5-cuda11.6",
                "on_demand": 0
            }
        }
    },
    "resources": {
        "number_node": 1,
        "cpu_per_node": 8,
        "gpu_per_node": 1,
        "queue_name": "V100_8_32",
        "group_size": 1,
        "custom_flags": [],
        "strategy": {"if_cuda_multi_devices": true},
        "para_deg": 3,
        "source_list": []
    }
},
```
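Since the traceback points at an empty training-data system rather than at machine.json itself, one way to narrow it down is to count the frames in each deepmd/npy system directory before dpgen reads it. The helper below is a hypothetical sketch (`count_frames` is not part of dpgen or dpdata); it assumes the standard deepmd/npy layout where each system directory contains `set.*` subdirectories whose `coord.npy` has the number of frames as its first axis:

```python
import glob
import os

import numpy as np


def count_frames(sys_dir):
    """Count frames stored in a deepmd/npy system directory.

    Assumes the layout sys_dir/set.*/coord.npy, where the first axis
    of coord.npy is the number of frames. A result of 0 means dpdata
    would have nothing to concatenate and would raise this ValueError.
    """
    n_frames = 0
    for set_dir in sorted(glob.glob(os.path.join(sys_dir, "set.*"))):
        coord = os.path.join(set_dir, "coord.npy")
        if os.path.isfile(coord):
            n_frames += np.load(coord).shape[0]
    return n_frames
```

Running this over every init-data and iteration-data path listed in param.json should reveal which system is empty.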
DeePMD-kit Version
deepmd-kit version: 2.2.10 (local), 2.1.5 (Bohrium)
Backend and its version
tensorflow v2.9.0
How did you download the software?
conda
Input Files, Running Commands, Error Log, etc.
Command:

```shell
nohup dpgen run param.json machine.json 1>log 2>err &
```
The error message and machine.json are the same as shown in the Bug summary above.
Steps to Reproduce
..
Further Information, Files, and Links
No response
The problem has been solved.