[94X] size limits and crab submission
Dear All,
It seems a bit of stripping of old unused information might be necessary at some point. I am trying to launch a private microAOD production from standard CMS samples, but the submission invariably fails no matter which unused file I remove from my 9_4_9 external/data path. See, e.g., the latest failure:
ERROR 2019-03-12 19:26:51,800: Caught exception
Traceback (most recent call last):
File "/cvmfs/cms.cern.ch/crab3/slc6_amd64_gcc493/cms/crabclient/3.3.1903.patch1/bin/crab", line 160, in <module>
client()
File "/cvmfs/cms.cern.ch/crab3/slc6_amd64_gcc493/cms/crabclient/3.3.1903.patch1/bin/crab", line 148, in __call__
self.cmd()
File "/cvmfs/cms.cern.ch/crab3/slc6_amd64_gcc493/cms/crabclient/3.3.1903.patch1/lib/python2.7/site-packages/CRABClient/Commands/submit.py", line 102, in __call__
dummy_inputfiles, jobconfig = plugjobtype.run(filecacheurl)
File "/cvmfs/cms.cern.ch/crab3/slc6_amd64_gcc493/cms/crabclient/3.3.1903.patch1/lib/python2.7/site-packages/CRABClient/JobType/Analysis.py", line 167, in run
raise ClientException(msg)
ClientException: Impossible to upload the sandbox tarball.
Error message: Error: input tarball size 191 MB exceeds maximum allowed limit of 100 MB
sandbox content sorted by size[Bytes]:
[stripped files list]
On a fresh flashgg checkout, the tarball size is 232 MB (limit: 100 MB). Can it be that I am missing something obvious there?
Many thanks in advance!
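For reference, here is a small sketch I used to see which files dominate the working area; it assumes it is run from $CMSSW_BASE/src:

```python
# List the 20 largest files under the current directory, to see what is
# inflating the CRAB sandbox. Assumes it is run from $CMSSW_BASE/src.
import os

sizes = []
for dirpath, dirnames, filenames in os.walk("."):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            sizes.append((os.path.getsize(path), path))
        except OSError:
            pass  # skip broken symlinks

for size, path in sorted(sizes, reverse=True)[:20]:
    print("%10d  %s" % (size, path))
```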
Dear Laurent,
yes, it's a known issue, and in the development branch we have already removed large files that are no longer needed. The workaround we have been using in past years is to manually delete all the catalogues under MetaData/data/. This is usually enough.
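For example, something like this minimal sketch; it assumes the catalogues are the .json files under that tree, so adjust the pattern if your flashgg checkout is laid out differently:

```python
# Remove the per-campaign catalogue files under MetaData/data/ so they are
# not packed into the CRAB sandbox. Assumption: the catalogues are the
# .json files in that tree.
import fnmatch
import os

for dirpath, dirnames, filenames in os.walk("MetaData/data"):
    for name in fnmatch.filter(filenames, "*.json"):
        path = os.path.join(dirpath, name)
        print("removing %s" % path)
        os.remove(path)
```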
Best,
simone
Many thanks for the workaround, @simonepigazzini !
Addressed by PR #1095. The tarball size is still too large; large binary files will be moved to cms-data in the near future.