scikit-hep/root_numpy

Opening a ROOT file with a TTree and some histograms

lpernie opened this issue · 0 comments

Dear All,

When I try to load a ROOT file (which you can find here [1]), I get the error shown in [2].
I have never had problems before using the command below:
df = spark.read.format("org.dianahep.sparkroot").load("file.root")

One thing I noticed is that this file has a few histograms alongside the TTree:
KEY: TH1D autoPU;1 pileup
KEY: TH1F h_cutflow;1 h_cutflow
KEY: TH1F h_cutflow_DoubleMuon;1 h_cutflow_DoubleMuon
KEY: TH1F h_cutflow_DoubleEG;1 h_cutflow_DoubleEG
KEY: TH1F h_cutflow_MuonEG;1 h_cutflow_MuonEG
KEY: TH1F h_cutflow_weight;1 h_cutflow_weight
KEY: TTree Friends;2 Friend tree for Events
KEY: TTree Friends;1 Friend tree for Events

Could this be the cause of the error?

Cheers,
Luca

[1] https://www.dropbox.com/s/8yzbdvs4rbaiyf7/6214A145-5711-E811-997E-0CC47A78A42C_Friend.root?dl=0

[2]
Map(path -> /data/taohuang/HHNtuple_20180418_DYestimation/DYJetsToLL_M-10to50_TuneCUETP8M1_13TeV-madgraphMLM-pythia8/6214A145-5711-E811-997E-0CC47A78A42C_Friend.root)
Warnng: Generating dummy read for fIOBits
Traceback (most recent call last):
File "trainDY.py", line 32, in <module>
df = spark.read.format("org.dianahep.sparkroot").load("/data/taohuang/HHNtuple_20180418_DYestimation/DYJetsToLL_M-10to50_TuneCUETP8M1_13TeV-madgraphMLM-pythia8/6214A145-5711-E811-997E-0CC47A78A42C_Friend.root")
File "/home/demarley/anaconda2/lib/python2.7/site-packages/pyspark/sql/readwriter.py", line 166, in load
return self._df(self._jreader.load(path))
File "/home/demarley/anaconda2/lib/python2.7/site-packages/py4j/java_gateway.py", line 1160, in __call__
answer, self.gateway_client, self.target_id, self.name)
File "/home/demarley/anaconda2/lib/python2.7/site-packages/pyspark/sql/utils.py", line 63, in deco
return f(*a, **kw)
File "/home/demarley/anaconda2/lib/python2.7/site-packages/py4j/protocol.py", line 320, in get_return_value
format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o41.load.
: java.io.IOException: Cannot skip object with no length
at org.dianahep.root4j.core.RootInputStream.skipObject(RootInputStream.java:596)
at org.dianahep.root4j.core.RootHDFSInputStream.skipObject(RootHDFSInputStream.java:387)
at org.dianahep.root4j.proxy.ROOT.TIOFeatures.readMembers()
at org.dianahep.root4j.core.AbstractRootObject.read(AbstractRootObject.java:52)
at org.dianahep.root4j.core.RootInputStream.readObject(RootInputStream.java:466)
at org.dianahep.root4j.core.RootHDFSInputStream.readObject(RootHDFSInputStream.java:222)
at org.dianahep.root4j.proxy.TTree.readMembers()
at org.dianahep.root4j.core.AbstractRootObject.read(AbstractRootObject.java:52)
at org.dianahep.root4j.proxy.TKey.getObject(:57)
at org.dianahep.sparkroot.core.package$$anonfun$findTree$1.apply(ast.scala:1177)
at org.dianahep.sparkroot.core.package$$anonfun$findTree$1.apply(ast.scala:1166)
at scala.collection.immutable.Range.foreach(Range.scala:160)
at org.dianahep.sparkroot.core.package$.findTree(ast.scala:1166)
at org.dianahep.sparkroot.package$RootTableScan.<init>(sparkroot.scala:97)
at org.dianahep.sparkroot.DefaultSource.createRelation(sparkroot.scala:146)
at org.dianahep.sparkroot.DefaultSource.createRelation(sparkroot.scala:143)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:340)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:748)
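For reference, here is a minimal plain-Python sketch (using the key listing quoted above as literal data, since the file itself is external) of how a key scan like spark-root's `findTree` sees this file. Note from the stack trace that the exception is raised inside `TIOFeatures.readMembers` while reading the TTree itself, after the histogram keys, so the histograms may not be the direct culprit:

```python
# Keys reported by ROOT for the file in question (copied from the issue text).
# Each entry is (class name, key name).
keys = [
    ("TH1D", "autoPU"),
    ("TH1F", "h_cutflow"),
    ("TH1F", "h_cutflow_DoubleMuon"),
    ("TH1F", "h_cutflow_DoubleEG"),
    ("TH1F", "h_cutflow_MuonEG"),
    ("TH1F", "h_cutflow_weight"),
    ("TTree", "Friends"),  # stored with two cycles: Friends;2 and Friends;1
]

# A tree-finding scan walks every key until it hits a TTree; the histogram
# keys are passed over, and per the traceback the failure occurs while
# deserializing the TTree's TIOFeatures member (cf. the "Generating dummy
# read for fIOBits" warning), not while touching a histogram.
trees = [name for cls, name in keys if cls == "TTree"]
histograms = [name for cls, name in keys if cls.startswith("TH1")]

print(trees)            # ['Friends']
print(len(histograms))  # 6
```

So the histograms coexist with exactly one TTree name, `Friends`; whether their presence (rather than the `fIOBits`/TIOFeatures member of the TTree) triggers the `IOException` is the question at hand.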