Cannot use JOIN
Hi,
Thanks for simple-hive. It's been great for quickly getting Hive running on CF infrastructure that does not support Docker.
We're running into an issue where any query that uses a JOIN fails. A minimal reproduction:
set hive.execution.engine=mr;
SELECT * FROM A JOIN B ON B.id = A.b_id;
2017-01-23T12:25:08.67+0000 [APP/0] OUT 2017-01-23 12:25:08.668 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Could not validate cookie sent, will try to generate a new cookie
2017-01-23T12:25:08.75+0000 [APP/0] OUT 2017-01-23 12:25:08.754 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Cookie added for clientUserName admin
2017-01-23T12:25:08.75+0000 [APP/0] OUT 2017-01-23 12:25:08.755 INFO 20 --- [Pool: Thread-35] o.a.h.service.cli.operation.Operation : Putting temp output to file /home/vcap/tmp/hive8156245218269664233/localScratch/998589d1-425e-4e06-9
3e0-c332351143bb928185943743737458.pipeout
2017-01-23T12:25:08.77+0000 [APP/0] OUT 2017-01-23 12:25:08.775 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Could not validate cookie sent, will try to generate a new cookie
2017-01-23T12:25:08.86+0000 [APP/0] OUT 2017-01-23 12:25:08.860 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Cookie added for clientUserName admin
2017-01-23T12:25:08.88+0000 [APP/0] OUT 2017-01-23 12:25:08.885 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Could not validate cookie sent, will try to generate a new cookie
2017-01-23T12:25:08.97+0000 [APP/0] OUT 2017-01-23 12:25:08.974 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Cookie added for clientUserName admin
2017-01-23T12:25:14.82+0000 [APP/0] OUT 2017-01-23 12:25:14.820 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Could not validate cookie sent, will try to generate a new cookie
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.905 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Cookie added for clientUserName admin
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.906 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.907 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.907 INFO 20 --- [Pool: Thread-35] hive.ql.parse.ParseDriver : Parsing command: SELECT p.objectid FROM opportunitystaff p JOIN opportunit76630dff c ON c.objectid=p.type_id
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.907 INFO 20 --- [Pool: Thread-35] hive.ql.parse.ParseDriver : Parse Completed
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.907 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=parse start=1485174314907 end=1485174314907 duration=0 from=org.apache.hadoop.hive.
ql.Driver>
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.907 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.908 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.parse.SemanticAnalyzer : Starting Semantic Analysis
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.908 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.parse.SemanticAnalyzer : Completed phase 1 of Semantic Analysis
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.908 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.parse.SemanticAnalyzer : Get metadata for source tables
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.908 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.metastore.HiveMetaStore : 17: get_table : db=b8e90977_fcff_4ab8_aae5_b1c7c74d34a6 tbl=opportunitystaff
2017-01-23T12:25:14.90+0000 [APP/0] OUT 2017-01-23 12:25:14.908 INFO 20 --- [Pool: Thread-35] o.a.h.h.metastore.HiveMetaStore.audit : ugi=admin ip=unknown-ip-addr cmd=get_table : db=b8e90977_fcff_4ab8_aae5_b1c7c74d34a6 tbl
=opportunitystaff
2017-01-23T12:25:14.92+0000 [APP/0] OUT 2017-01-23 12:25:14.922 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.metastore.HiveMetaStore : 17: get_table : db=b8e90977_fcff_4ab8_aae5_b1c7c74d34a6 tbl=opportunit76630dff
2017-01-23T12:25:14.92+0000 [APP/0] OUT 2017-01-23 12:25:14.922 INFO 20 --- [Pool: Thread-35] o.a.h.h.metastore.HiveMetaStore.audit : ugi=admin ip=unknown-ip-addr cmd=get_table : db=b8e90977_fcff_4ab8_aae5_b1c7c74d34a6 tbl
=opportunit76630dff
2017-01-23T12:25:14.93+0000 [APP/0] OUT 2017-01-23 12:25:14.930 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.parse.SemanticAnalyzer : Get metadata for subqueries
2017-01-23T12:25:14.93+0000 [APP/0] OUT 2017-01-23 12:25:14.931 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.parse.SemanticAnalyzer : Get metadata for destination tables
2017-01-23T12:25:14.93+0000 [APP/0] OUT 2017-01-23 12:25:14.934 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.parse.SemanticAnalyzer : Completed getting MetaData in Semantic Analysis
2017-01-23T12:25:14.93+0000 [APP/0] OUT 2017-01-23 12:25:14.936 INFO 20 --- [Pool: Thread-35] org.apache.hadoop.hive.common.FileUtils : Creating directory if it doesn't exist: file:/home/vcap/tmp/hive8156245218269664233/localScratch/998
589d1-425e-4e06-93e0-c332351143bb/hive_2017-01-23_12-25-14_906_3555154913006934385-3/-mr-10000/.hive-staging_hive_2017-01-23_12-25-14_906_3555154913006934385-3
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.961 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.parse.SemanticAnalyzer : Set stats collection dir : file:/home/vcap/tmp/hive8156245218269664233/localScratch/998589d1-425e-4e
06-93e0-c332351143bb/hive_2017-01-23_12-25-14_906_3555154913006934385-3/-mr-10000/.hive-staging_hive_2017-01-23_12-25-14_906_3555154913006934385-3/-ext-10002
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.964 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for FS(290)
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.964 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for SEL(289)
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.964 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for JOIN(288)
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.964 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for RS(285)
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.964 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for FIL(284)
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of FIL For Alias : p
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : type_id is not null
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for TS(282)
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of TS For Alias : p
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : type_id is not null
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for RS(287)
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for FIL(286)
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of FIL For Alias : c
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : objectid is not null
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.965 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for TS(283)
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.966 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of TS For Alias : c
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.966 INFO 20 --- [Pool: Thread-35] o.a.hadoop.hive.ql.ppd.OpProcFactory : objectid is not null
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.967 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=partition-retrieving from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.967 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=partition-retrieving start=1485174314966 end=1485174314967 duration=1 from=org.apac
he.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.967 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=partition-retrieving from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.967 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=partition-retrieving start=1485174314967 end=1485174314967 duration=0 from=org.apac
he.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.967 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.ColumnPrunerProcFactory : JOIN 288 oldExprs: {0=[Column[VALUE._col0], Column[VALUE._col1], Column[VALUE._col2], Column[VALUE._
col3], Column[VALUE._col4], Column[VALUE._col5], Column[VALUE._col6], Column[VALUE._col7], Column[VALUE._col8], Column[VALUE._col9], Column[VALUE._col10], Column[VALUE._col11], Column[VALUE._col12], Column[VALUE._col13], Column[VALUE._col14],
Column[KEY.reducesinkkey0], Column[VALUE._col15], Column[VALUE._col16], Column[VALUE._col17], Column[VALUE._col18], Column[VALUE._col19], Column[VALUE._col20], Column[VALUE._col21]], 1=[Column[KEY.reducesinkkey0], Column[VALUE._col0], Column[V
ALUE._col1], Column[VALUE._col2], Column[VALUE._col3], Column[VALUE._col4], Column[VALUE._col5], Column[VALUE._col6], Column[VALUE._col7], Column[VALUE._col8], Column[VALUE._col9], Column[VALUE._col10], Column[VALUE._col11], Column[VALUE._col1
2], Column[VALUE._col13], Column[VALUE._col14], Column[VALUE._col15], Column[VALUE._col16]]}
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.967 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.ColumnPrunerProcFactory : JOIN 288 newExprs: {0=[Column[VALUE._col0]], 1=[]}
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.967 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.ColumnPrunerProcFactory : RS 285 oldColExprMap: {VALUE._col20=Column[INPUT__FILE__NAME], VALUE._col10=Column[shortdescription]
, VALUE._col21=Column[ROW__ID], VALUE._col11=Column[opportunity_id], VALUE._col12=Column[modifiedbyuser_id], KEY.reducesinkkey0=Column[type_id], VALUE._col2=Column[modifiedutc], VALUE._col3=Column[createdutc], VALUE._col4=Column[isdeleted], VA
LUE._col5=Column[display], VALUE._col0=Column[objectid], VALUE._col1=Column[objectclass], VALUE._col13=Column[originalsystem_id], VALUE._col14=Column[createdbyuser_id], VALUE._col15=Column[staffofficer_id], VALUE._col16=Column[contributid6c8f3
27], VALUE._col6=Column[display_u], VALUE._col17=Column[team_id], VALUE._col7=Column[sendnotification], VALUE._col18=Column[contributic200615c], VALUE._col8=Column[acl], VALUE._col19=Column[BLOCK__OFFSET__INSIDE__FILE], VALUE._col9=Column[cont
ributi7232ff8d]}
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.968 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.ColumnPrunerProcFactory : RS 285 newColExprMap: {KEY.reducesinkkey0=Column[type_id], VALUE._col0=Column[objectid]}
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.968 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.ColumnPrunerProcFactory : RS 287 oldColExprMap: {VALUE._col10=Column[rank], VALUE._col11=Column[modifiedbyuser_id], VALUE._col
12=Column[originalsystem_id], KEY.reducesinkkey0=Column[objectid], VALUE._col2=Column[createdutc], VALUE._col3=Column[isdeleted], VALUE._col4=Column[display], VALUE._col5=Column[display_u], VALUE._col0=Column[objectclass], VALUE._col1=Column[m
odifiedutc], VALUE._col13=Column[createdbyuser_id], VALUE._col14=Column[BLOCK__OFFSET__INSIDE__FILE], VALUE._col15=Column[INPUT__FILE__NAME], VALUE._col16=Column[ROW__ID], VALUE._col6=Column[sendnotification], VALUE._col7=Column[acl], VALUE._c
ol8=Column[name], VALUE._col9=Column[name_u]}
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.968 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.ColumnPrunerProcFactory : RS 287 newColExprMap: {KEY.reducesinkkey0=Column[objectid]}
2017-01-23T12:25:14.96+0000 [APP/0] OUT 2017-01-23 12:25:14.969 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=getInputSummary from=org.apache.hadoop.hive.ql.exec.Utilities>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.970 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.exec.Utilities : Cache Content Summary for file:/home/vcap/tmp/hive8156245218269664233/warehouse/b8e90977_fcff_4ab8_a
ae5_b1c7c74d34a6.db/opportunit76630dff length: 0 file count: 0 directory count: 1
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.970 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.exec.Utilities : Cache Content Summary for file:/home/vcap/tmp/hive8156245218269664233/warehouse/b8e90977_fcff_4ab8_a
ae5_b1c7c74d34a6.db/opportunitystaff length: 0 file count: 0 directory count: 1
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.970 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=getInputSummary start=1485174314969 end=1485174314970 duration=1 from=org.apache.ha
doop.hive.ql.exec.Utilities>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.971 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.p.LocalMapJoinProcFactory : Setting max memory usage to 0.9 for table sink not followed by group by
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.971 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.p.NullScanTaskDispatcher : Looking for table scans where optimization is applicable
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.971 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.p.NullScanTaskDispatcher : Found 0 null table scans
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.972 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.p.NullScanTaskDispatcher : Looking for table scans where optimization is applicable
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.972 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.p.NullScanTaskDispatcher : Found 0 null table scans
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.972 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.p.NullScanTaskDispatcher : Looking for table scans where optimization is applicable
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.972 INFO 20 --- [Pool: Thread-35] o.a.h.h.q.o.p.NullScanTaskDispatcher : Found 0 null table scans
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.972 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.parse.SemanticAnalyzer : Completed plan generation
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.972 INFO 20 --- [Pool: Thread-35] org.apache.hadoop.hive.ql.Driver : Semantic Analysis Completed
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.972 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=semanticAnalyze start=1485174314907 end=1485174314972 duration=65 from=org.apache.h
adoop.hive.ql.Driver>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.973 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.exec.ListSinkOperator : Initializing operator OP[296]
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.973 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.exec.ListSinkOperator : Initialization Done 296 OP
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.973 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.exec.ListSinkOperator : Operator 296 OP initialized
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.973 INFO 20 --- [Pool: Thread-35] org.apache.hadoop.hive.ql.Driver : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:p.objectid, type:string, comment:null)]
, properties:null)
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.973 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=compile start=1485174314906 end=1485174314973 duration=67 from=org.apache.hadoop.hi
ve.ql.Driver>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.974 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.974 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.974 INFO 20 --- [ool: Thread-631] org.apache.hadoop.hive.ql.Driver : Concurrency mode is disabled, not creating a lock manager
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.974 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.974 INFO 20 --- [ool: Thread-631] org.apache.hadoop.hive.ql.Driver : Starting command(queryId=vcap_20170123122514_49622150-113e-485d-a0a0-8e99bd39f8d2): SELECT p.objectid FROM opportunitystaff p JOIN opportunit76630dff c ON c.objectid=p.type_id
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.975 INFO 20 --- [ool: Thread-631] org.apache.hadoop.hive.ql.Driver : Query ID = vcap_20170123122514_49622150-113e-485d-a0a0-8e99bd39f8d2
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.975 INFO 20 --- [ool: Thread-631] org.apache.hadoop.hive.ql.Driver : Total jobs = 1
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.975 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=TimeToSubmit start=1485174314974 end=1485174314975 duration=1 from=org.apache.hadoo
p.hive.ql.Driver>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.975 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.975 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=task.MAPREDLOCAL.Stage-4 from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:14.97+0000 [APP/0] OUT 2017-01-23 12:25:14.977 INFO 20 --- [ool: Thread-631] org.apache.hadoop.hive.ql.Driver : Starting task [Stage-4:MAPREDLOCAL] in serial mode
2017-01-23T12:25:14.98+0000 [APP/0] OUT 2017-01-23 12:25:14.981 INFO 20 --- [ool: Thread-631] o.a.h.hive.ql.exec.mr.MapredLocalTask : Generating plan file file:/home/vcap/tmp/hive8156245218269664233/localScratch/998589d1-425e-4e06-93e
0-c332351143bb/hive_2017-01-23_12-25-14_906_3555154913006934385-22/-local-10005/plan.xml
2017-01-23T12:25:14.98+0000 [APP/0] OUT 2017-01-23 12:25:14.984 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=serializePlan from=org.apache.hadoop.hive.ql.exec.Utilities>
2017-01-23T12:25:14.98+0000 [APP/0] OUT 2017-01-23 12:25:14.984 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.exec.Utilities : Serializing MapredLocalWork via kryo
2017-01-23T12:25:14.99+0000 [APP/0] OUT 2017-01-23 12:25:14.990 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=serializePlan start=1485174314984 end=1485174314990 duration=6 from=org.apache.hado
op.hive.ql.exec.Utilities>
2017-01-23T12:25:15.00+0000 [APP/0] OUT 2017-01-23 12:25:15.004 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Could not validate cookie sent, will try to generate a new cookie
2017-01-23T12:25:15.09+0000 [APP/0] OUT 2017-01-23 12:25:15.092 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Cookie added for clientUserName admin
2017-01-23T12:25:15.09+0000 [APP/0] OUT 2017-01-23 12:25:15.095 INFO 20 --- [ool: Thread-631] o.a.h.hive.ql.exec.mr.MapredLocalTask : Executing: /dev/null-hadoop-bin jar /home/vcap/app/BOOT-INF/lib/hive-common-1.2.1.jar org.apache.hadoop.hive.ql.exec.mr.ExecDriver -localtask -plan file:/home/vcap/tmp/hive8156245218269664233/localScratch/998589d1-425e-4e06-93e0-c332351143bb/hive_2017-01-23_12-25-14_906_3555154913006934385-22/-local-10005/plan.xml -jobconffile file:/home/vcap/tmp/hive8156245218269664233/localScratch/998589d1-425e-4e06-93e0-c332351143bb/hive_2017-01-23_12-25-14_906_3555154913006934385-22/-local-10006/jobconf.xml
2017-01-23T12:25:15.09+0000 [APP/0] OUT 2017-01-23 12:25:15.097 ERROR 20 --- [ool: Thread-631] o.a.h.hive.ql.exec.mr.MapredLocalTask : Exception: java.io.IOException: Cannot run program "/dev/null-hadoop-bin" (in directory "/home/vcap/app"): error=2, No such file or directory
2017-01-23T12:25:15.09+0000 [APP/0] OUT java.io.IOException: Cannot run program "/dev/null-hadoop-bin" (in directory "/home/vcap/app"): error=2, No such file or directory
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048) ~[na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.lang.Runtime.exec(Runtime.java:620) ~[na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.lang.Runtime.exec(Runtime.java:450) ~[na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask.executeInChildVM(MapredLocalTask.java:289) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask.execute(MapredLocalTask.java:138) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1653) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1412) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1054) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) [hadoop-common-2.6.0.jar!/:na]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT Caused by: java.io.IOException: error=2, No such file or directory
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.lang.UNIXProcess.forkAndExec(Native Method) ~[na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.lang.UNIXProcess.<init>(UNIXProcess.java:248) ~[na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.lang.ProcessImpl.start(ProcessImpl.java:134) ~[na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029) ~[na:1.8.0_91]
2017-01-23T12:25:15.09+0000 [APP/0] OUT ... 23 common frames omitted
2017-01-23T12:25:15.09+0000 [APP/0] OUT 2017-01-23 12:25:15.098 ERROR 20 --- [ool: Thread-631] org.apache.hadoop.hive.ql.Driver : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask
2017-01-23T12:25:15.10+0000 [APP/0] OUT 2017-01-23 12:25:15.098 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=Driver.execute start=1485174314974 end=1485174315098 duration=124 from=org.apache.h
adoop.hive.ql.Driver>
2017-01-23T12:25:15.10+0000 [APP/0] OUT 2017-01-23 12:25:15.098 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:15.10+0000 [APP/0] OUT 2017-01-23 12:25:15.098 INFO 20 --- [ool: Thread-631] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=releaseLocks start=1485174315098 end=1485174315098 duration=0 from=org.apache.hadoo
p.hive.ql.Driver>
2017-01-23T12:25:15.10+0000 [APP/0] OUT 2017-01-23 12:25:15.098 ERROR 20 --- [ool: Thread-631] o.a.h.service.cli.operation.Operation : Error running hive query:
2017-01-23T12:25:15.10+0000 [APP/0] OUT org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapredLocalTask
2017-01-23T12:25:15.10+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:315) ~[hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:156) ~[hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71) ~[hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206) ~[hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_91]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_91]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) [hadoop-common-2.6.0.jar!/:na]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_91]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_91]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_91]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_91]
2017-01-23T12:25:15.10+0000 [APP/0] OUT at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]
2017-01-23T12:25:15.12+0000 [APP/0] OUT 2017-01-23 12:25:15.128 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Could not validate cookie sent, will try to generate a new cookie
2017-01-23T12:25:15.21+0000 [APP/0] OUT 2017-01-23 12:25:15.216 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Cookie added for clientUserName admin
2017-01-23T12:25:15.21+0000 [APP/0] OUT 2017-01-23 12:25:15.217 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.exec.ListSinkOperator : 296 finished. closing...
2017-01-23T12:25:15.21+0000 [APP/0] OUT 2017-01-23 12:25:15.217 INFO 20 --- [Pool: Thread-35] o.a.h.hive.ql.exec.ListSinkOperator : 296 Close done
2017-01-23T12:25:15.21+0000 [APP/0] OUT 2017-01-23 12:25:15.219 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:25:15.21+0000 [APP/0] OUT 2017-01-23 12:25:15.219 INFO 20 --- [Pool: Thread-35] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=releaseLocks start=1485174315219 end=1485174315219 duration=0 from=org.apache.hadoo
p.hive.ql.Driver>
This seems to be because the hadoop binary cannot be found: Hive spawns a child process to run the local map-join task, and the binary path it executes is configured as /dev/null. We temporarily changed the path to /dev/null-hadoop-bin to confirm that this is indeed what gets executed (see the "Executing:" line in the log above).
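A possible workaround we have not verified on this deployment: the failing child process is only launched for the MAPREDLOCAL stage (the local hash-table build of an auto-converted map join), so disabling that conversion, or keeping the local task in-process, might avoid exec'ing the hadoop binary entirely:

```sql
-- Untested suggestions, assuming Hive 1.2.1 defaults:
-- 1) Skip the map-join conversion so no MAPREDLOCAL stage is planned
--    (the join falls back to a plain reduce-side join).
set hive.auto.convert.join=false;
-- 2) Alternatively, run the local task inside the HiveServer2 JVM
--    instead of spawning `hadoop jar ...` in a child VM.
set hive.exec.submit.local.task.via.child=false;
SELECT * FROM A JOIN B ON B.id = A.b_id;
```

Either setting should prevent Hive from reaching MapredLocalTask.executeInChildVM, which is where the ProcessBuilder call in the stack trace originates.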
Switching to the Tez execution engine, we get a slightly different error (I suspect the missing hadoop binary could be hard to overcome, but the Tez error may be easier to solve).
set hive.execution.engine=tez;
SELECT * FROM A JOIN B ON B.id = A.b_id;
2017-01-23T12:23:44.34+0000 [APP/0] OUT 2017-01-23 12:23:44.340 INFO 20 --- [Pool: Thread-36] o.a.h.s.cli.thrift.ThriftHttpServlet : Could not validate cookie sent, will try to generate a new cookie
2017-01-23T12:23:44.42+0000 [APP/0] OUT 2017-01-23 12:23:44.428 INFO 20 --- [Pool: Thread-36] o.a.h.s.cli.thrift.ThriftHttpServlet : Cookie added for clientUserName admin
2017-01-23T12:23:44.42+0000 [APP/0] OUT 2017-01-23 12:23:44.429 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.430 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.430 INFO 20 --- [Pool: Thread-36] hive.ql.parse.ParseDriver : Parsing command: SELECT p.objectid FROM opportunitystaff p JOIN opportunit76630dff c ON c.objectid=p.type_id
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.430 INFO 20 --- [Pool: Thread-36] hive.ql.parse.ParseDriver : Parse Completed
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.430 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=parse start=1485174224430 end=1485174224430 duration=0 from=org.apache.hadoop.hive.
ql.Driver>
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.430 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.430 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.parse.SemanticAnalyzer : Starting Semantic Analysis
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.430 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.parse.SemanticAnalyzer : Completed phase 1 of Semantic Analysis
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.430 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.parse.SemanticAnalyzer : Get metadata for source tables
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.431 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.metastore.HiveMetaStore : 4: get_table : db=b8e90977_fcff_4ab8_aae5_b1c7c74d34a6 tbl=opportunitystaff
2017-01-23T12:23:44.43+0000 [APP/0] OUT 2017-01-23 12:23:44.431 INFO 20 --- [Pool: Thread-36] o.a.h.h.metastore.HiveMetaStore.audit : ugi=admin ip=unknown-ip-addr cmd=get_table : db=b8e90977_fcff_4ab8_aae5_b1c7c74d34a6 tbl
=opportunitystaff
2017-01-23T12:23:44.44+0000 [APP/0] OUT 2017-01-23 12:23:44.440 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.metastore.HiveMetaStore : 4: get_table : db=b8e90977_fcff_4ab8_aae5_b1c7c74d34a6 tbl=opportunit76630dff
2017-01-23T12:23:44.44+0000 [APP/0] OUT 2017-01-23 12:23:44.440 INFO 20 --- [Pool: Thread-36] o.a.h.h.metastore.HiveMetaStore.audit : ugi=admin ip=unknown-ip-addr cmd=get_table : db=b8e90977_fcff_4ab8_aae5_b1c7c74d34a6 tbl
=opportunit76630dff
2017-01-23T12:23:44.44+0000 [APP/0] OUT 2017-01-23 12:23:44.448 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.parse.SemanticAnalyzer : Get metadata for subqueries
2017-01-23T12:23:44.44+0000 [APP/0] OUT 2017-01-23 12:23:44.448 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.parse.SemanticAnalyzer : Get metadata for destination tables
2017-01-23T12:23:44.45+0000 [APP/0] OUT 2017-01-23 12:23:44.452 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.parse.SemanticAnalyzer : Completed getting MetaData in Semantic Analysis
2017-01-23T12:23:44.45+0000 [APP/0] OUT 2017-01-23 12:23:44.456 INFO 20 --- [Pool: Thread-36] org.apache.hadoop.hive.common.FileUtils : Creating directory if it doesn't exist: file:/home/vcap/tmp/hive8156245218269664233/localScratch/998
589d1-425e-4e06-93e0-c332351143bb/hive_2017-01-23_12-23-44_429_3403078215150409173-1/-mr-10000/.hive-staging_hive_2017-01-23_12-23-44_429_3403078215150409173-1
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.481 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.parse.SemanticAnalyzer : Set stats collection dir : file:/home/vcap/tmp/hive8156245218269664233/localScratch/998589d1-425e-4e
06-93e0-c332351143bb/hive_2017-01-23_12-23-44_429_3403078215150409173-1/-mr-10000/.hive-staging_hive_2017-01-23_12-23-44_429_3403078215150409173-1/-ext-10002
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.485 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for FS(274)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.485 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for SEL(273)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.485 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for JOIN(272)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.485 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for RS(269)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.485 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for FIL(275)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.485 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of FIL For Alias : p
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.485 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : (type_id) IN (RS[271])
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.485 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for FIL(268)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.486 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of FIL For Alias : p
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.486 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : type_id is not null
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.486 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of FIL For Alias : p
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.486 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : (type_id) IN (RS[271])
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.486 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for TS(266)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.486 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of TS For Alias : p
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.486 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : type_id is not null
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.487 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : (type_id) IN (RS[271])
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.487 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for RS(271)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.487 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for FIL(276)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.487 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of FIL For Alias : c
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.487 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : (objectid) IN (RS[269])
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.487 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for FIL(270)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.488 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of FIL For Alias : c
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.488 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : objectid is not null
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.488 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of FIL For Alias : c
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.489 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : (objectid) IN (RS[269])
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.489 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Processing for TS(267)
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.489 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : Pushdown Predicates of TS For Alias : c
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.489 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : objectid is not null
2017-01-23T12:23:44.48+0000 [APP/0] OUT 2017-01-23 12:23:44.489 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.ppd.OpProcFactory : (objectid) IN (RS[269])
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.490 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=partition-retrieving from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.490 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=partition-retrieving start=1485174224490 end=1485174224490 duration=0 from=org.apac
he.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.490 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=partition-retrieving from=org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.490 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=partition-retrieving start=1485174224490 end=1485174224490 duration=0 from=org.apac
he.hadoop.hive.ql.optimizer.ppr.PartitionPruner>
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.491 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.ColumnPrunerProcFactory : JOIN 272 oldExprs: {0=[Column[VALUE._col0], Column[VALUE._col1], Column[VALUE._col2], Column[VALUE._
col3], Column[VALUE._col4], Column[VALUE._col5], Column[VALUE._col6], Column[VALUE._col7], Column[VALUE._col8], Column[VALUE._col9], Column[VALUE._col10], Column[VALUE._col11], Column[VALUE._col12], Column[VALUE._col13], Column[VALUE._col14],
Column[KEY.reducesinkkey0], Column[VALUE._col15], Column[VALUE._col16], Column[VALUE._col17], Column[VALUE._col18], Column[VALUE._col19], Column[VALUE._col20], Column[VALUE._col21]], 1=[Column[KEY.reducesinkkey0], Column[VALUE._col0], Column[V
ALUE._col1], Column[VALUE._col2], Column[VALUE._col3], Column[VALUE._col4], Column[VALUE._col5], Column[VALUE._col6], Column[VALUE._col7], Column[VALUE._col8], Column[VALUE._col9], Column[VALUE._col10], Column[VALUE._col11], Column[VALUE._col1
2], Column[VALUE._col13], Column[VALUE._col14], Column[VALUE._col15], Column[VALUE._col16]]}
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.491 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.ColumnPrunerProcFactory : JOIN 272 newExprs: {0=[Column[VALUE._col0]], 1=[]}
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.491 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.ColumnPrunerProcFactory : RS 269 oldColExprMap: {VALUE._col20=Column[INPUT__FILE__NAME], VALUE._col10=Column[shortdescription]
, VALUE._col21=Column[ROW__ID], VALUE._col11=Column[opportunity_id], VALUE._col12=Column[modifiedbyuser_id], KEY.reducesinkkey0=Column[type_id], VALUE._col2=Column[modifiedutc], VALUE._col3=Column[createdutc], VALUE._col4=Column[isdeleted], VA
LUE._col5=Column[display], VALUE._col0=Column[objectid], VALUE._col1=Column[objectclass], VALUE._col13=Column[originalsystem_id], VALUE._col14=Column[createdbyuser_id], VALUE._col15=Column[staffofficer_id], VALUE._col16=Column[contributid6c8f3
27], VALUE._col6=Column[display_u], VALUE._col17=Column[team_id], VALUE._col7=Column[sendnotification], VALUE._col18=Column[contributic200615c], VALUE._col8=Column[acl], VALUE._col19=Column[BLOCK__OFFSET__INSIDE__FILE], VALUE._col9=Column[cont
ributi7232ff8d]}
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.491 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.ColumnPrunerProcFactory : RS 269 newColExprMap: {KEY.reducesinkkey0=Column[type_id], VALUE._col0=Column[objectid]}
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.491 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.ColumnPrunerProcFactory : RS 271 oldColExprMap: {VALUE._col10=Column[rank], VALUE._col11=Column[modifiedbyuser_id], VALUE._col
12=Column[originalsystem_id], KEY.reducesinkkey0=Column[objectid], VALUE._col2=Column[createdutc], VALUE._col3=Column[isdeleted], VALUE._col4=Column[display], VALUE._col5=Column[display_u], VALUE._col0=Column[objectclass], VALUE._col1=Column[m
odifiedutc], VALUE._col13=Column[createdbyuser_id], VALUE._col14=Column[BLOCK__OFFSET__INSIDE__FILE], VALUE._col15=Column[INPUT__FILE__NAME], VALUE._col16=Column[ROW__ID], VALUE._col6=Column[sendnotification], VALUE._col7=Column[acl], VALUE._c
ol8=Column[name], VALUE._col9=Column[name_u]}
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.491 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.ColumnPrunerProcFactory : RS 271 newColExprMap: {KEY.reducesinkkey0=Column[objectid]}
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.494 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.SetReducerParallelism : Set parallelism for reduce sink RS[269] to: 1
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.494 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.SetReducerParallelism : Set parallelism for reduce sink RS[271] to: 1
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.494 INFO 20 --- [Pool: Thread-36] o.a.h.h.ql.optimizer.ConvertJoinMapJoin : Estimated number of buckets 1
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.494 INFO 20 --- [Pool: Thread-36] o.a.h.h.ql.optimizer.ConvertJoinMapJoin : Convert to non-bucketed map join
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.495 INFO 20 --- [Pool: Thread-36] o.a.hadoop.hive.ql.parse.TezCompiler : Cycle free: true
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.496 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.ReduceSinkMapJoinProc : Mapjoin MAPJOIN[279], pos: 1 --> Map 2 (1 keys estimated from 1 rows, 1 buckets)
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.496 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.p.NullScanTaskDispatcher : Looking for table scans where optimization is applicable
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.496 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.p.NullScanTaskDispatcher : Found 0 null table scans
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.496 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.p.NullScanTaskDispatcher : Looking for table scans where optimization is applicable
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.496 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.p.NullScanTaskDispatcher : Found 0 null table scans
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.496 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.p.NullScanTaskDispatcher : Looking for table scans where optimization is applicable
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.496 INFO 20 --- [Pool: Thread-36] o.a.h.h.q.o.p.NullScanTaskDispatcher : Found 0 null table scans
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.497 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.parse.SemanticAnalyzer : Completed plan generation
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.497 INFO 20 --- [Pool: Thread-36] org.apache.hadoop.hive.ql.Driver : Semantic Analysis Completed
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.497 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=semanticAnalyze start=1485174224430 end=1485174224497 duration=67 from=org.apache.h
adoop.hive.ql.Driver>
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.499 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.exec.ListSinkOperator : Initializing operator OP[281]
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.499 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.exec.ListSinkOperator : Initialization Done 281 OP
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.499 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.exec.ListSinkOperator : Operator 281 OP initialized
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.499 INFO 20 --- [Pool: Thread-36] org.apache.hadoop.hive.ql.Driver : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:p.objectid, type:string, comment:null)]
, properties:null)
2017-01-23T12:23:44.49+0000 [APP/0] OUT 2017-01-23 12:23:44.499 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=compile start=1485174224429 end=1485174224499 duration=70 from=org.apache.hadoop.hi
ve.ql.Driver>
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] org.apache.hadoop.hive.ql.Driver : Concurrency mode is disabled, not creating a lock manager
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] org.apache.hadoop.hive.ql.Driver : Starting command(queryId=vcap_20170123122344_35756415-35ef-4ce9-8c5c-2031f78c6d79): SELECT p.objecti
d FROM opportunitystaff p JOIN opportunit76630dff c ON c.objectid=p.type_id
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] org.apache.hadoop.hive.ql.Driver : Query ID = vcap_20170123122344_35756415-35ef-4ce9-8c5c-2031f78c6d79
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] org.apache.hadoop.hive.ql.Driver : Total jobs = 1
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=TimeToSubmit start=1485174224500 end=1485174224500 duration=0 from=org.apache.hadoo
p.hive.ql.Driver>
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.500 INFO 20 --- [ool: Thread-618] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=task.TEZ.Stage-1 from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.501 INFO 20 --- [ool: Thread-618] org.apache.hadoop.hive.ql.Driver : Launching Job 1 out of 1
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.501 INFO 20 --- [ool: Thread-618] org.apache.hadoop.hive.ql.Driver : Starting task [Stage-1:MAPRED] in serial mode
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.501 INFO 20 --- [ool: Thread-618] o.a.h.h.q.e.tez.TezSessionPoolManager : The current user: admin, session user: admin
2017-01-23T12:23:44.50+0000 [APP/0] OUT 2017-01-23 12:23:44.501 INFO 20 --- [ool: Thread-618] o.a.h.h.q.e.tez.TezSessionPoolManager : Current queue name is null incoming queue name is null
2017-01-23T12:23:44.52+0000 [APP/0] OUT 2017-01-23 12:23:44.520 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Could not validate cookie sent, will try to generate a new cookie
2017-01-23T12:23:44.55+0000 [APP/0] OUT 2017-01-23 12:23:44.556 INFO 20 --- [ool: Thread-618] org.apache.hadoop.hive.ql.exec.Task : Tez session hasn't been created yet. Opening session
2017-01-23T12:23:44.55+0000 [APP/0] OUT 2017-01-23 12:23:44.556 INFO 20 --- [ool: Thread-618] o.a.h.hive.ql.exec.tez.TezSessionState : User of session id 1e7e6e5d-9994-4b6f-ba15-0b2a8e09424a is admin
2017-01-23T12:23:44.55+0000 [APP/0] OUT 2017-01-23 12:23:44.558 INFO 20 --- [ool: Thread-618] o.a.hadoop.hive.ql.exec.tez.DagUtils : Hive jar directory is /home/vcap/tmp/hive8156245218269664233/tezInstallation
2017-01-23T12:23:44.56+0000 [APP/0] OUT 2017-01-23 12:23:44.560 ERROR 20 --- [ool: Thread-618] org.apache.hadoop.hive.ql.exec.Task : Failed to execute tez graph.
2017-01-23T12:23:44.56+0000 [APP/0] OUT java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: jar:file:/home/vcap/app/BOOT-INF/lib/hive-exec-1.2.1.jar!
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.fs.Path.initialize(Path.java:206) ~[hadoop-common-2.6.0.jar!/:na]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.fs.Path.<init>(Path.java:172) ~[hadoop-common-2.6.0.jar!/:na]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.createJarLocalResource(TezSessionState.java:331) ~[hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:151) ~[hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.tez.TezTask.updateSession(TezTask.java:239) ~[hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:137) ~[hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1653) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1412) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1054) [hive-exec-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) [hadoop-common-2.6.0.jar!/:na]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT Caused by: java.net.URISyntaxException: Relative path in absolute URI: jar:file:/home/vcap/app/BOOT-INF/lib/hive-exec-1.2.1.jar!
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.net.URI.checkPath(URI.java:1823) ~[na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.net.URI.<init>(URI.java:745) ~[na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.fs.Path.initialize(Path.java:203) ~[hadoop-common-2.6.0.jar!/:na]
2017-01-23T12:23:44.56+0000 [APP/0] OUT ... 24 common frames omitted
2017-01-23T12:23:44.56+0000 [APP/0] OUT 2017-01-23 12:23:44.561 ERROR 20 --- [ool: Thread-618] org.apache.hadoop.hive.ql.Driver : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
2017-01-23T12:23:44.56+0000 [APP/0] OUT 2017-01-23 12:23:44.561 INFO 20 --- [ool: Thread-618] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=Driver.execute start=1485174224500 end=1485174224561 duration=61 from=org.apache.ha
doop.hive.ql.Driver>
2017-01-23T12:23:44.56+0000 [APP/0] OUT 2017-01-23 12:23:44.561 INFO 20 --- [ool: Thread-618] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.56+0000 [APP/0] OUT 2017-01-23 12:23:44.561 INFO 20 --- [ool: Thread-618] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=releaseLocks start=1485174224561 end=1485174224561 duration=0 from=org.apache.hadoo
p.hive.ql.Driver>
2017-01-23T12:23:44.56+0000 [APP/0] OUT 2017-01-23 12:23:44.562 ERROR 20 --- [ool: Thread-618] o.a.h.service.cli.operation.Operation : Error running hive query:
2017-01-23T12:23:44.56+0000 [APP/0] OUT org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:315) ~[hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:156) ~[hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71) ~[hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206) ~[hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.security.AccessController.doPrivileged(Native Method) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at javax.security.auth.Subject.doAs(Subject.java:422) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) [hadoop-common-2.6.0.jar!/:na]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218) [hive-service-1.2.1.jar!/:1.2.1]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_91]
2017-01-23T12:23:44.56+0000 [APP/0] OUT at java.lang.Thread.run(Thread.java:745) [na:1.8.0_91]
2017-01-23T12:23:44.60+0000 [APP/0] OUT 2017-01-23 12:23:44.608 INFO 20 --- [Pool: Thread-35] o.a.h.s.cli.thrift.ThriftHttpServlet : Cookie added for clientUserName admin
2017-01-23T12:23:44.64+0000 [APP/0] OUT 2017-01-23 12:23:44.643 INFO 20 --- [Pool: Thread-36] o.a.h.s.cli.thrift.ThriftHttpServlet : Could not validate cookie sent, will try to generate a new cookie
2017-01-23T12:23:44.73+0000 [APP/0] OUT 2017-01-23 12:23:44.733 INFO 20 --- [Pool: Thread-36] o.a.h.s.cli.thrift.ThriftHttpServlet : Cookie added for clientUserName admin
2017-01-23T12:23:44.73+0000 [APP/0] OUT 2017-01-23 12:23:44.733 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.exec.ListSinkOperator : 281 finished. closing...
2017-01-23T12:23:44.73+0000 [APP/0] OUT 2017-01-23 12:23:44.733 INFO 20 --- [Pool: Thread-36] o.a.h.hive.ql.exec.ListSinkOperator : 281 Close done
2017-01-23T12:23:44.73+0000 [APP/0] OUT 2017-01-23 12:23:44.736 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2017-01-23T12:23:44.73+0000 [APP/0] OUT 2017-01-23 12:23:44.736 INFO 20 --- [Pool: Thread-36] o.apache.hadoop.hive.ql.log.PerfLogger : </PERFLOG method=releaseLocks start=1485174224736 end=1485174224736 duration=0 from=org.apache.hadoo
p.hive.ql.Driver>
Here the error appears to be: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: jar:file:/home/vcap/app/BOOT-INF/lib/hive-exec-1.2.1.jar!
The path looks absolute to me. I'm not sure whether the trailing ! is part of the path or only of the error message, but it may be what's upsetting the URI parser.
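As a data point, the exception can be reproduced without Hadoop at all. Reading the stack trace, Hadoop's Path.initialize() builds a java.net.URI from the scheme and path separately; for a jar:file:...! location the scheme is jar while the path component (file:/...) does not start with a slash, which URI.checkPath() rejects as a "relative path in an absolute URI". A minimal sketch of that failure (my interpretation of the trace, not verified against the Hadoop source):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UriRepro {
    public static void main(String[] args) {
        // Hadoop's Path.initialize() uses a multi-argument URI constructor
        // like this one. For "jar:file:...!" the scheme is "jar" and the
        // path is "file:/...", which does not begin with '/', so
        // URI.checkPath() throws "Relative path in absolute URI".
        try {
            new URI("jar", null,
                    "file:/home/vcap/app/BOOT-INF/lib/hive-exec-1.2.1.jar!",
                    null, null);
            System.out.println("no exception");
        } catch (URISyntaxException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

If that reading is right, the problem is that Hive is deriving the jar location from a Spring Boot nested-jar URL (jar:file:...!/), which java.net.URI cannot represent as a filesystem path.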
Any help would be greatly appreciated.
Cheers,
Theo
I've done some further investigation. I can get the JOIN to work using MapReduce (as opposed to Tez) by setting these config options:
HIVE_INFER_BUCKET_SORT to false
HIVEMETADATAONLYQUERIES to false
HIVEOPTINDEXFILTER to false
HIVECONVERTJOIN to false
HIVESKEWJOIN to false
Tez still does not work. I found these options in the HiveRunner project. There are some more configuration options in there that I haven't tried yet, which seem to specifically target Tez.
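For anyone wanting to try the same workaround at the session level, these are the property names I believe correspond to those HiveConf constants (my reading of Hive 1.2.x HiveConf; please double-check them against your version):

```sql
-- Assumed hive-site property names for the HiveConf constants above.
set hive.execution.engine=mr;
set hive.exec.infer.bucket.sort=false;  -- HIVE_INFER_BUCKET_SORT
set hive.optimize.metadataonly=false;   -- HIVEMETADATAONLYQUERIES
set hive.optimize.index.filter=false;   -- HIVEOPTINDEXFILTER
set hive.auto.convert.join=false;       -- HIVECONVERTJOIN
set hive.optimize.skewjoin=false;       -- HIVESKEWJOIN

SELECT * FROM A JOIN B ON B.id = A.b_id;
```

Notably, disabling hive.auto.convert.join keeps Hive on a plain reduce-side join, which avoids the map-join localization path that seems to trip over the nested-jar URI.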