WeBankFinTech/Exchangis

Exchangis execution error

NXL333 opened this issue · 1 comment

Search before asking

  • I searched the issues and found no similar issues.

Exchangis Component

exchangis-datasource

What happened + What you expected to happen

Listening for transport dt_socket at address: 42985
2022-11-28 19:51:32.153 INFO [main] org.apache.linkis.manager.label.builder.factory.LabelBuilderFactoryContext 85 labelBuilderInitRegister - Succeed to register label builder: org.apache.linkis.manager.label.builder.CombinedLabelBuilder
2022-11-28 19:51:32.160 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - <<---------------------EngineConnServer Start --------------------->>
2022-11-28 19:51:32.272 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Finished to init engineCreationContext: {"user":"hadoop","ticketId":"5b40dca5-08dd-4b52-9449-0cec6bc49e1c","labels":[{},{},{}],"options":{"wds.linkis.rm.yarnqueue.instance.max":"30","wds.linkis.rm.yarnqueue":"default","wds.linkis.rm.yarnqueue.memory.max":"300G","label.engineType":"sqoop-1.4.6","wds.linkis.rm.client.core.max":"10","wds.linkis.rm.instance":"10","wds.linkis.engineconn.java.driver.memory":"1","label.userCreator":"hadoop-exchangis","wds.linkis.rm.yarnqueue.cores.max":"150","wds.linkis.rm.client.memory.max":"20G","label.engineConnMode":"once","user":"hadoop","ticketId":"5b40dca5-08dd-4b52-9449-0cec6bc49e1c","onceExecutorContent":"resource_036edc6a883-32b2-4c82-a8b9-3c7f35b56862v000001"},"emInstance":{"org$apache$linkis$common$ServiceInstance$$applicationName":"linkis-cg-engineconnmanager","org$apache$linkis$common$ServiceInstance$$instance":"hadoop01:9102"},"executorId":0,"args":["--engineconn-conf","wds.linkis.rm.instance\u003d10","--engineconn-conf","label.userCreator\u003dhadoop-exchangis","--engineconn-conf","ticketId\u003d5b40dca5-08dd-4b52-9449-0cec6bc49e1c","--engineconn-conf","wds.linkis.engineconn.java.driver.memory\u003d1","--engineconn-conf","wds.linkis.rm.yarnqueue.memory.max\u003d300G","--engineconn-conf","label.engineConnMode\u003donce","--engineconn-conf","label.engineType\u003dsqoop-1.4.6","--engineconn-conf","wds.linkis.rm.yarnqueue.instance.max\u003d30","--engineconn-conf","wds.linkis.rm.client.memory.max\u003d20G","--engineconn-conf","onceExecutorContent\u003dresource_036edc6a883-32b2-4c82-a8b9-3c7f35b56862v000001","--engineconn-conf","wds.linkis.rm.client.core.max\u003d10","--engineconn-conf","wds.linkis.rm.yarnqueue.cores.max\u003d150","--engineconn-conf","user\u003dhadoop","--engineconn-conf","wds.linkis.rm.yarnqueue\u003ddefault","--spring-conf","eureka.client.serviceUrl.defaultZone\u003dhttp://192.168.0.90:9600/eureka/","--spring-conf","logging.config\u003dclasspath:log4j2.xml","--spring-conf","spring.profiles.active\u003dengineconn","--spring-conf","server.port\u003d41649","--spring-conf","spring.application.name\u003dlinkis-cg-engineconn"]}
2022-11-28 19:51:32.273 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Finished to create EngineCreationContext, EngineCreationContext content: {"user":"hadoop","ticketId":"5b40dca5-08dd-4b52-9449-0cec6bc49e1c","labels":[{},{},{}],"options":{"wds.linkis.rm.yarnqueue.instance.max":"30","wds.linkis.rm.yarnqueue":"default","wds.linkis.rm.yarnqueue.memory.max":"300G","label.engineType":"sqoop-1.4.6","wds.linkis.rm.client.core.max":"10","wds.linkis.rm.instance":"10","wds.linkis.engineconn.java.driver.memory":"1","label.userCreator":"hadoop-exchangis","wds.linkis.rm.yarnqueue.cores.max":"150","wds.linkis.rm.client.memory.max":"20G","label.engineConnMode":"once","user":"hadoop","ticketId":"5b40dca5-08dd-4b52-9449-0cec6bc49e1c","onceExecutorContent":"resource_036edc6a883-32b2-4c82-a8b9-3c7f35b56862v000001"},"emInstance":{"org$apache$linkis$common$ServiceInstance$$applicationName":"linkis-cg-engineconnmanager","org$apache$linkis$common$ServiceInstance$$instance":"hadoop01:9102"},"executorId":0,"args":["--engineconn-conf","wds.linkis.rm.instance\u003d10","--engineconn-conf","label.userCreator\u003dhadoop-exchangis","--engineconn-conf","ticketId\u003d5b40dca5-08dd-4b52-9449-0cec6bc49e1c","--engineconn-conf","wds.linkis.engineconn.java.driver.memory\u003d1","--engineconn-conf","wds.linkis.rm.yarnqueue.memory.max\u003d300G","--engineconn-conf","label.engineConnMode\u003donce","--engineconn-conf","label.engineType\u003dsqoop-1.4.6","--engineconn-conf","wds.linkis.rm.yarnqueue.instance.max\u003d30","--engineconn-conf","wds.linkis.rm.client.memory.max\u003d20G","--engineconn-conf","onceExecutorContent\u003dresource_036edc6a883-32b2-4c82-a8b9-3c7f35b56862v000001","--engineconn-conf","wds.linkis.rm.client.core.max\u003d10","--engineconn-conf","wds.linkis.rm.yarnqueue.cores.max\u003d150","--engineconn-conf","user\u003dhadoop","--engineconn-conf","wds.linkis.rm.yarnqueue\u003ddefault","--spring-conf","eureka.client.serviceUrl.defaultZone\u003dhttp://192.168.0.90:9600/eureka/","--spring-conf","logging.config\u003dclasspath:log4j2.xml","--spring-conf","spring.profiles.active\u003dengineconn","--spring-conf","server.port\u003d41649","--spring-conf","spring.application.name\u003dlinkis-cg-engineconn"]}
2022-11-28 19:51:32.279 INFO [main] org.apache.linkis.engineconn.computation.executor.hook.ComputationEngineConnHook 41 info - Spring is enabled, now try to start SpringBoot.
2022-11-28 19:51:32.280 INFO [main] org.apache.linkis.engineconn.computation.executor.hook.ComputationEngineConnHook 41 info - <--------------------Start SpringBoot App-------------------->
2022-11-28 19:51:33.475 INFO [main] org.apache.linkis.engineconn.acessible.executor.log.SendAppender 71 - SendAppender init success

[Spring Boot ASCII banner]
:: Spring Boot :: (v2.3.12.RELEASE)

2022-11-28 19:51:33.676 INFO [main] org.apache.linkis.DataWorkCloudApplication 652 logStartupProfileInfo - The following profiles are active: engineconn
2022-11-28 19:51:33.708 INFO [main] org.apache.linkis.DataWorkCloudApplication 95 onApplicationEvent - add config from config server...
2022-11-28 19:51:33.709 INFO [main] org.apache.linkis.DataWorkCloudApplication 100 onApplicationEvent - initialize DataWorkCloud spring application...
2022-11-28 19:51:36.155 WARN [main] com.netflix.config.sources.URLConfigurationSource 126 - No URLs will be polled as dynamic configuration sources.
2022-11-28 19:51:36.156 INFO [main] com.netflix.config.sources.URLConfigurationSource 127 - To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath.
2022-11-28 19:51:36.176 INFO [main] com.netflix.config.DynamicPropertyFactory 281 getInstance - DynamicPropertyFactory is initialized with configuration sources: com.netflix.config.ConcurrentCompositeConfiguration@3bea7134
2022-11-28 19:51:36.505 INFO [main] org.apache.linkis.engineconn.executor.listener.EngineConnAsyncListenerBus 41 info - EngineConn-Asyn-Thread-ListenerBus add a new listener => class org.apache.linkis.engineconn.acessible.executor.service.EngineConnTimedLockService
2022-11-28 19:51:36.510 INFO [main] org.apache.linkis.engineconn.executor.listener.EngineConnSyncListenerBus 41 info - org.apache.linkis.engineconn.executor.listener.EngineConnSyncListenerBus@39aa595 add a new listener => class org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl
2022-11-28 19:51:36.679 INFO [main] org.apache.linkis.engineconn.executor.listener.EngineConnSyncListenerBus 41 info - org.apache.linkis.engineconn.executor.listener.EngineConnSyncListenerBus@39aa595 add a new listener => class org.apache.linkis.engineconn.computation.executor.upstream.service.ECTaskEntranceMonitorService
2022-11-28 19:51:36.680 INFO [main] org.apache.linkis.engineconn.computation.executor.upstream.ECTaskEntranceMonitor 41 info - started upstream monitor
2022-11-28 19:51:36.700 INFO [main] org.apache.linkis.engineconn.executor.listener.EngineConnAsyncListenerBus 41 info - EngineConn-Asyn-Thread-ListenerBus add a new listener => class org.apache.linkis.engineconn.acessible.executor.service.DefaultExecutorHeartbeatService
2022-11-28 19:51:36.702 INFO [main] org.apache.linkis.engineconn.executor.listener.EngineConnAsyncListenerBus 41 info - EngineConn-Asyn-Thread-ListenerBus add a new listener => class org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService
2022-11-28 19:51:38.481 INFO [main] org.apache.linkis.rpc.RPCReceiveRestful 41 info - init all receiverChoosers in spring beans, list => List(org.apache.linkis.rpc.CommonReceiverChooser@18e5b50, org.apache.linkis.rpc.MessageReceiverChooser@7a0eca26)
2022-11-28 19:51:38.485 INFO [main] org.apache.linkis.rpc.RPCReceiveRestful 41 info - init all receiverSenderBuilders in spring beans, list => List(org.apache.linkis.rpc.CommonReceiverSenderBuilder@61d4171d)
2022-11-28 19:51:38.488 INFO [main] org.apache.linkis.rpc.RPCReceiveRestful 41 info - init RPCReceiverListenerBus with queueSize 5000 and consumeThreadSize 400.
2022-11-28 19:51:38.489 INFO [main] org.apache.linkis.rpc.AsynRPCMessageBus 41 info - RPC-Receiver-Asyn-Thread-ListenerBus add a new listener => class org.apache.linkis.rpc.RPCReceiveRestful$$anon$1
2022-11-28 19:51:38.569 WARN [main] com.netflix.config.sources.URLConfigurationSource 126 - No URLs will be polled as dynamic configuration sources.
2022-11-28 19:51:38.569 INFO [main] com.netflix.config.sources.URLConfigurationSource 127 - To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath.
2022-11-28 19:51:39.523 INFO [main] org.apache.linkis.rpc.conf.RPCSpringConfiguration$$EnhancerBySpringCGLIB$$6ee64123 41 info - DataWorkCloud RPC need register RPCReceiveRestful, now add it to configuration.
2022-11-28 19:51:39.523 INFO [main] org.apache.linkis.DataWorkCloudApplication 95 onApplicationEvent - add config from config server...
2022-11-28 19:51:39.523 INFO [main] org.apache.linkis.DataWorkCloudApplication 100 onApplicationEvent - initialize DataWorkCloud spring application...
2022-11-28 19:51:39.603 INFO [main] org.apache.linkis.DataWorkCloudApplication 61 logStarted - Started DataWorkCloudApplication in 7.062 seconds (JVM running for 9.312)
2022-11-28 19:51:39.749 INFO [main] org.apache.linkis.rpc.AsynRPCMessageBus 41 info - RPC-Sender-Asyn-Thread-ListenerBus add a new listener => class org.apache.linkis.rpc.BaseRPCSender$$anon$1
2022-11-28 19:51:39.945 INFO [main] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: linkis-cg-engineconnmanager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2022-11-28 19:51:39.964 INFO [main] com.netflix.util.concurrent.ShutdownEnabledTimer 58 - Shutdown hook installed for: NFLoadBalancer-PingTimer-linkis-cg-engineconnmanager
2022-11-28 19:51:39.965 INFO [main] com.netflix.loadbalancer.BaseLoadBalancer 197 initWithConfig - Client: linkis-cg-engineconnmanager instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=linkis-cg-engineconnmanager,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null
2022-11-28 19:51:39.971 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer 222 enableAndInitLearnNewServersFeature - Using serverListUpdater PollingServerListUpdater
2022-11-28 19:51:40.000 INFO [main] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: linkis-cg-engineconnmanager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2022-11-28 19:51:40.003 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer 150 restOfInit - DynamicServerListLoadBalancer for client linkis-cg-engineconnmanager initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=linkis-cg-engineconnmanager,current list of Servers=[hadoop01:9102],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;]
},Server stats: [[Server:hadoop01:9102; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 08:00:00 CST 1970; First connection made: Thu Jan 01 08:00:00 CST 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList@5bfee39d
2022-11-28 19:51:40.110 INFO [main] org.apache.linkis.engineconn.computation.executor.hook.ComputationEngineConnHook 41 info - <--------------------SpringBoot App init succeed-------------------->
2022-11-28 19:51:40.110 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Finished to execute hook of beforeCreateEngineConn.
2022-11-28 19:51:40.131 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Finished to create sqoopEngineConn.
2022-11-28 19:51:40.133 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Finished to execute all hooks of beforeExecutionExecute.
2022-11-28 19:51:40.154 INFO [main] org.apache.linkis.engineconn.core.execution.EngineConnExecution$ 41 info - The list of EngineConnExecution: List(org.apache.linkis.engineconn.once.executor.execution.OnceExecutorManagerEngineConnExecution@6bf1bfec, org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutorManagerEngineConnExecution@46978465, org.apache.linkis.engineconn.acessible.executor.execution.AccessibleEngineConnExecution@422910a6, org.apache.linkis.engineconn.once.executor.execution.OnceEngineConnExecution@555df150, org.apache.linkis.engineconn.computation.executor.execute.ComputationEngineConnExecution@73edfe51)
2022-11-28 19:51:40.159 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Ready to execute OnceExecutorManagerEngineConnExecution.
2022-11-28 19:51:40.161 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Ready to execute ComputationExecutorManagerEngineConnExecution.
2022-11-28 19:51:40.162 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Ready to execute AccessibleEngineConnExecution.
2022-11-28 19:51:40.163 INFO [main] org.apache.linkis.engineconn.core.executor.ExecutorManager$ 41 info - Try to use org.apache.linkis.engineconn.once.executor.creation.OnceExecutorManagerImpl to instance a ExecutorManager.
2022-11-28 19:51:40.180 INFO [main] org.apache.linkis.engineconn.once.executor.creation.OnceExecutorManagerImpl 41 info - Try to create a executor with labels List([key: userCreator, value: {"creator":"exchangis","user":"hadoop"}, str: hadoop-exchangis], [key: engineConnMode, value: {"engineConnMode":"once"}, str: once], [key: engineType, value: {"engineType":"sqoop","version":"1.4.6"}, str: sqoop-1.4.6]).
2022-11-28 19:51:40.183 INFO [main] org.apache.linkis.engineconn.once.executor.creation.OnceExecutorManagerImpl 41 info - No LabelExecutorFactory matched, use DefaultExecutorFactory to create executor.
2022-11-28 19:51:40.194 INFO [main] org.apache.linkis.engineconn.once.executor.creation.OnceExecutorManagerImpl 41 info - Finished to create SqoopOnceCodeExecutor(SqoopOnceApp_0) with labels List([key: userCreator, value: {"creator":"exchangis","user":"hadoop"}, str: hadoop-exchangis], [key: engineConnMode, value: {"engineConnMode":"once"}, str: once], [key: engineType, value: {"engineType":"sqoop","version":"1.4.6"}, str: sqoop-1.4.6]).
2022-11-28 19:51:40.212 INFO [main] org.apache.linkis.common.conf.Configuration$ 41 info - gatewayUrl is http://192.168.0.90:9001
2022-11-28 19:51:40.227 INFO [main] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 53 transition - Waitiing lock release, to change status Starting=>Running.
2022-11-28 19:51:40.227 INFO [main] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 55 transition - Finished wait lock release, to change status Starting=>Running.
2022-11-28 19:51:40.228 INFO [main] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 41 info - org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor@20652a8b changed status Starting => Running.
2022-11-28 19:51:40.229 INFO [main] org.apache.linkis.engineconn.once.executor.creation.OnceExecutorManagerImpl 41 info - Finished to init SqoopOnceCodeExecutor(SqoopOnceApp_0).
2022-11-28 19:51:40.230 INFO [main] org.apache.linkis.engineconn.acessible.executor.execution.AccessibleEngineConnExecution 41 info - Created a report executor SqoopOnceCodeExecutor(SqoopOnceApp_0).
2022-11-28 19:51:40.231 INFO [EngineConn-Asyn-Thread-Thread-0] org.apache.linkis.engineconn.executor.listener.EngineConnAsyncListenerBus 41 info - EngineConn-Asyn-Thread-Thread-0 begin.
2022-11-28 19:51:40.298 INFO [main] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: linkis-cg-linkismanager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2022-11-28 19:51:40.300 INFO [main] com.netflix.util.concurrent.ShutdownEnabledTimer 58 - Shutdown hook installed for: NFLoadBalancer-PingTimer-linkis-cg-linkismanager
2022-11-28 19:51:40.301 INFO [main] com.netflix.loadbalancer.BaseLoadBalancer 197 initWithConfig - Client: linkis-cg-linkismanager instantiated a LoadBalancer: DynamicServerListLoadBalancer:{NFLoadBalancer:name=linkis-cg-linkismanager,current list of Servers=[],Load balancer stats=Zone stats: {},Server stats: []}ServerList:null
2022-11-28 19:51:40.302 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer 222 enableAndInitLearnNewServersFeature - Using serverListUpdater PollingServerListUpdater
2022-11-28 19:51:40.304 INFO [main] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: linkis-cg-linkismanager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2022-11-28 19:51:40.305 INFO [main] com.netflix.loadbalancer.DynamicServerListLoadBalancer 150 restOfInit - DynamicServerListLoadBalancer for client linkis-cg-linkismanager initialized: DynamicServerListLoadBalancer:{NFLoadBalancer:name=linkis-cg-linkismanager,current list of Servers=[hadoop01:9101],Load balancer stats=Zone stats: {defaultzone=[Zone:defaultzone; Instance count:1; Active connections count: 0; Circuit breaker tripped count: 0; Active connections per server: 0.0;]
},Server stats: [[Server:hadoop01:9101; Zone:defaultZone; Total Requests:0; Successive connection failure:0; Total blackout seconds:0; Last connection made:Thu Jan 01 08:00:00 CST 1970; First connection made: Thu Jan 01 08:00:00 CST 1970; Active Connections:0; total failure count in last (1000) msecs:0; average resp time:0.0; 90 percentile resp time:0.0; 95 percentile resp time:0.0; min resp time:0.0; max resp time:0.0; stddev resp time:0.0]
]}ServerList:org.springframework.cloud.netflix.ribbon.eureka.DomainExtractingServerList@35ea9d72
2022-11-28 19:51:40.351 INFO [main] org.apache.linkis.engineconn.acessible.executor.execution.AccessibleEngineConnExecution 41 info - In the first time, report usedResources to LinkisManager succeed.
2022-11-28 19:51:40.352 INFO [main] org.apache.linkis.engineconn.acessible.executor.service.DefaultManagerService 41 info - engineType labels is empty, Not reported
2022-11-28 19:51:40.352 INFO [main] org.apache.linkis.engineconn.acessible.executor.execution.AccessibleEngineConnExecution 41 info - In the first time, report all labels to LinkisManager succeed.
2022-11-28 19:51:40.352 INFO [main] org.apache.linkis.engineconn.acessible.executor.execution.AccessibleEngineConnExecution 77 executorStatusChecker - executorStatusChecker created, maxFreeTimeMills is 1800000
2022-11-28 19:51:40.354 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Ready to execute OnceEngineConnExecution.
2022-11-28 19:51:40.355 INFO [main] org.apache.linkis.engineconn.once.executor.execution.OnceEngineConnExecution 41 info - EngineConnMode is once.
2022-11-28 19:51:40.355 WARN [main] org.apache.linkis.engineconn.once.executor.execution.OnceEngineConnExecution 50 warn - org.apache.linkis.engineconn.once.executor.execution.OnceEngineConnExecution is enabled, now step into it's execution.
2022-11-28 19:51:40.363 INFO [EngineConn-Asyn-Thread-Thread-0] org.apache.linkis.engineconn.acessible.executor.service.DefaultManagerService 41 info - success to send engine heartbeat report to hadoop01:9101,status:Running,msg:null
2022-11-28 19:51:40.617 INFO [main] org.apache.linkis.httpclient.dws.DWSHttpClient 119 org$apache$linkis$httpclient$AbstractHttpClient$$addAttempt$1 - invoke http://192.168.0.90:9001/api/rest_j/v1/bml/download?resourceId=edc6a883-32b2-4c82-a8b9-3c7f35b56862&version=v000001 taken: 246
2022-11-28 19:51:40.665 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 41 info - Try to execute params.{sqoop.args.password=bigdata001, sqoop.args.input.null.string=\N, sqoop.args.driver=com.mysql.jdbc.Driver, sqoop.args.columns=gg,ggg,gggg,ggggg,gggggg, sqoop.args.table=hh, sqoop.args.num.mappers=1, sqoop.mode=export, sqoop.args.hcatalog.table=aa, sqoop.args.username=root, sqoop.args.connect=jdbc:mysql://192.168.0.90:3306/test, sqoop.args.input.fields.terminated.by=�, sqoop.args.input.null.non.string=\N, sqoop.args.hcatalog.database=default}
2022-11-28 19:51:40.665 INFO [main] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 53 transition - Waitiing lock release, to change status Running=>Busy.
2022-11-28 19:51:40.666 INFO [main] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 55 transition - Finished wait lock release, to change status Running=>Busy.
2022-11-28 19:51:40.666 INFO [main] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 41 info - org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor@20652a8b changed status Running => Busy.
2022-11-28 19:51:40.668 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Finished to execute executions.
2022-11-28 19:51:40.669 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - Finished to execute hook of afterExecutionExecute
2022-11-28 19:51:40.671 INFO [main] org.apache.linkis.engineconn.callback.service.EngineConnAfterStartCallback 48 callback - protocol will send to em: EngineConnStatusCallback(ServiceInstance(linkis-cg-engineconn, hadoop01:41649),5b40dca5-08dd-4b52-9449-0cec6bc49e1c,Unlock,success)
2022-11-28 19:51:40.684 INFO [EngineConn-Asyn-Thread-Thread-0] org.apache.linkis.engineconn.acessible.executor.service.DefaultManagerService 41 info - success to send engine heartbeat report to hadoop01:9101,status:Busy,msg:null
2022-11-28 19:51:40.688 WARN [main] org.apache.linkis.engineconn.computation.executor.hook.ComputationEngineConnHook 50 warn - EngineConnServer start succeed!
2022-11-28 19:51:40.689 INFO [main] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 53 transition - Waitiing lock release, to change status Busy=>Running.
2022-11-28 19:51:40.689 INFO [main] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 55 transition - Finished wait lock release, to change status Busy=>Running.
2022-11-28 19:51:40.689 INFO [main] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 41 info - org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor@20652a8b changed status Busy => Running.
2022-11-28 19:51:40.702 INFO [EngineConn-Asyn-Thread-Thread-0] org.apache.linkis.engineconn.acessible.executor.service.DefaultManagerService 41 info - success to send engine heartbeat report to hadoop01:9101,status:Running,msg:null
2022-11-28 19:51:40.757 WARN [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.tool.SqoopTool 177 loadPluginsFromConfDir - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
2022-11-28 19:51:40.793 WARN [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.tool.BaseSqoopTool 1021 applyCredentialsOptions - Setting your password on the command-line is insecure. Consider using -P instead.
2022-11-28 19:51:40.794 WARN [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.tool.BaseSqoopTool 1523 validateHCatalogOptions - Input field/record delimiter options are not used in HCatalog jobs unless the format is text. It is better to use --hive-import in those cases. For text formats
2022-11-28 19:51:40.828 WARN [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.ConnFactory 132 getManager - Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
2022-11-28 19:51:40.833 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.manager.SqlManager 98 initOptionDefaults - Using default fetchSize of 1000
2022-11-28 19:51:40.833 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.tool.CodeGenTool 92 generateORM - Beginning code generation
2022-11-28 19:51:40.976 INFO [PollingServerListUpdater-0] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: linkis-cg-engineconnmanager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2022-11-28 19:51:41.283 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.manager.SqlManager 757 execute - Executing SQL statement: SELECT t.* FROM hh AS t WHERE 1=0
2022-11-28 19:51:41.292 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.orm.CompilationManager 85 findHadoopJars - $HADOOP_MAPRED_HOME is not set
2022-11-28 19:51:41.304 INFO [PollingServerListUpdater-1] com.netflix.config.ChainedDynamicProperty 115 checkAndFlip - Flipping property: linkis-cg-linkismanager.ribbon.ActiveConnectionsLimit to use NEXT property: niws.loadbalancer.availabilityFilteringRule.activeConnectionsLimit = 2147483647
2022-11-28 19:51:42.901 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.orm.CompilationManager 330 jar - Writing jar file: /tmp/sqoop-hadoop/compile/70bb56fc937ea15df1e7768b55c803b5/hh.jar
org.apache.linkis.engineconnplugin.sqoop.client.utils.JarLoader@31af1b7e
2022-11-28 19:51:42.910 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.mapreduce.ExportJobBase 378 runExport - Beginning export of hh
2022-11-28 19:51:43.048 WARN [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.util.NativeCodeLoader 62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2022-11-28 19:51:43.058 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.conf.Configuration.deprecation 1173 warnOnceIfDeprecated - mapred.jar is deprecated. Instead, use mapreduce.job.jar
2022-11-28 19:51:43.058 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.mapreduce.ExportJobBase 419 runExport - Configuring HCatalog for export job
2022-11-28 19:51:43.069 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities 247 checkHomeDirs - Configuring HCatalog specific details for job
2022-11-28 19:51:43.266 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.manager.SqlManager 757 execute - Executing SQL statement: SELECT t.* FROM hh AS t WHERE 1=0
2022-11-28 19:51:43.268 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities 519 initDBColumnInfo - Database column names projected : [gg, ggg, gggg, ggggg, gggggg]
2022-11-28 19:51:43.268 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities 530 initDBColumnInfo - Database column name - info map :
gg : [Type : 4,Precision : 20,Scale : 0]
gggg : [Type : 4,Precision : 20,Scale : 0]
ggg : [Type : 4,Precision : 20,Scale : 0]
ggggg : [Type : 12,Precision : 255,Scale : 0]
gggggg : [Type : 4,Precision : 20,Scale : 0]

2022-11-28 19:51:43.280 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.conf.HiveConf 181 findConfigFile - Found configuration file file:/usr/software/apache-hive-2.3.3/conf/hive-site.xml
2022-11-28 19:51:43.543 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hive.hcatalog.common.HiveClientCache 119 - Initializing cache: eviction-timeout=120 initial-capacity=50 maximum-capacity=50
2022-11-28 19:51:43.815 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore 610 newRawStoreForConf - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2022-11-28 19:51:43.874 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.ObjectStore 401 initializeHelper - ObjectStore, initialize called
2022-11-28 19:51:44.038 INFO [Linkis-Default-Scheduler-Thread-1] DataNucleus.Persistence 77 info - Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2022-11-28 19:51:44.040 INFO [Linkis-Default-Scheduler-Thread-1] DataNucleus.Persistence 77 info - Property datanucleus.cache.level2 unknown - will be ignored
2022-11-28 19:51:44.213 WARN [Linkis-Default-Scheduler-Thread-1] DataNucleus.Connection 96 warn - BoneCP specified but not present in CLASSPATH (or one of dependencies)
2022-11-28 19:51:44.295 WARN [Linkis-Default-Scheduler-Thread-1] DataNucleus.Connection 96 warn - BoneCP specified but not present in CLASSPATH (or one of dependencies)
2022-11-28 19:51:44.484 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.ObjectStore 524 getPMF - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2022-11-28 19:51:45.666 INFO [Linkis-Default-Scheduler-Thread-3] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 41 info - The Sqoop Process In Running
2022-11-28 19:51:45.761 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.MetaStoreDirectSql 146 - Using direct SQL, underlying DB is MYSQL
2022-11-28 19:51:45.765 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.ObjectStore 315 setConf - Initialized ObjectStore
2022-11-28 19:51:45.906 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore 694 createDefaultRoles_core - Added admin role in metastore
2022-11-28 19:51:45.910 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore 703 createDefaultRoles_core - Added public role in metastore
2022-11-28 19:51:45.931 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore 743 addAdminUsers_core - No user is added in admin role, since config is empty
2022-11-28 19:51:46.060 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore 777 logInfo - 0: get_databases: NonExistentDatabaseUsedForHealthCheck
2022-11-28 19:51:46.061 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore.audit 309 logAuditEvent - ugi=hadoop ip=unknown-ip-addr cmd=get_databases: NonExistentDatabaseUsedForHealthCheck
2022-11-28 19:51:46.087 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore 777 logInfo - 0: get_table : db=default tbl=aa
2022-11-28 19:51:46.088 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore.audit 309 logAuditEvent - ugi=hadoop ip=unknown-ip-addr cmd=get_table : db=default tbl=aa
2022-11-28 19:51:46.381 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities 354 configureHCat - HCatalog full table schema fields = [dws_lxx_ffp_areaid, dws_lxx_ffp_areacode, dws_lxx_ffp_arealevel, dws_lxx_ffp_name, dws_lxx_ffp_count]
2022-11-28 19:51:46.482 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore 777 logInfo - 0: get_databases: NonExistentDatabaseUsedForHealthCheck
2022-11-28 19:51:46.483 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore.audit 309 logAuditEvent - ugi=hadoop ip=unknown-ip-addr cmd=get_databases: NonExistentDatabaseUsedForHealthCheck
2022-11-28 19:51:46.489 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore 777 logInfo - 0: get_table : db=default tbl=aa
2022-11-28 19:51:46.490 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore.audit 309 logAuditEvent - ugi=hadoop ip=unknown-ip-addr cmd=get_table : db=default tbl=aa
2022-11-28 19:51:46.513 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore 777 logInfo - 0: get_index_names : db=default tbl=aa
2022-11-28 19:51:46.514 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.hive.metastore.HiveMetaStore.audit 309 logAuditEvent - ugi=hadoop ip=unknown-ip-addr cmd=get_index_names : db=default tbl=aa
2022-11-28 19:51:46.628 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.hadoop.conf.Configuration.deprecation 1173 warnOnceIfDeprecated - mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
2022-11-28 19:51:47.356 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities 385 configureHCat - HCatalog table partitioning key fields = []
2022-11-28 19:51:47.359 ERROR [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.client.Sqoop 192 runSqoop - Got exception running Sqoop: java.lang.NullPointerException
2022-11-28 19:51:47.361 ERROR [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.client.LinkisSqoopClient 69 run - Run Error Message:java.lang.reflect.InvocationTargetException java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
at org.apache.linkis.engineconnplugin.sqoop.client.LinkisSqoopClient.run(LinkisSqoopClient.java:67) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor$$anonfun$runSqoop$1.apply$mcI$sp(SqoopOnceCodeExecutor.scala:75) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor$$anonfun$runSqoop$1.apply(SqoopOnceCodeExecutor.scala:71) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor$$anonfun$runSqoop$1.apply(SqoopOnceCodeExecutor.scala:71) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) ~[linkis-common-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor.runSqoop(SqoopOnceCodeExecutor.scala:76) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor$$anon$1.run(SqoopOnceCodeExecutor.scala:56) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_261]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_261]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_261]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) ~[?:1.8.0_261]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_261]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_261]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_261]
Caused by: java.lang.NullPointerException
at org.apache.hive.hcatalog.data.schema.HCatSchema.get(HCatSchema.java:105) ~[hive-hcatalog-core-2.3.3.jar:2.3.3]
at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:390) ~[sqoop-1.4.6-hadoop200.jar:?]
at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:421) ~[sqoop-1.4.6-hadoop200.jar:1.1.1]
at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:912) ~[sqoop-1.4.6-hadoop200.jar:?]
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:81) ~[sqoop-1.4.6-hadoop200.jar:?]
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100) ~[sqoop-1.4.6-hadoop200.jar:?]
at org.apache.linkis.engineconnplugin.sqoop.client.Sqoop.run(Sqoop.java:157) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) ~[hadoop-common-2.7.2.jar:?]
at org.apache.linkis.engineconnplugin.sqoop.client.Sqoop.runSqoop(Sqoop.java:190) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconnplugin.sqoop.client.Sqoop.runTool(Sqoop.java:237) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconnplugin.sqoop.client.Sqoop.runTool(Sqoop.java:297) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconnplugin.sqoop.client.Sqoop.main(Sqoop.java:301) ~[linkis-engineplugin-sqoop-1.1.1.jar:1.1.1]
... 18 more

2022-11-28 19:51:47.373 ERROR [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 62 error - SqoopOnceApp_0 has failed with old status Running, now stop it.
2022-11-28 19:51:47.374 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 53 transition - Waitiing lock release, to change status Running=>Failed.
2022-11-28 19:51:47.375 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 55 transition - Finished wait lock release, to change status Running=>Failed.
2022-11-28 19:51:47.375 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 41 info - org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor@20652a8b changed status Running => Failed.
2022-11-28 19:51:47.378 ERROR [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconn.once.executor.execution.OnceEngineConnExecution 58 error - Unknown reason.
2022-11-28 19:51:47.378 ERROR [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 62 error - SqoopOnceApp_0 has failed with old status Failed, now stop it.
2022-11-28 19:51:47.384 WARN [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 50 warn - Executor(SqoopOnceApp_0) exit by close.
2022-11-28 19:51:47.385 WARN [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 50 warn - Executor(SqoopOnceApp_0) exit by close.
2022-11-28 19:51:47.385 INFO [main] org.apache.linkis.engineconn.launch.EngineConnServer$ 41 info - <<---------------------EngineConnServer Exit --------------------->>
2022-11-28 19:51:47.386 INFO [Linkis-Default-Scheduler-Thread-1] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 41 info - All codes completed, now to stop SqoopEngineConn.
2022-11-28 19:51:47.386 INFO [EngineConn-Asyn-Thread-Thread-1] org.apache.linkis.engineconn.executor.listener.EngineConnAsyncListenerBus 41 info - EngineConn-Asyn-Thread-Thread-1 begin.
2022-11-28 19:51:47.389 INFO [Thread-25] com.netflix.loadbalancer.PollingServerListUpdater 53 run - Shutting down the Executor Pool for PollingServerListUpdater
2022-11-28 19:51:47.393 INFO [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 41 info - executorShutDownHook start to execute.
2022-11-28 19:51:47.396 WARN [SpringContextShutdownHook] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 50 warn - Executor(SqoopOnceApp_0) exit by close.
2022-11-28 19:51:47.397 WARN [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 50 warn - Engine : hadoop01:41649 with state has stopped successfully.
2022-11-28 19:51:47.398 WARN [SpringContextShutdownHook] org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor 50 warn - Executor(SqoopOnceApp_0) exit by close.
2022-11-28 19:51:47.399 WARN [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 50 warn - executorShutDownHook start to close executor... org.apache.linkis.engineconnplugin.sqoop.executor.SqoopOnceCodeExecutor@20652a8b
2022-11-28 19:51:47.409 INFO [EngineConn-Asyn-Thread-Thread-0] org.apache.linkis.engineconn.acessible.executor.service.DefaultManagerService 41 info - success to send engine heartbeat report to hadoop01:9101,status:Failed,msg:null
2022-11-28 19:51:47.409 INFO [EngineConn-Asyn-Thread-Thread-1] org.apache.linkis.engineconn.acessible.executor.service.DefaultManagerService 41 info - success to send engine heartbeat report to hadoop01:9101,status:Failed,msg:null
2022-11-28 19:51:47.417 INFO [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultManagerService 41 info - success to send engine heartbeat report to hadoop01:9101,status:Failed,msg:null
2022-11-28 19:51:47.421 INFO [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 41 info - Reported status shuttingDown to manager.
2022-11-28 19:51:49.440 INFO [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 41 info - executorShutDownHook start to execute.
2022-11-28 19:51:49.441 WARN [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 50 warn - had stop, do not shutdown
2022-11-28 19:51:49.448 INFO [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 41 info - executorShutDownHook start to execute.
2022-11-28 19:51:49.448 WARN [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 50 warn - had stop, do not shutdown
2022-11-28 19:51:49.450 INFO [SpringContextShutdownHook] com.netflix.util.concurrent.ShutdownEnabledTimer 67 cancel - Shutdown hook removed for: NFLoadBalancer-PingTimer-linkis-cg-engineconnmanager
2022-11-28 19:51:49.450 INFO [SpringContextShutdownHook] com.netflix.util.concurrent.ShutdownEnabledTimer 72 cancel - Exception caught (might be ok if at shutdown) java.lang.IllegalStateException: Shutdown in progress
at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82) ~[?:1.8.0_261]
at java.lang.Runtime.removeShutdownHook(Runtime.java:239) ~[?:1.8.0_261]
at com.netflix.util.concurrent.ShutdownEnabledTimer.cancel(ShutdownEnabledTimer.java:70) ~[netflix-commons-util-0.3.0.jar:0.3.0]
at com.netflix.loadbalancer.BaseLoadBalancer.cancelPingTask(BaseLoadBalancer.java:632) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.BaseLoadBalancer.shutdown(BaseLoadBalancer.java:883) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.DynamicServerListLoadBalancer.shutdown(DynamicServerListLoadBalancer.java:285) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
at org.springframework.beans.factory.support.DisposableBeanAdapter.invokeCustomDestroyMethod(DisposableBeanAdapter.java:280) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DisposableBeanAdapter.destroy(DisposableBeanAdapter.java:214) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:587) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:559) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:1092) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingletons(DefaultSingletonBeanRegistry.java:520) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingletons(DefaultListableBeanFactory.java:1085) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.destroyBeans(AbstractApplicationContext.java:1061) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1030) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.close(AbstractApplicationContext.java:979) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.cloud.context.named.NamedContextFactory.destroy(NamedContextFactory.java:93) ~[spring-cloud-context-2.2.9.RELEASE.jar:2.2.9.RELEASE]
at org.springframework.beans.factory.support.DisposableBeanAdapter.destroy(DisposableBeanAdapter.java:199) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:587) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:559) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:1092) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingletons(DefaultSingletonBeanRegistry.java:520) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingletons(DefaultListableBeanFactory.java:1085) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.destroyBeans(AbstractApplicationContext.java:1061) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1030) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.doClose(ServletWebServerApplicationContext.java:170) ~[spring-boot-2.3.12.RELEASE.jar:2.3.12.RELEASE]
at org.springframework.context.support.AbstractApplicationContext$1.run(AbstractApplicationContext.java:949) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]

2022-11-28 19:51:49.452 INFO [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 41 info - executorShutDownHook start to execute.
2022-11-28 19:51:49.452 WARN [SpringContextShutdownHook] org.apache.linkis.engineconn.acessible.executor.service.DefaultAccessibleService 50 warn - had stop, do not shutdown
2022-11-28 19:51:49.453 INFO [SpringContextShutdownHook] com.netflix.util.concurrent.ShutdownEnabledTimer 67 cancel - Shutdown hook removed for: NFLoadBalancer-PingTimer-linkis-cg-linkismanager
2022-11-28 19:51:49.453 INFO [SpringContextShutdownHook] com.netflix.util.concurrent.ShutdownEnabledTimer 72 cancel - Exception caught (might be ok if at shutdown) java.lang.IllegalStateException: Shutdown in progress
at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82) ~[?:1.8.0_261]
at java.lang.Runtime.removeShutdownHook(Runtime.java:239) ~[?:1.8.0_261]
at com.netflix.util.concurrent.ShutdownEnabledTimer.cancel(ShutdownEnabledTimer.java:70) ~[netflix-commons-util-0.3.0.jar:0.3.0]
at com.netflix.loadbalancer.BaseLoadBalancer.cancelPingTask(BaseLoadBalancer.java:632) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.BaseLoadBalancer.shutdown(BaseLoadBalancer.java:883) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at com.netflix.loadbalancer.DynamicServerListLoadBalancer.shutdown(DynamicServerListLoadBalancer.java:285) ~[ribbon-loadbalancer-2.3.0.jar:2.3.0]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
at org.springframework.beans.factory.support.DisposableBeanAdapter.invokeCustomDestroyMethod(DisposableBeanAdapter.java:280) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DisposableBeanAdapter.destroy(DisposableBeanAdapter.java:214) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:587) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:559) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:1092) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingletons(DefaultSingletonBeanRegistry.java:520) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingletons(DefaultListableBeanFactory.java:1085) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.destroyBeans(AbstractApplicationContext.java:1061) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1030) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.close(AbstractApplicationContext.java:979) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.cloud.context.named.NamedContextFactory.destroy(NamedContextFactory.java:93) ~[spring-cloud-context-2.2.9.RELEASE.jar:2.2.9.RELEASE]
at org.springframework.beans.factory.support.DisposableBeanAdapter.destroy(DisposableBeanAdapter.java:199) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:587) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:559) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:1092) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingletons(DefaultSingletonBeanRegistry.java:520) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingletons(DefaultListableBeanFactory.java:1085) ~[spring-beans-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.destroyBeans(AbstractApplicationContext.java:1061) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1030) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.doClose(ServletWebServerApplicationContext.java:170) ~[spring-boot-2.3.12.RELEASE.jar:2.3.12.RELEASE]
at org.springframework.context.support.AbstractApplicationContext$1.run(AbstractApplicationContext.java:949) ~[spring-context-5.2.15.RELEASE.jar:5.2.15.RELEASE]

Relevant platform

null

Reproduction script

null

Anything else

No response

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

This has been resolved by PR #408. The latest Exchangis version is 1.1.2, the latest DSS version is 1.1.2, and the latest Linkis version is 1.4.0.
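
A note for readers hitting the same NullPointerException before upgrading: judging from the log above, the columns sent to Sqoop ("Database column names projected : [gg, ggg, gggg, ggggg, gggggg]") do not appear in the target HCatalog table default.aa ("HCatalog full table schema fields = [dws_lxx_ffp_areaid, ...]"), so the field lookup inside SqoopHCatUtilities.configureHCat returns null and the export fails with the NPE shown in the stack trace. The snippet below is a minimal, hypothetical standalone check (not part of Exchangis, Linkis or Sqoop) that illustrates this mismatch using the two column lists from the log:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical pre-flight check: verify that every column Sqoop will export
// exists in the target HCatalog/Hive table schema before submitting the job.
public class HCatColumnCheck {
    public static void main(String[] args) {
        // Column names as reported in the log ("Database column names projected").
        List<String> sqoopColumns = Arrays.asList("gg", "ggg", "gggg", "ggggg", "gggggg");

        // Field names as reported in the log ("HCatalog full table schema fields").
        List<String> hcatFields = Arrays.asList(
                "dws_lxx_ffp_areaid", "dws_lxx_ffp_areacode",
                "dws_lxx_ffp_arealevel", "dws_lxx_ffp_name", "dws_lxx_ffp_count");

        for (String col : sqoopColumns) {
            // Hive/HCatalog field names are case-insensitive, so compare ignoring case.
            boolean present = hcatFields.stream().anyMatch(f -> f.equalsIgnoreCase(col));
            if (!present) {
                System.out.println("Column '" + col + "' is not in the HCatalog table schema"
                        + " - this is the lookup that comes back null in configureHCat.");
            }
        }
    }
}
```

Under that reading, aligning the selected source columns (or the field mapping configured in the Exchangis job) with the Hive table's actual column names avoids the NPE even on the older version.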