cordon-thiago/airflow-spark

airflow Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Pipka091 opened this issue · 1 comment

It doesn't work: I trigger the DAG, it runs spark-submit, and Spark keeps reporting that no resources are available.
How do I fix it?

[2022-05-26 13:09:06,603] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:06,603] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:06 INFO MemoryStore: MemoryStore started with capacity 1458.6 MiB
[2022-05-26 13:09:06,616] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:06,616] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:06 INFO SparkEnv: Registering OutputCommitCoordinator
[2022-05-26 13:09:06,766] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:06,766] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
[2022-05-26 13:09:06,813] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:06,813] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:06 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://01c927d91284:4040
[2022-05-26 13:09:06,826] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:06,826] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:06 INFO SparkContext: Added JAR file:///usr/local/spark/resources/jars/postgresql-42.3.6.jar at spark://01c927d91284:35835/jars/postgresql-42.3.6.jar with timestamp 1653570546285
[2022-05-26 13:09:06,827] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:06,827] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:06 INFO SparkContext: Added JAR file:/usr/local/spark/resources/jars/ss-user-rule-algorithm-1.0.jar at spark://01c927d91284:35835/jars/ss-user-rule-algorithm-1.0.jar with timestamp 1653570546285
[2022-05-26 13:09:06,972] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:06,972] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:06 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://spark:7077...
[2022-05-26 13:09:07,005] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,005] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO TransportClientFactory: Successfully created connection to spark/172.19.0.6:7077 after 19 ms (0 ms spent in bootstraps)
[2022-05-26 13:09:07,071] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,071] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20220526130907-0034
[2022-05-26 13:09:07,073] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,072] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20220526130907-0034/0 on worker-20220526111530-172.19.0.4-42347 (172.19.0.4:42347) with 2 core(s)
[2022-05-26 13:09:07,074] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,074] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO StandaloneSchedulerBackend: Granted executor ID app-20220526130907-0034/0 on hostPort 172.19.0.4:42347 with 2 core(s), 1024.0 MiB RAM
[2022-05-26 13:09:07,075] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,074] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20220526130907-0034/1 on worker-20220526111530-172.19.0.4-42347 (172.19.0.4:42347) with 2 core(s)
[2022-05-26 13:09:07,075] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,075] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO StandaloneSchedulerBackend: Granted executor ID app-20220526130907-0034/1 on hostPort 172.19.0.4:42347 with 2 core(s), 1024.0 MiB RAM
[2022-05-26 13:09:07,079] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,078] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33001.
[2022-05-26 13:09:07,079] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,079] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO NettyBlockTransferService: Server created on 01c927d91284:33001
[2022-05-26 13:09:07,080] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,080] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
[2022-05-26 13:09:07,086] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,086] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20220526130907-0034/0 is now RUNNING
[2022-05-26 13:09:07,087] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,087] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20220526130907-0034/1 is now RUNNING
[2022-05-26 13:09:07,087] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,087] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 01c927d91284, 33001, None)
[2022-05-26 13:09:07,090] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,090] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO BlockManagerMasterEndpoint: Registering block manager 01c927d91284:33001 with 1458.6 MiB RAM, BlockManagerId(driver, 01c927d91284, 33001, None)
[2022-05-26 13:09:07,092] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,092] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 01c927d91284, 33001, None)
[2022-05-26 13:09:07,093] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,093] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 01c927d91284, 33001, None)
[2022-05-26 13:09:07,265] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:07,265] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:07 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
[2022-05-26 13:09:25,992] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:25,992] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:25 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
[2022-05-26 13:09:40,994] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:40,993] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:40 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
[2022-05-26 13:09:55,993] {{logging_mixin.py:112}} INFO - [2022-05-26 13:09:55,993] {{spark_submit_hook.py:436}} INFO - 22/05/26 13:09:55 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Sorry, resolved: I was using Spark 3.2.1, but this setup needs 3.1.2.