tensorflow/ecosystem

configure executor cores for spark-tensorflow-distributor to utilize multi-thread per worker

cmxcn opened this issue · 1 comments

cmxcn commented

Hi, I have a question about the correct configuration to make spark-tensorflow-distributor use multiple threads per worker. Since PySpark schedules tasks per core on each executor, multiple workers may end up running on a single executor. Any idea how to avoid this? Thanks in advance.

cmxcn commented

Found the config: setting `spark.task.cpus` equal to the executor core count makes each task claim all of an executor's cores, so only one worker runs per executor. Issue closed.
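For reference, a minimal sketch of that configuration as a `spark-submit` invocation. The core count of 8 and the script name `train.py` are placeholder assumptions; adjust them to your cluster and job.

```shell
# Sketch: give each task all of an executor's cores (values are examples).
# With spark.task.cpus == spark.executor.cores, Spark schedules exactly one
# task (i.e. one spark-tensorflow-distributor worker) per executor, leaving
# all of the executor's cores available to that worker's TensorFlow threads.
spark-submit \
  --conf spark.executor.cores=8 \
  --conf spark.task.cpus=8 \
  train.py
```

The same two settings can also be passed to `SparkSession.builder.config(...)` when building the session programmatically.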