Default logging level causing issues on Spark cluster
dfenn opened this issue · 2 comments
dfenn commented
I'm running Sherpa on a Spark cluster (I've written a little Spark scheduler), and the default logging level is causing my Spark jobs to hang.
I've fixed it for my case by changing the logging setup in each of the source files to:
logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
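For anyone hitting the same problem, here is a minimal, self-contained sketch of that pattern (the module name `my_module` is illustrative; in the actual source files it would be `__name__`). The root logger is held at WARNING so chatty third-party loggers stay quiet, while the module's own logger is raised to DEBUG and its records still reach the root handler, because propagated records are filtered by handler levels, not by the root logger's level:

```python
import logging

# Root logger at WARNING: loggers that haven't set their own level
# inherit this, so their DEBUG/INFO records are dropped.
logging.basicConfig(level=logging.WARNING)

# This module's logger is explicitly lowered to DEBUG, so its
# DEBUG records are created and propagate to the root handler.
logger = logging.getLogger("my_module")
logger.setLevel(logging.DEBUG)

logger.debug("emitted: my_module logger allows DEBUG")
logging.getLogger("other_lib").debug("suppressed: inherits WARNING from root")
```

The effect is per-module verbosity without turning on DEBUG globally, which is presumably what kept the Spark jobs from drowning in log traffic.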
LarsHH commented
Great! Would you be happy to push the modifications to a branch? Then I can check the behavior on my side and merge it. Also, would you be happy to share your Spark scheduler?
Thanks, Lars