Azure/spark-cdm-connector

Could not find ADLS Gen2 Token when running as Job


We are writing CDM data using version 0.19 of the connector. We use the Spark context to switch the running session to a service principal (application id). When running interactively (not as a job) the code works well, but when running as a Job we get an error stating: Could not find ADLS Gen2 Token.
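Roughly, the "context switch" looks like this (an illustrative sketch; `<app-id>`, `<secret>`, and `<tenant-id>` are placeholders) — we set the standard Hadoop ABFS OAuth properties on the session:

```scala
// Sketch: point the session at a service principal via the standard
// Hadoop ABFS OAuth settings (placeholder values shown).
spark.conf.set("fs.azure.account.auth.type", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type",
  "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id", "<app-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret", "<secret>")
spark.conf.set("fs.azure.account.oauth2.client.endpoint",
  "https://login.microsoftonline.com/<tenant-id>/oauth2/token")
```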

We are running Databricks Runtime 6.4 (Spark 2.4.5) on a High Concurrency cluster with credential passthrough enabled.

Any help or further information would be greatly appreciated.

BTW, we have also tried using the appId and secret parameters, but we get an error there as well stating that the clientId is null.
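What we tried was along these lines (a sketch with placeholder values; option names as documented in the connector README):

```scala
// Sketch of the read we attempted with explicit app credentials
// (placeholder values; appId/appKey/tenantId per the connector README).
val df = spark.read.format("com.microsoft.cdm")
  .option("storage", "<account>.dfs.core.windows.net")
  .option("manifestPath", "<container>/path/default.manifest.cdm.json")
  .option("entity", "<EntityName>")
  .option("appId", "<app-id>")
  .option("appKey", "<secret>")
  .option("tenantId", "<tenant-id>")
  .load()
```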

Regards,
Fabian

Can you provide a sample of what you mean by "We use Spark context to switch the context of the running system to use an application id"?

What language are you using? Scala is only supported with a Standard (Premium plan) cluster. Does it work with a Standard cluster?

I'm not aware of a normal mode vs. job mode in Databricks. Can you share a link? For the appId/secret params, does it work when using a notebook?

You don't need to set this configuration; the CDM connector handles it internally. Please run the job the same way as you do in a notebook.

Any word on this? We have a ticket open with our Premier support and are going in circles with the support folks. Are appId and secret supported with the 0.18 or 0.19 version?

Hello @fvalencia12 ,
As per https://docs.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough#cluster-requirements, clusters enabled for credential passthrough do not support jobs.

However, you can continue using the CDM connector in a job by passing app credentials; you don't need to set the OAuth configuration yourself. If you still face the issue, please send an email to asksparkcdm@microsoft.com and we can continue from there.
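For example, a job can read or write with the credentials passed directly to the connector and no session-level OAuth settings (a sketch; all values are placeholders):

```scala
// Sketch: in a job, pass the service-principal credentials directly to the
// connector; no fs.azure.* OAuth settings on the session are required.
df.write.format("com.microsoft.cdm")
  .option("storage", "<account>.dfs.core.windows.net")
  .option("manifestPath", "<container>/path/default.manifest.cdm.json")
  .option("entity", "<EntityName>")
  .option("appId", "<app-id>")
  .option("appKey", "<secret>")
  .option("tenantId", "<tenant-id>")
  .save()
```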

Closing this as we have not heard back.