Secure data ingestion (ELT) from an external Snowflake account into Azure Blob Storage using Azure Databricks secrets and Azure Key Vault
Sometimes accessing data requires that you authenticate to external data sources through JDBC. Instead of directly entering your credentials into a notebook, use Azure Databricks secrets to store your credentials and reference them in notebooks and jobs.
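For example, a notebook can read a stored credential at runtime instead of embedding it (dbutils is predefined in Databricks notebooks; the scope and key names here are placeholders):

```python
# Fetch a credential from a secret scope at runtime; Databricks redacts
# the value if it is printed in notebook output.
password = dbutils.secrets.get(scope="snowflake-etl", key="snowflake-password")
```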
There are two types of secret scopes:
- Azure Key Vault-backed scopes
- Databricks-backed scopes
Prefer Azure Key Vault-backed scopes whenever possible: they offer much stronger security controls than Databricks-backed scopes, which should be reserved for testing.
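An Azure Key Vault-backed scope can be created in the workspace UI (at https://<databricksurl>.azuredatabricks.net#secrets/createScope) or with the Databricks CLI. The command below is a sketch: the scope name, Key Vault resource ID, and DNS name are placeholders, and creating a Key Vault-backed scope requires the CLI to be authenticated with an Azure AD token rather than a personal access token.

```
databricks secrets create-scope --scope snowflake-etl \
  --scope-backend-type AZURE_KEYVAULT \
  --resource-id "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.KeyVault/vaults/<keyvault-name>" \
  --dns-name "https://<keyvault-name>.vault.azure.net/"
```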
First configure the Databricks CLI with your workspace URL and a personal access token, then run the databricks secrets list-scopes command to verify that the scope was created successfully.
```
databricks configure --token
Databricks Host: https://<databricksurl>.azuredatabricks.net/
Token: <token>

databricks secrets list-scopes
```
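With a Key Vault-backed scope, the secret values themselves live in the Key Vault, so the Snowflake credentials are added there rather than through the Databricks CLI. A sketch using the Azure CLI, with placeholder vault and secret names:

```
az keyvault secret set --vault-name <keyvault-name> --name snowflake-user --value <snowflake-user>
az keyvault secret set --vault-name <keyvault-name> --name snowflake-password --value <snowflake-password>
```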
Create a snowflake-databricks-etl notebook in Azure Databricks that uses these secrets to read from the external Snowflake account and land the data in Azure Blob Storage.
Reference: https://docs.databricks.com/data/data-sources/snowflake.html
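The cell below is a minimal sketch of such a notebook, assuming the secret scope and key names used above and placeholder Snowflake and storage account details; spark and dbutils are predefined in Databricks notebooks.

```python
# Pull credentials from the Key Vault-backed secret scope instead of
# hard-coding them in the notebook. Scope/key names are assumptions.
sf_user = dbutils.secrets.get(scope="snowflake-etl", key="snowflake-user")
sf_password = dbutils.secrets.get(scope="snowflake-etl", key="snowflake-password")

# Connection options for the Databricks Snowflake connector.
sf_options = {
    "sfUrl": "<account>.snowflakecomputing.com",
    "sfUser": sf_user,
    "sfPassword": sf_password,
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

# Extract: read the source table from the external Snowflake account.
df = (spark.read
      .format("snowflake")
      .options(**sf_options)
      .option("dbtable", "<table>")
      .load())

# Load: authenticate to Azure Blob Storage with an account key held in the
# same scope, then land the data as Parquet (transformations happen later,
# per the ELT pattern).
storage_key = dbutils.secrets.get(scope="snowflake-etl", key="storage-account-key")
spark.conf.set(
    "fs.azure.account.key.<storageaccount>.blob.core.windows.net",
    storage_key,
)
(df.write
   .mode("overwrite")
   .parquet("wasbs://<container>@<storageaccount>.blob.core.windows.net/snowflake/<table>"))
```

Because every credential is resolved through dbutils.secrets.get, nothing sensitive appears in the notebook source or its revision history, and Databricks redacts secret values in cell output.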