Connecting Snowflake to Azure Container

Here is a step-by-step guide for beginners on connecting Snowflake to Azure Blob Storage containers using a storage integration. If you are looking for the Azure UI navigation needed to configure the setup (which is not covered in the Snowflake documentation), you are in the right place.

1) Log in to your Azure account and make a note of the Tenant ID

Get the Tenant ID from Microsoft Entra ID

2) Create a Resource Group

I created a resource group named snowflake

3) Create a Storage Account

I created a storage account named azuresnowflakedemo with the following changes to the default options:

Redundancy: locally-redundant storage (LRS)

Data Lake Storage Gen2: Enable hierarchical namespace

4) Create a Storage container and upload files

I created a container named snowflakedata and uploaded my files to it

5) Create a Storage Integration

Log in to Snowflake and create a storage integration

5a) I created an integration named azure_int using:

AZURE_TENANT_ID captured in step 1)

STORAGE_ALLOWED_LOCATIONS = ('azure://<account>.blob.core.windows.net/<container>/<path>/')
where <account> is azuresnowflakedemo and <container> is snowflakedata

create or replace storage integration azure_int
  type = external_stage
  storage_provider = 'AZURE'
  enabled = true
  azure_tenant_id = '9d900ac6-cee8-xxxx-xxxx-3e5065bd8f70'
  storage_allowed_locations = ('azure://azuresnowflakedemo.blob.core.windows.net/snowflakedata/')
;

5b) Execute the DESCRIBE STORAGE INTEGRATION command to retrieve AZURE_CONSENT_URL and AZURE_MULTI_TENANT_APP_NAME.
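For example, using the integration created in step 5a):

desc storage integration azure_int;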

5c) Open the URL in the AZURE_CONSENT_URL column and click the Accept button on the Microsoft permissions request page.

5d) Note down the portion of the AZURE_MULTI_TENANT_APP_NAME property_value that comes before the underscore (_), which here is otfo3asnowflakepacint

6) Grant Snowflake access to the Azure storage locations

In the Azure storage account's Access Control (IAM), add a role assignment and select either of the following:

Storage Blob Data Reader — grants read access only.
Storage Blob Data Contributor — grants read and write access.

Select the service principal noted in step 5d) and add it to the storage account.

7) Create an External Stage to access the files in the Azure storage container

7a) Back in Snowflake, create a stage using the Azure URL and the storage integration

create or replace stage my_azure_stage
  storage_integration = azure_int
  url = 'azure://azuresnowflakedemo.blob.core.windows.net/snowflakedata/'
;

7b) List the stage to see the files loaded in the Azure container
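Using the stage created in step 7a):

list @my_azure_stage;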

8) Copy files from Azure to a Snowflake table
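The COPY examples below load into a table named mycsvtable, which is assumed to already exist. Here is a minimal sketch of such a table; the column names and types are placeholders for a sample pipe-delimited CSV and should be adjusted to match your files:

create or replace table mycsvtable (
  id integer,      -- placeholder column
  name string,     -- placeholder column
  value string     -- placeholder column
);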

8a) Using the stage:

copy into mycsvtable
from @my_azure_stage
file_format = (type = csv skip_header = 1 field_delimiter = '|');

8b) Using the URL:

copy into mycsvtable
from 'azure://azuresnowflakedemo.blob.core.windows.net/snowflakedata/'
storage_integration = azure_int
file_format = (type = csv skip_header = 1 field_delimiter = '|');
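To check the files for parsing errors before actually loading anything, COPY also supports Snowflake's VALIDATION_MODE option. A minimal sketch using the stage from step 7a):

copy into mycsvtable
from @my_azure_stage
file_format = (type = csv skip_header = 1 field_delimiter = '|')
validation_mode = 'RETURN_ERRORS';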

Possible issues:

Failed to access remote file: access denied. Please check your credentials

If you are able to list the files in the stage but get the above error when querying the staged files or executing the COPY INTO command, and all the above steps were followed correctly, check the network access rules (a sample stage query appears at the end of this section).

Go to the storage account you want to secure and locate the Networking settings under Security + networking.

By default, storage accounts accept connections from clients on any network. But if a different option is selected, such as ‘Enabled from selected virtual networks and IP addresses’ or ‘Disabled’, change it back to ‘Enabled from all networks’ to allow traffic from all networks.
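For reference, a direct query against the staged files, the kind of statement that can hit the error above, looks like the sketch below; the file format name my_pipe_csv and the column positions are assumptions for the sample pipe-delimited files:

-- named file format matching the sample pipe-delimited files (name is an assumption)
create or replace file format my_pipe_csv
  type = csv field_delimiter = '|' skip_header = 1;

-- query the first two columns of the staged files through the external stage
select t.$1, t.$2
from @my_azure_stage (file_format => 'my_pipe_csv') t;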

References: https://docs.snowflake.com/en/user-guide/data-load-azure-config.html

Follow and Clap if you like the content and feel free to ask if you have any questions in the comments. I will be more than happy to assist and guide you.
