Snowpipe setup to Azure Container

Snowflake Wiki
4 min read · Dec 3, 2022

Here is a step-by-step guide for beginners to establish Snowpipe connectivity between Snowflake and Azure blob storage containers. If you are looking for the Azure UI navigation needed to configure the setup (which is not in the Snowflake documentation), then you are in the right place.

1) Set up Snowflake connectivity with the Azure container

https://medium.com/@snowflakewiki/connecting-snowflake-to-azure-container-448793503b1a
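
That article covers creating the storage integration and external stage that the pipe in step 6) will use. For context, here is a minimal sketch of those objects; my_azure_int, the container name, and the tenant ID are illustrative placeholders, while my_azure_stage matches the stage referenced in step 6):

create storage integration my_azure_int
enabled = true
type = external_stage
storage_provider = 'AZURE'
azure_tenant_id = '<your-tenant-id>'
storage_allowed_locations = ('azure://azuresnowflakedemo.blob.core.windows.net/<container>/');

create stage my_azure_stage
storage_integration = my_azure_int
url = 'azure://azuresnowflakedemo.blob.core.windows.net/<container>/';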

2) Create a storage queue

I created a storage queue named snowflake-queue.

Make a note of the URL of the queue you created: https://azuresnowflakedemo.queue.core.windows.net/snowflake-queue
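
If you prefer scripting over the portal, the same queue can be created with the Azure CLI. A sketch, assuming azuresnowflakedemo is the storage account from step 1) and your logged-in identity has queue data access:

az storage queue create \
  --name snowflake-queue \
  --account-name azuresnowflakedemo \
  --auth-mode login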

3) Create an Event Grid subscription

I created an event subscription named snowflake-event with the following details:

System Topic Name: azuresnowflakedemo
Filter to Event Types: Blob Created
Endpoint Type: Storage Queues

Under the endpoint, select the existing queue created in step 2), then create the event subscription.
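
This step can also be scripted with the Azure CLI. A sketch, where <sub-id>, my-rg, and <region> are placeholders for your subscription ID, resource group, and region:

# Create the system topic on the storage account (the portal does this implicitly)
az eventgrid system-topic create \
  --name azuresnowflakedemo \
  --resource-group my-rg \
  --location <region> \
  --topic-type microsoft.storage.storageaccounts \
  --source /subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/azuresnowflakedemo

# Subscribe the queue from step 2) to Blob Created events
az eventgrid system-topic event-subscription create \
  --name snowflake-event \
  --resource-group my-rg \
  --system-topic-name azuresnowflakedemo \
  --included-event-types Microsoft.Storage.BlobCreated \
  --endpoint-type storagequeue \
  --endpoint /subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/azuresnowflakedemo/queueservices/default/queues/snowflake-queue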

4) Create a Notification Integration

Log in to Snowflake and create a notification integration.

4a) I created an integration named my_notification_int using the values below:

AZURE_TENANT_ID, captured as part of step 1)

AZURE_STORAGE_QUEUE_PRIMARY_URI, captured as part of step 2)

create notification integration my_notification_int
enabled = true
type = queue
notification_provider = azure_storage_queue
azure_storage_queue_primary_uri = 'https://azuresnowflakedemo.queue.core.windows.net/snowflake-queue'
azure_tenant_id = '9d900ac6-xxxx-xxxx-xxxx-3e5067bd8f70';

4b) Execute the DESCRIBE NOTIFICATION INTEGRATION command to retrieve AZURE_CONSENT_URL and AZURE_MULTI_TENANT_APP_NAME.
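
For example, run it against the integration created in 4a):

desc notification integration my_notification_int;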

4c) Open the URL in the AZURE_CONSENT_URL column and click the Accept button on the Microsoft permissions request page.

4d) Note down the portion of the AZURE_MULTI_TENANT_APP_NAME property_value before the underscore (_), which here is mzhzchsnowflakepacint.

5) Grant Snowflake access to the storage queue

In the Azure storage account's Access Control (IAM), add a role assignment and select the Storage Queue Data Contributor role.

Select the service principal noted in step 4d) and add it to the storage account.
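
If you prefer to script this as well, a sketch with the Azure CLI, where <sub-id> and my-rg are placeholders and the display name comes from step 4d):

# Resolve the Snowflake app's service principal object id by its display name
# (older CLI versions expose this field as objectId instead of id)
SP_ID=$(az ad sp list --display-name mzhzchsnowflakepacint --query "[0].id" -o tsv)

# Grant it Storage Queue Data Contributor on the storage account
az role assignment create \
  --role "Storage Queue Data Contributor" \
  --assignee-object-id $SP_ID \
  --assignee-principal-type ServicePrincipal \
  --scope /subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/azuresnowflakedemo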

6) Create Pipe

6a) Back in Snowflake, create a pipe using the stage and the notification integration:

create pipe mypipe
auto_ingest = true
integration = 'MY_NOTIFICATION_INT'
as
copy into mycsvtable
from @my_azure_stage
file_format = (type = csv skip_header = 1 field_delimiter = '|');

Note: The integration name must be in all uppercase.
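
Auto-ingest only picks up files that land in the container after the pipe exists. If the container already holds files you want loaded, you can optionally backfill by refreshing the pipe, which queues files staged within the last 7 days:

alter pipe mypipe refresh;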

7) Verify the pipe and check that data is loading through it

7a) Check that the pipe is in the RUNNING state:

select system$pipe_status('mypipe');

7b) Upload a new file to Azure blob storage and validate that it is processed by the pipe:

select *
from table(information_schema.copy_history(table_name=>'MYCSVTABLE', start_time=> dateadd(hours, -1, current_timestamp())));
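
As a final sanity check, you can also query the target table from step 6) directly:

select count(*) from mycsvtable;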

Reference: https://docs.snowflake.com/en/user-guide/data-load-azure-config.html

Clap if you like the content, and feel free to ask any questions in the comments. I will be more than happy to assist and guide you.

Snowflake Wiki

Snowflake Basics | Features | New releases | Tricks & Tips | SnowPro Certifications | Solutions | Knowledge Sharing | ~~~ By satitiru (Snowflake DataSuperHero)