Internal and External Staging in Snowflake

Snowflake’s flexible staging architecture makes it easy to load and unload data from a wide variety of sources—including Microsoft Azure. Whether you’re performing a simple file load or designing a robust ELT pipeline, understanding how to configure both internal and external stages is crucial for managing data at scale.
In this article, we’ll explore what internal and external stages are, how to configure them for Azure Blob Storage, and when to use each—complete with SQL examples and best practices.
A stage in Snowflake is a named storage location used to load data into, or unload data out of, Snowflake tables. Stages fall into two categories:
- Internal Stages: Managed by Snowflake and located within your Snowflake account.
- External Stages: Pointers to cloud storage you manage, such as Azure Blob Storage, AWS S3, or Google Cloud Storage.
Internal stages themselves come in three varieties (a quick reference sketch follows this list):
- Table Stage – Automatically created for each table.
- User Stage – Automatically created for each user.
- Named Stage – Manually created and reusable across sessions and users.
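Each variety has its own reference syntax. A minimal sketch (my_table is a placeholder; the named stage is created in the next step):

```sql
-- Table stage: auto-created per table, referenced with @%
LIST @%my_table;

-- User stage: auto-created per user, referenced with @~
LIST @~;

-- Named stage: referenced by its own name
LIST @my_internal_stage;
```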
Creating and using a named internal stage takes only a few statements:

```sql
-- Create a named internal stage
CREATE STAGE my_internal_stage;

-- Upload a local file to the internal stage
-- (PUT runs from SnowSQL or another client; it gzip-compresses the file by default)
PUT file://mydata.csv @my_internal_stage;

-- Load data from the stage into a table
COPY INTO my_table
FROM @my_internal_stage/mydata.csv
FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"');
```
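You can also inspect staged data before loading it. A small sketch, assuming the file uploaded above (staged files are parsed as CSV by default, and columns are addressed positionally):

```sql
-- Peek at the first two columns of every file in the stage
SELECT $1, $2
FROM @my_internal_stage;
```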
Use internal stages when you want:
- Temporary or ad hoc data loads
- ETL pipelines managed fully inside Snowflake
- No cloud credentials to manage

For temporary loads especially, clean up staged files after ingestion, as sketched below.
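A minimal cleanup sketch, reusing the stage and file names from the earlier example:

```sql
-- Remove already-loaded files so they are not ingested twice
REMOVE @my_internal_stage PATTERN = '.*mydata.*';

-- Alternatively, let COPY delete files it loads successfully
COPY INTO my_table
FROM @my_internal_stage
FILE_FORMAT = (TYPE = 'CSV')
PURGE = TRUE;
```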
To stage data in Azure Blob Storage, first prepare the storage account:
- Log in to the Azure Portal
- Create a new Blob Storage container
- Note the container URL and generate an Azure SAS token (Snowflake stages accept SAS tokens, not raw Storage Account Keys; use the account key to generate a SAS token, or skip tokens entirely with a storage integration, covered later)
Then create the external stage. Option 1 authenticates with a SAS token embedded in the stage definition:

```sql
CREATE STAGE azure_stage_sas
  URL = 'azure://mystorageaccount.blob.core.windows.net/mycontainer'
  CREDENTIALS = (
    AZURE_SAS_TOKEN = '<your-sas-token>'
  )
  FILE_FORMAT = (TYPE = 'CSV');
```
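Once the stage exists, a quick sanity check confirms Snowflake can reach the container (assuming it already holds files):

```sql
-- Returns the files visible to the SAS token
LIST @azure_stage_sas;
```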
Option 2 authenticates through a storage integration (created later in this article), which keeps credentials out of your SQL entirely:

```sql
CREATE STAGE azure_stage_int
  URL = 'azure://mystorageaccount.blob.core.windows.net/mycontainer'
  STORAGE_INTEGRATION = azure_int
  FILE_FORMAT = (TYPE = 'CSV');
```
💡 A STORAGE_INTEGRATION gives Snowflake access to Azure through a Snowflake-managed service principal, so no SAS tokens or keys ever appear in your SQL.
Loading from an external stage looks just like loading from an internal one:

```sql
COPY INTO my_table
FROM @azure_stage_sas/myfile.csv
FILE_FORMAT = (TYPE = 'CSV');
```
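For bulk ingestion, a pattern-based load is usually more practical. A sketch, assuming your files end in .csv:

```sql
-- Load every CSV in the container in one pass
COPY INTO my_table
FROM @azure_stage_sas
PATTERN = '.*[.]csv'
FILE_FORMAT = (TYPE = 'CSV');
```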
Unloading (exporting) runs in the opposite direction:

```sql
COPY INTO @azure_stage_sas/myfile_export.csv
FROM my_table
FILE_FORMAT = (TYPE = 'CSV')
OVERWRITE = TRUE;
```

Note that Snowflake treats the target path as a prefix and may split the output across several numbered files.
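If you need exactly one file with that name, add SINGLE = TRUE; a sketch:

```sql
COPY INTO @azure_stage_sas/myfile_export.csv
FROM my_table
FILE_FORMAT = (TYPE = 'CSV')
OVERWRITE = TRUE
SINGLE = TRUE
MAX_FILE_SIZE = 5368709120;  -- raise the per-file cap to the 5 GB maximum
```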
So when should you use each? A quick comparison:

| Use Case | Internal Stage | External Stage (Azure) |
|---|---|---|
| Quick testing or dev | ✅ Yes | ❌ Not needed |
| Secure and managed by Snowflake | ✅ Yes | ❌ Customer-managed |
| Centralized data lake or external apps | ❌ No | ✅ Ideal |
| Large-scale ingestion from Azure Data Factory or Synapse | ❌ Limited | ✅ Recommended |
| Multi-cloud or hybrid architectures | ❌ No | ✅ Yes |
Instead of embedding credentials in every stage, create a storage integration to manage access through a Snowflake-managed Azure service principal:
```sql
CREATE STORAGE INTEGRATION azure_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = '<your-azure-ad-tenant-id>'  -- required for Azure integrations
  STORAGE_ALLOWED_LOCATIONS = ('azure://mystorageaccount.blob.core.windows.net/mycontainer');
```
Reference this integration in your CREATE STAGE command (as in the azure_stage_int example above) to avoid any direct use of SAS tokens or keys.
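Before the integration can read the container, you must grant consent in Azure. Snowflake surfaces the details you need:

```sql
-- Returns AZURE_CONSENT_URL and AZURE_MULTI_TENANT_APP_NAME, among other properties
DESC STORAGE INTEGRATION azure_int;
```

Open the consent URL, approve the Snowflake application in your tenant, and grant it an appropriate role on the container (for example, Storage Blob Data Reader for loading, or Storage Blob Data Contributor if you also unload).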
Configuring internal and external staging in Snowflake is a foundational skill for anyone working with cloud data pipelines. If your data is stored in Azure, external stages give you powerful flexibility for large-scale, secure data movement. For smaller tasks or fully Snowflake-managed pipelines, internal stages are lightweight and efficient.
Mastering both gives you the agility to work across architectures—from quick loads to enterprise-scale ELT.
#Snowflake #AzureBlob #DataEngineering #CloudData #SnowflakeStages #SQL #ETL #CloudStorage #DataPipeline #AzureIntegration