# Running dbt-core with Snowflake: Complete Setup Guide
Step-by-step guide to setting up dbt-core with Snowflake — user creation, role configuration, warehouse sizing, profiles.yml, and common pitfalls.
Snowflake is one of the most popular warehouses for dbt-core projects, and for good reason. It's fast, it scales on demand, and it has a clean permission model. But getting dbt-core connected to Snowflake properly — with the right roles, the right warehouse config, and the right authentication — takes a bit of careful setup.
This guide walks through the entire dbt Snowflake setup from scratch. We'll create a dedicated user and role, configure profiles.yml, set up key pair authentication for production, run your first model, and troubleshoot the errors you're most likely to hit.
## Prerequisites
Before you start, make sure you have:
- A Snowflake account with `ACCOUNTADMIN` or `SECURITYADMIN` access (you'll need it to create roles and users)
- Python 3.9+ installed
- pip (comes with Python)
- A terminal you're comfortable working in
If you don't have a Snowflake account yet, the free trial gives you $400 in credits — more than enough to work through this guide and build a real project.
## Installing dbt-snowflake

The `dbt-snowflake` package includes dbt-core as a dependency, so you only need one install. Use a virtual environment to keep things clean:

```shell
# Create and activate a virtual environment
python -m venv dbt-env
source dbt-env/bin/activate

# Install dbt with the Snowflake adapter
pip install dbt-snowflake

# Verify the installation
dbt --version
```

You should see both dbt-core and dbt-snowflake in the output. If you're on an M-series Mac and hit build errors, make sure you have the Xcode command line tools installed (`xcode-select --install`).
## Snowflake User and Role Setup

Don't connect dbt with your personal Snowflake account. Create a dedicated user with a dedicated role. This is important for security, auditing, and avoiding accidental privilege escalation.

Log into Snowflake (the web UI or SnowSQL) as `ACCOUNTADMIN` and run the following:

```sql
-- Create a role for dbt
CREATE ROLE IF NOT EXISTS DBT_ROLE;

-- Create a warehouse for dbt
CREATE WAREHOUSE IF NOT EXISTS DBT_WH
  WITH WAREHOUSE_SIZE = 'X-SMALL'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE
  INITIALLY_SUSPENDED = TRUE;

-- Grant the role access to the warehouse
GRANT USAGE ON WAREHOUSE DBT_WH TO ROLE DBT_ROLE;

-- Create a database for your project
CREATE DATABASE IF NOT EXISTS ANALYTICS;

-- Grant the role access to the database
GRANT USAGE ON DATABASE ANALYTICS TO ROLE DBT_ROLE;
GRANT CREATE SCHEMA ON DATABASE ANALYTICS TO ROLE DBT_ROLE;

-- Create schemas (dbt will create schemas too, but this is good practice)
CREATE SCHEMA IF NOT EXISTS ANALYTICS.RAW;
CREATE SCHEMA IF NOT EXISTS ANALYTICS.STAGING;
CREATE SCHEMA IF NOT EXISTS ANALYTICS.MARTS;

-- Grant schema-level permissions
GRANT ALL ON SCHEMA ANALYTICS.RAW TO ROLE DBT_ROLE;
GRANT ALL ON SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
GRANT ALL ON SCHEMA ANALYTICS.MARTS TO ROLE DBT_ROLE;

-- Grant permissions on future objects in those schemas
GRANT ALL ON FUTURE TABLES IN SCHEMA ANALYTICS.RAW TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE VIEWS IN SCHEMA ANALYTICS.RAW TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE TABLES IN SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE VIEWS IN SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE VIEWS IN SCHEMA ANALYTICS.MARTS TO ROLE DBT_ROLE;

-- Create the dbt user
CREATE USER IF NOT EXISTS DBT_USER
  PASSWORD = 'a-strong-password-here'
  DEFAULT_ROLE = DBT_ROLE
  DEFAULT_WAREHOUSE = DBT_WH
  MUST_CHANGE_PASSWORD = FALSE;

-- Assign the role to the user
GRANT ROLE DBT_ROLE TO USER DBT_USER;
```
A few notes on the SQL above:
- **`X-SMALL` warehouse** is fine for development and small projects. You can always scale it up later.
- **`AUTO_SUSPEND = 60`** shuts down the warehouse after 60 seconds of inactivity, so you only pay for compute time you actually use.
- **`INITIALLY_SUSPENDED = TRUE`** means the warehouse won't start (and charge you) until you actually run a query.
- Future grants ensure that new tables and views created by dbt automatically get the right permissions.
## profiles.yml Configuration

dbt needs a `profiles.yml` file to know how to connect to your warehouse. By default, dbt looks for this in `~/.dbt/profiles.yml`.
### Password Authentication

The simplest setup for getting started:

```yaml
# ~/.dbt/profiles.yml
my_project:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: "your-account-id"  # e.g., xy12345.us-east-1
      user: "DBT_USER"
      password: "{{ env_var('DBT_SNOWFLAKE_PASSWORD') }}"
      role: "DBT_ROLE"
      warehouse: "DBT_WH"
      database: "ANALYTICS"
      schema: "STAGING"
      threads: 4
```
Then set the environment variable:
```shell
export DBT_SNOWFLAKE_PASSWORD="a-strong-password-here"
```

Never hardcode passwords in `profiles.yml`. Always use `env_var()`. If you commit a password to Git, you'll have a bad day.
### Finding Your Account Identifier

The `account` field trips people up. It's not your username or your organization name. In the Snowflake web UI, look at the URL:

- If your URL is `https://xy12345.us-east-1.snowflakecomputing.com`, your account is `xy12345.us-east-1`
- If you're on the newer Snowflake format like `https://myorg-myaccount.snowflakecomputing.com`, your account is `myorg-myaccount`

You can also run `SELECT CURRENT_ACCOUNT();` in Snowflake to get it.
### Key Pair Authentication

For production, key pair authentication is more secure than passwords. No passwords to rotate, no risk of leaking credentials in environment variables.

```yaml
# ~/.dbt/profiles.yml
my_project:
  target: prod
  outputs:
    prod:
      type: snowflake
      account: "your-account-id"
      user: "DBT_USER"
      role: "DBT_ROLE"
      warehouse: "DBT_WH"
      database: "ANALYTICS"
      schema: "STAGING"
      threads: 4
      authenticator: snowflake_jwt
      private_key_path: "{{ env_var('DBT_SNOWFLAKE_PRIVATE_KEY_PATH') }}"
      private_key_passphrase: "{{ env_var('DBT_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE') }}"
```
## Key Pair Authentication Setup

Here's how to generate the key pair and register it with Snowflake.

### Generate the Key Pair

```shell
# Generate an encrypted private key
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes-256-cbc -inform PEM -out rsa_key.p8
# You'll be prompted for a passphrase — remember it

# Generate the public key from the private key
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
```
### Register the Public Key in Snowflake

```sql
-- Get the public key content (without the header/footer lines)
-- Copy the base64 content from rsa_key.pub, then run:
ALTER USER DBT_USER SET RSA_PUBLIC_KEY='MIIBIjANBgkqhk...your-public-key-here...';
```

Make sure you strip the `-----BEGIN PUBLIC KEY-----` and `-----END PUBLIC KEY-----` lines and any newlines when pasting the key.
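Stripping the header and footer by hand is error-prone, so it's worth scripting. A minimal sketch using a stand-in PEM file (your real rsa_key.pub will contain different base64):

```shell
# Stand-in public key file for illustration only
cat > rsa_key.pub <<'EOF'
-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEF
AAOCAQ8AMIIBCgKCAQEA
-----END PUBLIC KEY-----
EOF

# Drop the BEGIN/END lines and join the base64 content into one line,
# ready to paste into the ALTER USER statement
grep -v '^-----' rsa_key.pub | tr -d '\n'
```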
### Set the Environment Variables

```shell
export DBT_SNOWFLAKE_PRIVATE_KEY_PATH="/path/to/rsa_key.p8"
export DBT_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE="your-passphrase"
```
### Verify the Connection

```shell
dbt debug
```
If everything is configured correctly, you'll see a series of green "OK" messages. If not, the error messages are usually descriptive enough to point you in the right direction.
## Creating Your First Model and Running It
With the connection working, let's actually build something.
### Initialize a dbt Project

If you don't have one yet:

```shell
dbt init my_project
cd my_project
```

Follow the prompts — select Snowflake as your adapter. This creates the project scaffolding including `dbt_project.yml` and an example model.
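`dbt_project.yml` is where project-wide defaults live. A minimal sketch, assuming you keep staging models as views (the `+materialized` key is standard dbt config; the `staging` folder layout is just this guide's convention):

```yaml
# dbt_project.yml (excerpt)
name: my_project
profile: my_project  # must match the profile name in profiles.yml

models:
  my_project:
    staging:
      +materialized: view  # build staging models as views
```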
### Create a Simple Model

Create a file at `models/staging/stg_example.sql`:

```sql
-- models/staging/stg_example.sql
with source as (
    select
        1 as id,
        'Alice' as name,
        current_timestamp() as created_at
)

select * from source
```
### Run It

```shell
# Test the connection first
dbt debug

# Install any packages
dbt deps

# Run your model
dbt run

# Run tests (if you have any defined)
dbt test

# Or do both at once
dbt build
```
You should see output like:
```
Running with dbt=1.9.0
Found 1 model, 0 sources, 0 tests, 0 snapshots
Concurrency: 4 threads (target='dev')

1 of 1 START sql view model STAGING.stg_example .............. [RUN]
1 of 1 OK created sql view model STAGING.stg_example ......... [SUCCESS 1 in 2.31s]

Finished running 1 view model in 0 hours 0 minutes and 4.52 seconds (4.52s).
Completed successfully.
```
Check Snowflake — you should see the view in `ANALYTICS.STAGING`.
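To confirm from SQL rather than clicking through the UI, a quick check (object names assume the example model above):

```sql
-- List the views dbt created in the schema
SHOW VIEWS IN SCHEMA ANALYTICS.STAGING;

-- Or query the model directly
SELECT * FROM ANALYTICS.STAGING.STG_EXAMPLE;
```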
## Common Errors and Fixes
Here are the issues you'll almost certainly hit at some point.
### "Object does not exist, or operation cannot be performed"

This usually means the role doesn't have the right permissions. Double-check:

```sql
-- Verify role grants
SHOW GRANTS TO ROLE DBT_ROLE;

-- Make sure the user is using the right role
SHOW GRANTS TO USER DBT_USER;
```

The fix is almost always a missing `GRANT` statement. Go back to the role setup section and make sure you didn't skip any grants.
### "No active warehouse selected"

Your user doesn't have a default warehouse set, or the role doesn't have `USAGE` on the warehouse. Fix it:

```sql
ALTER USER DBT_USER SET DEFAULT_WAREHOUSE = DBT_WH;
GRANT USAGE ON WAREHOUSE DBT_WH TO ROLE DBT_ROLE;
```

Also make sure `warehouse` is set in your `profiles.yml`.
### Connection Timeout

If `dbt debug` hangs or times out:

- Check that your account identifier is correct (this is the most common cause)
- Verify your network can reach Snowflake (some corporate networks block the connection)
- If you're behind a proxy, set the `HTTP_PROXY` and `HTTPS_PROXY` environment variables
- Try adding `connect_retries: 3` and `connect_timeout: 30` to your profile
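Those last two settings live alongside the connection fields in your profile. A sketch (keep your existing account, user, and auth keys):

```yaml
# ~/.dbt/profiles.yml (excerpt)
my_project:
  outputs:
    dev:
      type: snowflake
      # ...existing connection settings...
      connect_retries: 3    # retry failed connection attempts
      connect_timeout: 30   # seconds to wait per attempt
```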
### Key Pair Errors

**"JWT token is invalid"** — The public key registered in Snowflake doesn't match your private key. Regenerate the public key from the private key and re-register it.

**"Could not deserialize key data"** — The private key format is wrong. Make sure you used PKCS#8 format (the `openssl pkcs8` step). dbt-snowflake doesn't support PKCS#1 format.

**"Passphrase is incorrect"** — Self-explanatory, but double-check that there are no trailing newlines or spaces in your environment variable.
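One way to catch stray whitespace is to count the bytes actually stored in the variable. A sketch with a hypothetical passphrase (`printf '%s'` adds no trailing newline, so the count should equal the passphrase length exactly):

```shell
# Hypothetical passphrase for illustration
export DBT_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE="my-passphrase"

# Byte count of the variable's exact contents; "my-passphrase" is
# 13 characters, so any other count means hidden whitespace
printf '%s' "$DBT_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE" | wc -c
```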
### "Insufficient privileges to operate on schema"

The role can use the database but can't create objects in the schema. You need:

```sql
GRANT ALL ON SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;

-- Or more specifically:
GRANT CREATE TABLE ON SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
GRANT CREATE VIEW ON SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
```
## Performance Tips

### Warehouse Sizing
Snowflake warehouse sizes double in compute power (and cost) with each step up:
| Size | Credits/Hour | Good For |
|---|---|---|
| X-Small | 1 | Development, small models |
| Small | 2 | Light production workloads |
| Medium | 4 | Medium production, complex transformations |
| Large | 8 | Heavy workloads, large datasets |
Start with X-Small and scale up only if your runs are too slow. You can even use different warehouses for dev and prod:
```yaml
my_project:
  outputs:
    dev:
      warehouse: "DBT_WH_XS"
      # ...
    prod:
      warehouse: "DBT_WH_M"
      # ...
```
### Auto-Suspend Settings

Set `AUTO_SUSPEND` aggressively. Every second the warehouse runs idle, you're burning credits:

```sql
-- Suspend after 60 seconds of inactivity (minimum)
ALTER WAREHOUSE DBT_WH SET AUTO_SUSPEND = 60;
```
For dbt workloads, 60 seconds is almost always the right setting. dbt runs queries back-to-back, so the warehouse stays active during a run and suspends immediately after.
### Query Tags

Add query tags to your dbt runs so you can track costs and performance in Snowflake:

```yaml
# dbt_project.yml
models:
  my_project:
    +query_tag: "dbt_{{ target.name }}"
```

Then in Snowflake, you can query `SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY` filtered by query tag to see exactly how much your dbt runs cost.
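For example, a rough runtime breakdown per tag over the last week (a sketch; note that ACCOUNT_USAGE views lag real time by up to a few hours):

```sql
-- Query count and total runtime per dbt query tag, last 7 days
SELECT
    query_tag,
    COUNT(*)                       AS query_count,
    SUM(total_elapsed_time) / 1000 AS total_seconds
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE query_tag LIKE 'dbt_%'
  AND start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
GROUP BY query_tag
ORDER BY total_seconds DESC;
```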
### Threads

The `threads` setting in `profiles.yml` controls how many models dbt runs in parallel. A good starting point:
- Development: 4 threads
- Production: 8-16 threads (depends on warehouse size and model complexity)
More threads means more concurrent queries, which means faster runs — but only if your warehouse can handle the load.
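Since `threads` is a per-target setting, dev and prod can use different values, mirroring the per-target warehouse pattern. A sketch:

```yaml
my_project:
  outputs:
    dev:
      threads: 4
      # ...
    prod:
      threads: 12
      # ...
```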
## Production Considerations
When you're ready to move beyond local development:
- Use key pair authentication instead of passwords. No passwords to rotate, no risk of leaks.
- Separate dev and prod targets with different warehouses, databases, or schemas.
- Pin your dbt version in a `requirements.txt` so production matches development.
- Run dbt in CI/CD — use `dbt build` in your pipeline to catch errors before they reach production.
- Set up monitoring — you need to know when runs fail, not find out the next morning when dashboards are stale.
- Use Snowflake resource monitors to set spending limits and avoid surprise bills.
```sql
-- Create a resource monitor with a monthly credit quota
CREATE RESOURCE MONITOR DBT_MONITOR
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 75 PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE DBT_WH SET RESOURCE_MONITOR = DBT_MONITOR;
```
## Skip the Infrastructure Work
Setting up dbt-core with Snowflake isn't hard, but running it reliably in production is a different story. You need scheduling, monitoring, credential management, log storage, and a way to handle failures — all on top of maintaining the infrastructure itself.
ModelDock handles all of that. Connect your Git repo, enter your Snowflake credentials (encrypted with AES-256-GCM), set a schedule, and your dbt project runs in an isolated container with full logging and artifact storage. It supports password auth and key pair auth out of the box.
Free during open beta. No credit card required.