dbt-core · tutorial · warehouse

Running dbt-core with Snowflake: Complete Setup Guide

Step-by-step guide to setting up dbt-core with Snowflake — user creation, role configuration, warehouse sizing, profiles.yml, and common pitfalls.

ModelDock Team · February 17, 2026 · 11 min read

Snowflake is one of the most popular warehouses for dbt-core projects, and for good reason. It's fast, it scales on demand, and it has a clean permission model. But getting dbt-core connected to Snowflake properly — with the right roles, the right warehouse config, and the right authentication — takes a bit of careful setup.

This guide walks through the entire dbt Snowflake setup from scratch. We'll create a dedicated user and role, configure profiles.yml, set up key pair authentication for production, run your first model, and troubleshoot the errors you're most likely to hit.

Prerequisites

Before you start, make sure you have:

  • A Snowflake account with ACCOUNTADMIN or SECURITYADMIN access (you'll need it to create roles and users)
  • Python 3.9+ installed
  • pip (comes with Python)
  • A terminal you're comfortable working in

If you don't have a Snowflake account yet, the free trial gives you $400 in credits — more than enough to work through this guide and build a real project.

Installing dbt-snowflake

The dbt-snowflake package includes dbt-core as a dependency, so you only need one install. Use a virtual environment to keep things clean:

# Create and activate a virtual environment
python -m venv dbt-env
source dbt-env/bin/activate
# Install dbt with the Snowflake adapter
pip install dbt-snowflake
# Verify the installation
dbt --version

You should see both dbt-core and dbt-snowflake in the output. If you're on an M-series Mac and hit build errors, make sure you have the Xcode command line tools installed (xcode-select --install).

Snowflake User and Role Setup

Don't connect dbt with your personal Snowflake login. Create a dedicated user with a dedicated role — it matters for security, for auditing, and for avoiding accidental privilege escalation.

Log into Snowflake (the web UI or SnowSQL) as ACCOUNTADMIN and run the following:

-- Create a role for dbt
CREATE ROLE IF NOT EXISTS DBT_ROLE;
-- Create a warehouse for dbt
CREATE WAREHOUSE IF NOT EXISTS DBT_WH
WITH WAREHOUSE_SIZE = 'X-SMALL'
AUTO_SUSPEND = 60
AUTO_RESUME = TRUE
INITIALLY_SUSPENDED = TRUE;
-- Grant the role access to the warehouse
GRANT USAGE ON WAREHOUSE DBT_WH TO ROLE DBT_ROLE;
-- Create a database for your project
CREATE DATABASE IF NOT EXISTS ANALYTICS;
-- Grant the role access to the database
GRANT USAGE ON DATABASE ANALYTICS TO ROLE DBT_ROLE;
GRANT CREATE SCHEMA ON DATABASE ANALYTICS TO ROLE DBT_ROLE;
-- Create schemas (dbt will create schemas too, but this is good practice)
CREATE SCHEMA IF NOT EXISTS ANALYTICS.RAW;
CREATE SCHEMA IF NOT EXISTS ANALYTICS.STAGING;
CREATE SCHEMA IF NOT EXISTS ANALYTICS.MARTS;
-- Grant schema-level permissions
GRANT ALL ON SCHEMA ANALYTICS.RAW TO ROLE DBT_ROLE;
GRANT ALL ON SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
GRANT ALL ON SCHEMA ANALYTICS.MARTS TO ROLE DBT_ROLE;
-- Grant permissions on future objects in those schemas
GRANT ALL ON FUTURE TABLES IN SCHEMA ANALYTICS.RAW TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE VIEWS IN SCHEMA ANALYTICS.RAW TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE TABLES IN SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE VIEWS IN SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE DBT_ROLE;
GRANT ALL ON FUTURE VIEWS IN SCHEMA ANALYTICS.MARTS TO ROLE DBT_ROLE;
-- Create the dbt user
CREATE USER IF NOT EXISTS DBT_USER
PASSWORD = 'a-strong-password-here'
DEFAULT_ROLE = DBT_ROLE
DEFAULT_WAREHOUSE = DBT_WH
MUST_CHANGE_PASSWORD = FALSE;
-- Assign the role to the user
GRANT ROLE DBT_ROLE TO USER DBT_USER;

A few notes on the SQL above:

  • **X-SMALL warehouse** is fine for development and small projects. You can always scale it up later.
  • **AUTO_SUSPEND = 60** shuts down the warehouse after 60 seconds of inactivity. You only pay for compute time.
  • **INITIALLY_SUSPENDED = TRUE** means the warehouse won't start (and charge you) until you actually run a query.
  • Future grants ensure that new tables and views created by dbt automatically get the right permissions.
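Before wiring up dbt, it's worth confirming the grants took effect. A quick smoke test, run while logged in as DBT_USER (the table name here is just a throwaway):

```sql
-- Run as DBT_USER: verify the role can use the warehouse and create objects
USE ROLE DBT_ROLE;
USE WAREHOUSE DBT_WH;
USE DATABASE ANALYTICS;
CREATE TABLE STAGING.PERMISSIONS_SMOKE_TEST (id INT);
DROP TABLE STAGING.PERMISSIONS_SMOKE_TEST;
```

If any of these statements fails, fix the corresponding grant now — it's much easier to debug here than through dbt's error messages.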

profiles.yml Configuration

dbt needs a profiles.yml file to know how to connect to your warehouse. By default, dbt looks for this in ~/.dbt/profiles.yml.
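If you'd rather keep the file somewhere else — a per-project copy in CI, for example — dbt also honors the DBT_PROFILES_DIR environment variable (and the equivalent --profiles-dir CLI flag):

```shell
# Point dbt at a profiles.yml in the current directory instead of ~/.dbt
export DBT_PROFILES_DIR="$PWD"
```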

Password Authentication

The simplest setup for getting started:

# ~/.dbt/profiles.yml
my_project:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: "your-account-id"  # e.g., xy12345.us-east-1
      user: "DBT_USER"
      password: "{{ env_var('DBT_SNOWFLAKE_PASSWORD') }}"
      role: "DBT_ROLE"
      warehouse: "DBT_WH"
      database: "ANALYTICS"
      schema: "STAGING"
      threads: 4

Then set the environment variable:

export DBT_SNOWFLAKE_PASSWORD="a-strong-password-here"

Never hardcode passwords in profiles.yml. Always use env_var(). If you commit a password to Git, you'll have a bad day.

Finding Your Account Identifier

The account field trips people up. It's not your username or your organization name. In the Snowflake web UI, look at the URL:

  • If your URL is https://xy12345.us-east-1.snowflakecomputing.com, your account is xy12345.us-east-1
  • If you're on the newer Snowflake format like https://myorg-myaccount.snowflakecomputing.com, your account is myorg-myaccount

You can also run SELECT CURRENT_ACCOUNT(); in Snowflake to get it.
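If you only have the URL in front of you, a shell one-liner (just sed — a convenience, not part of dbt) strips the scheme and domain, leaving the account identifier. It works for both URL formats:

```shell
# Everything between https:// and .snowflakecomputing.com is the identifier
url="https://xy12345.us-east-1.snowflakecomputing.com"
echo "$url" | sed -e 's#^https://##' -e 's#\.snowflakecomputing\.com.*$##'
# prints: xy12345.us-east-1
```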

Key Pair Authentication

For production, key pair authentication is more secure than password auth. There's no password to rotate or send over the wire — the private key stays on your machine and signs a short-lived JWT for each connection.

# ~/.dbt/profiles.yml
my_project:
  target: prod
  outputs:
    prod:
      type: snowflake
      account: "your-account-id"
      user: "DBT_USER"
      role: "DBT_ROLE"
      warehouse: "DBT_WH"
      database: "ANALYTICS"
      schema: "STAGING"
      threads: 4
      authenticator: snowflake_jwt
      private_key_path: "{{ env_var('DBT_SNOWFLAKE_PRIVATE_KEY_PATH') }}"
      private_key_passphrase: "{{ env_var('DBT_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE') }}"

Key Pair Authentication Setup

Here's how to generate the key pair and register it with Snowflake.

Generate the Key Pair

# Generate an encrypted private key
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes-256-cbc -inform PEM -out rsa_key.p8
# You'll be prompted for a passphrase — remember it
# Generate the public key from the private key
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub

Register the Public Key in Snowflake

-- Get the public key content (without the header/footer lines)
-- Copy the base64 content from rsa_key.pub, then run:
ALTER USER DBT_USER SET RSA_PUBLIC_KEY='MIIBIjANBgkqhk...your-public-key-here...';

Make sure you strip the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- lines and any newlines when pasting the key.
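Rather than stripping by hand, you can let the shell do it. This sketch uses a throwaway key so the commands are self-contained — substitute your real rsa_key.pub:

```shell
# Throwaway key for illustration only -- use your real rsa_key.pub in practice.
openssl genrsa -out /tmp/demo_key.pem 2048 2>/dev/null
openssl rsa -in /tmp/demo_key.pem -pubout -out /tmp/demo_key.pub 2>/dev/null
# grep drops the BEGIN/END lines; tr removes the newlines between base64 chunks
PUBKEY=$(grep -v "PUBLIC KEY" /tmp/demo_key.pub | tr -d '\n')
echo "ALTER USER DBT_USER SET RSA_PUBLIC_KEY='${PUBKEY}';"
```

The echoed statement can be pasted straight into a Snowflake worksheet.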

Set the Environment Variables

export DBT_SNOWFLAKE_PRIVATE_KEY_PATH="/path/to/rsa_key.p8"
export DBT_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE="your-passphrase"

Verify the Connection

dbt debug

If everything is configured correctly, you'll see a series of green "OK" messages. If not, the error messages are usually descriptive enough to point you in the right direction.

Creating Your First Model and Running It

With the connection working, let's actually build something.

Initialize a dbt Project

If you don't have one yet:

dbt init my_project
cd my_project

Follow the prompts — select Snowflake as your adapter. This creates the project scaffolding including dbt_project.yml and an example model.

Create a Simple Model

Create a file at models/staging/stg_example.sql:

-- models/staging/stg_example.sql
with source as (
    select
        1 as id,
        'Alice' as name,
        current_timestamp() as created_at
)
select * from source

Run It

# Test the connection first
dbt debug
# Install any packages
dbt deps
# Run your model
dbt run
# Run tests (if you have any defined)
dbt test
# Or do both at once
dbt build

You should see output like:

Running with dbt=1.9.0
Found 1 model, 0 sources, 0 tests, 0 snapshots
Concurrency: 4 threads (target='dev')
1 of 1 START sql view model STAGING.stg_example .............. [RUN]
1 of 1 OK created sql view model STAGING.stg_example ......... [SUCCESS 1 in 2.31s]
Finished running 1 view model in 0 hours 0 minutes and 4.52 seconds (4.52s).
Completed successfully.

Check Snowflake — you should see the view in ANALYTICS.STAGING.

Common Errors and Fixes

Here are the issues you'll almost certainly hit at some point.

"Object does not exist, or operation cannot be performed"

This usually means the role doesn't have the right permissions. Double check:

-- Verify role grants
SHOW GRANTS TO ROLE DBT_ROLE;
-- Make sure the user is using the right role
SHOW GRANTS TO USER DBT_USER;

The fix is almost always a missing GRANT statement. Go back to the role setup section and make sure you didn't skip any grants.

"No active warehouse selected"

Your user doesn't have a default warehouse set, or the role doesn't have USAGE on the warehouse. Fix it:

ALTER USER DBT_USER SET DEFAULT_WAREHOUSE = DBT_WH;
GRANT USAGE ON WAREHOUSE DBT_WH TO ROLE DBT_ROLE;

Also make sure warehouse is set in your profiles.yml.

Connection Timeout

If dbt debug hangs or times out:

  • Check your account identifier is correct (this is the most common cause)
  • Verify your network can reach Snowflake (some corporate networks block the connection)
  • If you're behind a proxy, set HTTP_PROXY and HTTPS_PROXY environment variables
  • Try adding connect_retries: 3 and connect_timeout: 30 to your profile
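For that last option, the retry keys sit alongside the other connection settings in your target — a minimal sketch of the dev profile:

```yaml
my_project:
  outputs:
    dev:
      type: snowflake
      # ... account, user, and auth settings as before ...
      connect_retries: 3    # retry a failed connection up to 3 times
      connect_timeout: 30   # seconds to wait for each attempt
```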

Key Pair Errors

"JWT token is invalid" — The public key registered in Snowflake doesn't match your private key. Regenerate the public key from the private key and re-register it.

"Could not deserialize key data" — The private key format is wrong. Make sure you used PKCS#8 format (the openssl pkcs8 step). dbt-snowflake doesn't support PKCS#1 format.

"Passphrase is incorrect" — Self-explanatory, but double check there are no trailing newlines or spaces in your environment variable.
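A stray trailing newline is invisible in echo output, so count the bytes instead. A quick check, assuming the export from the key pair section:

```shell
# A 15-character passphrase should print exactly 15 -- one extra byte
# usually means a trailing newline snuck into the variable
printf '%s' "$DBT_SNOWFLAKE_PRIVATE_KEY_PASSPHRASE" | wc -c
```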

"Insufficient privileges to operate on schema"

The role can use the database but can't create objects in the schema. You need:

GRANT ALL ON SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
-- Or more specifically:
GRANT CREATE TABLE ON SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;
GRANT CREATE VIEW ON SCHEMA ANALYTICS.STAGING TO ROLE DBT_ROLE;

Performance Tips

Warehouse Sizing

Snowflake warehouse sizes double in compute power (and cost) with each step up:

| Size | Credits/Hour | Good For |
| --- | --- | --- |
| X-Small | 1 | Development, small models |
| Small | 2 | Light production workloads |
| Medium | 4 | Medium production, complex transformations |
| Large | 8 | Heavy workloads, large datasets |

Start with X-Small and scale up only if your runs are too slow. You can even use different warehouses for dev and prod:

my_project:
  outputs:
    dev:
      warehouse: "DBT_WH_XS"
      # ...
    prod:
      warehouse: "DBT_WH_M"
      # ...

Auto-Suspend Settings

Set AUTO_SUSPEND aggressively. Every second the warehouse runs idle, you're burning credits:

-- Suspend after 60 seconds of inactivity (minimum)
ALTER WAREHOUSE DBT_WH SET AUTO_SUSPEND = 60;

For dbt workloads, 60 seconds is almost always the right setting. dbt runs queries back-to-back, so the warehouse stays active during a run and suspends immediately after.

Query Tags

Add query tags to your dbt runs so you can track costs and performance in Snowflake:

# dbt_project.yml
models:
  my_project:
    +query_tag: "dbt_{{ target.name }}"

Then in Snowflake, you can query SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY filtered by query tag to see exactly how much your dbt runs cost.
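As a sketch of that cost query — using elapsed time as a rough proxy, since per-query credit attribution isn't directly exposed, and keeping in mind that ACCOUNT_USAGE data lags by up to ~45 minutes:

```sql
-- Daily dbt workload by query tag (assumes the query_tag pattern above)
select
    date_trunc('day', start_time)   as run_date,
    query_tag,
    count(*)                        as query_count,
    sum(total_elapsed_time) / 1000  as total_seconds
from snowflake.account_usage.query_history
where query_tag like 'dbt_%'
group by 1, 2
order by run_date desc;
```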

Threads

The threads setting in profiles.yml controls how many models dbt runs in parallel. A good starting point:

  • Development: 4 threads
  • Production: 8-16 threads (depends on warehouse size and model complexity)

More threads means more concurrent queries, which means faster runs — but only if your warehouse can handle the load.

Production Considerations

When you're ready to move beyond local development:

  • Use key pair authentication instead of passwords. No passwords to rotate, no risk of leaks.
  • Separate dev and prod targets with different warehouses, databases, or schemas.
  • Pin your dbt version in a requirements.txt so production matches development.
  • Run dbt in CI/CD — use dbt build in your pipeline to catch errors before they reach production.
  • Set up monitoring — you need to know when runs fail, not find out the next morning when dashboards are stale.
  • Use Snowflake resource monitors to set spending limits and avoid surprise bills.
-- Create a resource monitor with a monthly credit quota
CREATE RESOURCE MONITOR DBT_MONITOR
WITH CREDIT_QUOTA = 100
FREQUENCY = MONTHLY
START_TIMESTAMP = IMMEDIATELY
TRIGGERS
ON 75 PERCENT DO NOTIFY
ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE DBT_WH SET RESOURCE_MONITOR = DBT_MONITOR;

Skip the Infrastructure Work

Setting up dbt-core with Snowflake isn't hard, but running it reliably in production is a different story. You need scheduling, monitoring, credential management, log storage, and a way to handle failures — all on top of maintaining the infrastructure itself.

ModelDock handles all of that. Connect your Git repo, enter your Snowflake credentials (encrypted with AES-256-GCM), set a schedule, and your dbt project runs in an isolated container with full logging and artifact storage. It supports password auth and key pair auth out of the box.

Free during open beta. No credit card required.

Ready to run dbt-core in production?

ModelDock handles scheduling, infrastructure, and credential management so you don't have to.

Start For Free