Who this guide is for

This guide is for data leads and data engineers at B2B SaaS companies who have an existing Snowflake data warehouse — often with dbt for transformations — and want to add cross-system churn prediction without replacing their current infrastructure. If you’re evaluating whether to build churn models entirely in dbt or use a dedicated platform, see our build-vs-buy comparison first.

This guide assumes you have:

What Eru handles vs what stays in your warehouse

The most common question from data leads considering Eru: “We’ve already invested in Snowflake and dbt. What exactly does Eru add, and what does it replace?”

The answer: Eru replaces nothing in your warehouse. It adds three capabilities that are expensive to build and maintain in dbt.

| Capability | Stays in your Snowflake + dbt stack | Eru handles |
|---|---|---|
| Raw data ingestion (ELT) | Fivetran, Airbyte, Stitch, or custom connectors load data into Snowflake | |
| Data cleaning and staging | dbt staging models clean, deduplicate, and standardise raw data | |
| Custom feature engineering | dbt models calculate rolling usage averages, cohort metrics, proprietary features | |
| MRR / ARR calculations | Your dbt models define how MRR is calculated from billing data | |
| Historical analytics and reporting | Looker, Metabase, Hex, or custom dashboards query Snowflake directly | |
| Cross-system entity resolution | | AI-powered matching of customer identities across Snowflake, CRM, billing, and support |
| Billing–CRM reconciliation | | Continuous reconciliation of Stripe/Chargebee against Salesforce/HubSpot |
| Cross-system churn signal correlation | | Correlating usage decline + billing discrepancy + support escalation for the same account |
| Real-time event monitoring | | Webhook-based monitoring of billing events, CRM changes, and support escalations |
| Data freshness and drift detection | | Alerts when warehouse data lags behind live SaaS systems or when transformations produce unexpected results |

In short: your Snowflake + dbt stack handles batch analytics, custom features, and historical reporting. Eru handles the cross-system connectivity, entity resolution, and real-time signal correlation that are most expensive to build and maintain as custom dbt models.
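To make "cross-system signal correlation" concrete, here is a minimal, hypothetical sketch of the idea: an account is flagged only when independent signals from usage, billing, and support line up. The function name, thresholds, and account IDs are illustrative, not Eru's actual implementation.

```python
# Hypothetical sketch of cross-system churn signal correlation.
# Thresholds, signal names, and account IDs are illustrative, not Eru's API.

def correlate_signals(usage, billing, support):
    """Flag accounts where multiple independent churn signals coincide.

    usage:   {account_id: usage_change_pct}  (negative = decline)
    billing: {account_id: has_billing_discrepancy}
    support: {account_id: open_escalations}
    """
    flagged = []
    for account_id, change in usage.items():
        signals = []
        if change <= -0.30:                    # >30% usage decline
            signals.append("usage_decline")
        if billing.get(account_id):
            signals.append("billing_discrepancy")
        if support.get(account_id, 0) > 0:
            signals.append("support_escalation")
        if len(signals) >= 2:                  # require corroboration
            flagged.append((account_id, signals))
    return flagged

alerts = correlate_signals(
    usage={"acct_1": -0.45, "acct_2": -0.10, "acct_3": -0.50},
    billing={"acct_1": True},
    support={"acct_3": 2},
)
# acct_1: usage decline + billing discrepancy; acct_3: usage decline + escalation
```

The design point is corroboration: any one of these signals alone is noisy, but building the joins that put all three next to each other for the same account is exactly the cross-system work that is expensive in dbt.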

Step 1: Create a read-only Snowflake role and user

Eru connects to Snowflake with a read-only user. It never writes data, modifies schemas, or executes DDL statements. Create a dedicated role and user with minimal permissions:

```sql
-- Create a dedicated role for Eru
CREATE ROLE eru_reader;

-- Grant warehouse access (use your existing warehouse or create a dedicated one)
GRANT USAGE ON WAREHOUSE compute_wh TO ROLE eru_reader;

-- Grant access to your analytics database
GRANT USAGE ON DATABASE analytics TO ROLE eru_reader;
GRANT USAGE ON ALL SCHEMAS IN DATABASE analytics TO ROLE eru_reader;

-- Grant SELECT on existing and future tables (covers new dbt models)
GRANT SELECT ON ALL TABLES IN DATABASE analytics TO ROLE eru_reader;
GRANT SELECT ON ALL VIEWS IN DATABASE analytics TO ROLE eru_reader;
GRANT SELECT ON FUTURE TABLES IN DATABASE analytics TO ROLE eru_reader;
GRANT SELECT ON FUTURE VIEWS IN DATABASE analytics TO ROLE eru_reader;

-- Create the Eru user
CREATE USER eru_user
  PASSWORD = '...'
  DEFAULT_ROLE = eru_reader
  DEFAULT_WAREHOUSE = compute_wh;
GRANT ROLE eru_reader TO USER eru_user;
```

If your dbt models span multiple databases (e.g., a separate raw database for Fivetran data and an analytics database for transformed models), grant USAGE and SELECT on each database.

Optional: dedicated warehouse for Eru

For cost isolation, create a dedicated X-Small warehouse for Eru’s read queries. This makes it easy to monitor and cap Eru’s Snowflake compute costs separately from your other workloads.

```sql
CREATE WAREHOUSE eru_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE
  INITIALLY_SUSPENDED = TRUE;

GRANT USAGE ON WAREHOUSE eru_wh TO ROLE eru_reader;
```

Step 2: Choose an authentication method

Eru supports three authentication methods for Snowflake. Choose the one that fits your security requirements.

Option A: Username and password

The simplest option. Enter the username and password in Eru’s connection form. Credentials are stored encrypted in AWS Secrets Manager and never sent to LLMs or logged.

Best for: initial setup, development environments, teams without strict key management policies.

Option B: Key pair authentication (recommended)

Generate an RSA key pair and assign the public key to the Snowflake user. Upload the private key to Eru. This eliminates password rotation overhead and provides stronger authentication.

```sql
-- Generate key pair (run locally, not in Snowflake)
-- openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out eru_rsa_key.p8 -nocrypt
-- openssl rsa -in eru_rsa_key.p8 -pubout -out eru_rsa_key.pub

-- Assign public key to Snowflake user (run in Snowflake)
ALTER USER eru_user SET RSA_PUBLIC_KEY='MIIBIjANBgkqh...';
```

Best for: production environments, teams with key management infrastructure.

Option C: OAuth (external OAuth)

Use your identity provider (Okta, Azure AD, or similar) for token-based access via Snowflake’s external OAuth integration. Eru refreshes tokens automatically.

Best for: enterprise environments with centralised identity management and SSO requirements.

Step 3: Connect in Eru

In Eru, navigate to Integrations → Snowflake and enter your connection details.

Eru validates the connection by running a test query (SELECT 1) and verifying it can read schema metadata from the specified database. If validation fails, check that the eru_reader role has USAGE on the warehouse and database.
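The validation step can be pictured with a small sketch. Here `run_query` is a stand-in for a live Snowflake cursor, and the function and messages are hypothetical, not Eru's real interface; the point is the two checks in order: a trivial query, then readable schema metadata.

```python
# Illustrative sketch of connection validation: run a trivial test query,
# then confirm schema metadata is readable. `run_query` stands in for a
# real Snowflake connection; names and messages are hypothetical.

def validate_connection(run_query, database):
    try:
        if run_query("SELECT 1") != [(1,)]:
            return "test query returned unexpected result"
    except Exception as exc:
        return f"cannot execute queries: {exc} (check USAGE on the warehouse)"
    schemas = run_query(
        f"SELECT schema_name FROM {database}.information_schema.schemata"
    )
    if not schemas:
        return f"no readable schemas in {database} (check USAGE on the database)"
    return "ok"

# Stub standing in for a live connection:
def fake_runner(sql):
    if sql == "SELECT 1":
        return [(1,)]
    return [("staging",), ("marts",)]

print(validate_connection(fake_runner, "analytics"))  # prints: ok
```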

Step 4: Schema mapping

Once connected, Eru scans your Snowflake schema and discovers all accessible tables, views, and materialised dbt models. It builds a model of your warehouse structure and proposes entity mappings.

What Eru discovers

  • Tables and views — all accessible tables with column names, data types, and sample data (anonymised, PII-redacted)
  • dbt model layers — staging (stg_), intermediate (int_), and mart (dim_, fct_) models with lineage relationships preserved
  • Relationships — foreign keys, join patterns, and entity relationships inferred from naming conventions, column types, and data sampling
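Layer discovery leans on dbt's common prefix conventions. A toy classifier makes the rule explicit; this mirrors the conventions listed above and is not Eru's actual discovery code.

```python
# Toy classifier for dbt model layers based on common prefix conventions
# (stg_/int_/dim_/fct_). Illustrative only, not Eru's discovery code.

LAYER_PREFIXES = {
    "stg_": "staging",
    "int_": "intermediate",
    "dim_": "mart (dimension)",
    "fct_": "mart (fact)",
}

def classify_model(name: str) -> str:
    for prefix, layer in LAYER_PREFIXES.items():
        if name.startswith(prefix):
            return layer
    return "unclassified"   # falls back to sampling-based inference

assert classify_model("stg_stripe__customers") == "staging"
assert classify_model("fct_mrr") == "mart (fact)"
assert classify_model("customers_raw") == "unclassified"
```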

Entity mapping examples

| Your Snowflake table / dbt model | Eru entity |
|---|---|
| dim_customers or stg_stripe__customers | Customer |
| fct_subscriptions or stg_stripe__subscriptions | Subscription |
| fct_invoices or stg_stripe__invoices | Invoice |
| fct_mrr or int_mrr_monthly | MRR Metric |
| fct_product_usage or stg_mixpanel__events | Usage Event |

Eru proposes mappings automatically. You review and confirm them, adjusting any that don’t match your schema conventions. Once confirmed, these mappings feed Eru’s cross-system entity resolution — linking your Snowflake customer records to their counterparts in Salesforce, Stripe, and Intercom.
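A simplified sketch of how such proposals could be derived from model names alone, using the examples in the table above. The keyword rules are illustrative; in practice matching also considers column types and sampled data, which is why proposals still need human review.

```python
# Hypothetical sketch of proposing entity mappings from model names.
# Keyword rules are illustrative; real matching also uses column types
# and data sampling, so unmatched names are left for manual review.

ENTITY_KEYWORDS = [
    ("customer", "Customer"),
    ("subscription", "Subscription"),
    ("invoice", "Invoice"),
    ("mrr", "MRR Metric"),
    ("usage", "Usage Event"),
    ("event", "Usage Event"),
]

def propose_entity(model_name: str):
    name = model_name.lower()
    for keyword, entity in ENTITY_KEYWORDS:
        if keyword in name:
            return entity
    return None  # no confident proposal: user maps it manually

assert propose_entity("dim_customers") == "Customer"
assert propose_entity("int_mrr_monthly") == "MRR Metric"
assert propose_entity("stg_mixpanel__events") == "Usage Event"
```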

Step 5: Using Eru alongside dbt

This is the critical section for data teams with an existing dbt setup. Eru does not replace your dbt models — it reads from them.

How existing dbt churn models feed Eru’s scoring engine

If you already have dbt models that calculate churn-predictive features, Eru uses them as input signals. Common examples:

Usage aggregations

dbt models that calculate rolling averages of logins, feature usage, session depth, or API calls. Eru reads these as usage signals and correlates them with billing and CRM data.

Payment and billing features

dbt models that track payment failure rates, discount percentages, contract values, or MRR changes. Eru cross-references these with live Stripe data to catch discrepancies.
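The cross-referencing idea can be sketched in a few lines: compare the MRR your warehouse computed against the live billing figure and surface accounts that diverge. Tolerances, account IDs, and the function name are invented for illustration, not Eru's implementation.

```python
# Illustrative sketch of reconciling warehouse MRR against live billing
# data. Tolerance and account IDs are made up; not Eru's implementation.

def find_discrepancies(warehouse_mrr, stripe_mrr, tolerance=0.01):
    """Return accounts whose warehouse MRR differs from live Stripe MRR
    by more than `tolerance` (relative), or is missing entirely."""
    issues = {}
    for account_id, wh_value in warehouse_mrr.items():
        live = stripe_mrr.get(account_id)
        if live is None:
            issues[account_id] = "missing in Stripe"
        elif abs(wh_value - live) > tolerance * max(live, 1):
            issues[account_id] = f"warehouse={wh_value} vs stripe={live}"
    return issues

issues = find_discrepancies(
    warehouse_mrr={"acct_1": 500.0, "acct_2": 99.0},
    stripe_mrr={"acct_1": 450.0, "acct_2": 99.0},
)
# acct_1 is flagged (500 vs 450); acct_2 matches and is not flagged
```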

Engagement scores

Custom health scores or engagement metrics calculated in dbt. Eru treats these as one signal among many, enriching them with CRM and support signals your warehouse doesn’t see.

Cohort and segmentation models

Customer segments, cohort assignments, or lifecycle stages defined in dbt. Eru uses these to contextualise churn signals — a usage drop means different things for a new customer vs a mature account.
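A toy example of that contextualisation: the same percentage drop crosses the alert threshold for a mature account but not for one still onboarding. The thresholds here are invented for illustration; in practice the context comes from your own dbt cohort models.

```python
# Toy example of contextualising a usage drop by lifecycle stage.
# Thresholds are invented for illustration; the real context comes from
# the customer's own dbt cohort and lifecycle models.

STAGE_THRESHOLDS = {
    "onboarding": -0.50,  # new accounts: usage is volatile, alert late
    "mature": -0.20,      # established accounts: a 20% drop is unusual
}

def is_concerning(usage_change_pct: float, lifecycle_stage: str) -> bool:
    threshold = STAGE_THRESHOLDS.get(lifecycle_stage, -0.30)
    return usage_change_pct <= threshold

# The same 25% drop reads differently per stage:
assert not is_concerning(-0.25, "onboarding")
assert is_concerning(-0.25, "mature")
```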

What Eru adds on top of your dbt models

Your dbt models are excellent at batch analytics within your warehouse. Eru adds the capabilities that are most expensive to replicate in dbt.

Data flow

Here’s how data flows through the combined Snowflake + dbt + Eru stack:

```
Source Systems              Your Stack                 Eru
(Stripe, Salesforce,        (Snowflake + dbt)          (Cross-System Layer)
 Intercom, Product DB)
     |                           |                          |
     |--- ELT (Fivetran) ------->|                          |
     |                           |--- dbt models ---------->|
     |                           |    (usage, billing,      |
     |                           |     engagement features) |
     |                           |                          |
     |--- Live events (webhooks) ------------------------->|
     |    (payment failures,                                |
     |     CRM changes,                                     |
     |     support escalations)                             |
                                                            |
                                                   Entity Resolution
                                                   Signal Correlation
                                                   Reconciliation
                                                            |
                                                   Cross-System
                                                   Churn Signals
                                                            |
                                                            v
                                                   Slack / Dashboard
```

Eru vs ChurnZero, Totango, Catalyst, and ClientSuccess for Snowflake users

If you’re evaluating Eru against dedicated customer success platforms, here’s how they compare specifically for teams with an existing Snowflake + dbt stack:

| Capability | ChurnZero | Totango | Catalyst | ClientSuccess | Eru |
|---|---|---|---|---|---|
| Reads from your Snowflake dbt models | Import only | Limited | No | No | Yes |
| Uses your dbt models as source of truth | No | No | No | No | Yes |
| Creates a separate data model / silo | Yes | Yes | Yes | Yes | No |
| Cross-system entity resolution | CRM-based | Built-in (limited) | CRM-based | CRM-based | AI-powered |
| Billing–CRM reconciliation | No | No | No | No | Yes |
| Real-time event monitoring (webhooks) | Product events only | Product events only | Limited | No | Billing + CRM + support |
| Transformation verification (dbt output vs source) | No | No | No | No | Yes |
| Data stays in your warehouse | No | No | No | No | Reads, never copies |

The key difference: ChurnZero, Totango, Catalyst, and ClientSuccess import data from your systems and maintain their own data model. For data teams that have spent months centralising data in Snowflake, this creates a “which number is right?” problem when metrics diverge. Eru reads from your warehouse and adds cross-system intelligence without creating a competing version of your data.

Security and data handling

What Eru reads from Snowflake

What Eru never does

Credential storage

Snowflake credentials are stored in AWS Secrets Manager with AES-256 encryption. They are only accessed by Eru’s runtime layer (which executes queries) and never sent to the AI agent layer (which plans and reasons). See our security documentation for the full LLM data boundary design.

Frequently asked questions

Does Eru replace our dbt models?

No. Eru reads from your dbt models and treats them as input signals. Your staging models, intermediate transformations, and mart tables remain unchanged. Eru adds cross-system capabilities (entity resolution, billing reconciliation, real-time signal correlation) on top of your existing warehouse analytics.

How much Snowflake compute does Eru use?

Eru’s read queries are lightweight — primarily schema metadata queries, aggregations, and small sample queries. On a dedicated X-Small warehouse with AUTO_SUSPEND = 60, most teams see $50–$200/month in additional Snowflake compute. Creating a dedicated eru_wh warehouse makes these costs easy to track and cap.
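That range follows from simple warehouse arithmetic, sketched below. The model assumes an X-Small warehouse consumes 1 credit per running hour (Snowflake's published rate for that size) and that credit prices of roughly $2-4 vary by edition and region, so treat this as a rough estimate, not a quote.

```python
# Back-of-envelope estimate of Eru's Snowflake compute cost on a dedicated
# X-Small warehouse. An X-Small consumes 1 credit/hour while running; with
# AUTO_SUSPEND = 60 you only pay while queries keep it awake. Credit price
# (~$2-4) varies by Snowflake edition and region.

def monthly_cost(active_minutes_per_day, credit_price_usd, days=30,
                 credits_per_hour=1.0):
    hours = active_minutes_per_day / 60 * days
    return hours * credits_per_hour * credit_price_usd

# e.g. ~1 hour of cumulative active time per day at $3/credit:
estimate = monthly_cost(active_minutes_per_day=60, credit_price_usd=3.0)
# = $90/month, inside the $50-$200 range quoted above
```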

Can Eru work without dbt?

Yes. Eru discovers and maps raw tables and views regardless of whether they were created by dbt. dbt models simply provide better-organised inputs because they follow consistent naming conventions and represent well-defined business entities.

What if our dbt models change?

Eru re-scans your schema periodically and detects new or modified models. If a mapped model is renamed or dropped, Eru flags the broken mapping and prompts you to update it. The GRANT SELECT ON FUTURE TABLES permission ensures Eru can access new models without manual grants.
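The detection itself amounts to diffing schema snapshots against the confirmed mappings. A minimal sketch, with hypothetical function and field names:

```python
# Sketch of detecting broken mappings by diffing two schema scans.
# Function and field names are illustrative, not Eru's API.

def diff_schema(previous: set, current: set, mapped: set):
    """Compare two schema snapshots (sets of table names): report mapped
    models that disappeared and tables that are newly visible."""
    return {
        "broken_mappings": sorted(mapped - current),
        "new_tables": sorted(current - previous),
    }

report = diff_schema(
    previous={"dim_customers", "fct_mrr"},
    current={"dim_customers", "fct_mrr_v2", "fct_invoices"},
    mapped={"dim_customers", "fct_mrr"},
)
# fct_mrr was renamed, so it is flagged as a broken mapping;
# fct_mrr_v2 and fct_invoices show up as new tables to review
```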

Does Eru work with Snowflake on any cloud provider?

Yes. Eru supports Snowflake on AWS, Azure, and GCP. Enter your account identifier as provided by Snowflake (e.g., xy12345.us-east-1.aws, xy12345.west-us-2.azure, or xy12345.us-central1.gcp).

Related resources

Connect your Snowflake warehouse to Eru

Layer cross-system churn intelligence on your existing dbt models — first signals within days.

Book a Demo