I’m migrating an on‑prem SSAS 2019 Tabular model from SQL Server (local) to Snowflake. The model is deployed, and I’ve added a new legacy data source via 64‑bit ODBC. Processing a table in SSMS completes with “Success,” but 0 rows are processed. The same query returns rows in Snowflake (Snowsight and SnowSQL), so the data is available and connectivity is established.
Environment
- SSAS 2019 Tabular, on‑prem
- SSMS for processing
- Snowflake accessed via the 64‑bit Snowflake ODBC driver
- Provider: MSDASQL (Microsoft OLE DB Provider for ODBC Drivers), added as a legacy data source
- Driver: SnowflakeDSIIDriver
- Authentication: Key‑pair (Authenticator = SNOWFLAKE_JWT), private key on local filesystem
- Impersonation mode: ImpersonateServiceAccount
- SSAS service account has read access to the private key directory
Connection string (simplified)
Provider=MSDASQL;Driver=SnowflakeDSIIDriver;Server=<account/host>;
Database=<db>;Schema=<schema>;
User=<user>;Role=<role>;Warehouse=<warehouse>;
Authenticator=SNOWFLAKE_JWT;
JWT_private_key_file=<path-to-private-key>;
Password=<private-key-passphrase>
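To take MSDASQL and SSAS out of the picture, this is the minimal pyodbc sketch I can run on the SSAS host (ideally under the SSAS service account) to confirm that this exact ODBC path returns rows. It mirrors the parameter names and placeholders from the connection string above and talks to the ODBC driver directly, so there is no MSDASQL layer involved:

# check_odbc_path.py - run on the SSAS host, ideally as the SSAS service account,
# to confirm the Snowflake ODBC driver + key-pair auth returns rows outside SSAS.
import pyodbc

conn_str = (
    # parameter names copied from the SSAS connection string above; adjust if the
    # DSN uses uid/pwd-style keys instead
    "Driver=SnowflakeDSIIDriver;"
    "Server=<account/host>;"
    "Database=<db>;Schema=<schema>;"
    "User=<user>;Role=<role>;Warehouse=<warehouse>;"
    "Authenticator=SNOWFLAKE_JWT;"
    "JWT_private_key_file=<path-to-private-key>;"
    "Password=<private-key-passphrase>"
)

cn = pyodbc.connect(conn_str, autocommit=True)
cur = cn.cursor()

# Confirm the session actually picked up the user/role/warehouse from the connection string.
cur.execute("SELECT CURRENT_USER(), CURRENT_ROLE(), CURRENT_WAREHOUSE(), CURRENT_DATABASE(), CURRENT_SCHEMA()")
print(cur.fetchone())

# Same SELECT the partition uses (placeholder here).
cur.execute("SELECT COUNT(*) FROM <schema>.<table>")
print("row count:", cur.fetchone()[0])
cn.close()

If this prints the expected role/warehouse and a non-zero count when run as the service account, the problem is more likely in the MSDASQL/SSAS layer than in the driver or the key-pair authentication.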
What I’ve verified
- The Snowflake SQL query returns rows in Snowsight.
- The same query runs successfully via SnowSQL on the SSAS host.
- The SSAS service account has read permissions on the private key file and directory.
- The ODBC DSN and connection string resolve (no connection errors reported).
- The model deploys; changing the table’s data source to the Snowflake source applies without errors.
Symptoms
- Processing a single table in SSMS ends with “Success” but shows 0 rows processed.
- No obvious errors in SSAS logs tied to the processing event.
- The query works outside SSAS, but SSAS appears either not to issue the query at all or not to ingest the rows it returns.
Questions
- Why does SSAS processing report “Success” with 0 rows when querying Snowflake via ODBC (MSDASQL) with JWT key‑pair auth?
- Are there known limitations or required settings when using MSDASQL + Snowflake ODBC for SSAS Tabular?
- What diagnostics or logging (SSAS flight recorder, OLE DB/ODBC tracing) would reveal whether SSAS is issuing the query and receiving rows?
- Is there a recommended alternative provider/driver path (e.g., an OLE DB provider for Snowflake, ADO.NET via gateway, or different auth) that works reliably for SSAS Tabular?
Additional details and hypotheses
- Column type mapping: Could SSAS be silently discarding rows due to unsupported types or nullability mismatches? Any known mappings for VARIANT, TIMESTAMP_NTZ/LTZ/TZ, or large VARCHARs that break Tabular processing? (A cast-based probe is sketched after this list.)
- Rowset vs query mode: Does SSAS require specific ODBC settings (e.g., “Use Query Result Format” or “NoDescribeParam”) to return a rowset it can consume?
- Role/warehouse: The Snowflake role and warehouse are set in the connection string. Is SSAS ignoring them or overriding them via session parameters? (The pyodbc sketch above selects CURRENT_ROLE() and CURRENT_WAREHOUSE() to check what the session actually gets.)
- JWT integration: MSDASQL + SnowflakeDSIIDriver with SNOWFLAKE_JWT works in SnowSQL; does SSAS need the key in a specific format (PKCS#8 vs PKCS#12, encrypted vs unencrypted)? (A key-format check is sketched after this list.)
- Isolation/transactions: Are there SSAS-specific behaviors around ODBC cursors or fetch sizes that could produce an empty result set?
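On the type-mapping hypothesis, a probe along these lines (table and column names are placeholders, connection string as above) would show what the ODBC driver reports for each column of the partition query, and whether casting the awkward types away suddenly yields rows:

# type_mapping_probe.py - run the partition's SELECT over plain ODBC, inspect the
# column metadata the Snowflake driver reports, then retry with blunt casts.
import pyodbc

conn_str = "...same ODBC connection string as above..."
partition_sql = "SELECT * FROM <schema>.<table>"  # the partition's actual query
cast_sql = (
    "SELECT CAST(<variant_col> AS VARCHAR) AS variant_col, "
    "       CAST(<ts_col> AS TIMESTAMP_NTZ) AS ts_col "
    "FROM <schema>.<table>"
)

cn = pyodbc.connect(conn_str, autocommit=True)
cur = cn.cursor()

cur.execute(partition_sql)
# cursor.description: (name, type_code, display_size, internal_size, precision, scale, nullable)
for name, type_code, _, size, precision, scale, nullable in cur.description:
    print(name, type_code, size, precision, scale, nullable)
print("rows with original types:", len(cur.fetchall()))

cur.execute(cast_sql)
print("rows with everything cast to simple types:", len(cur.fetchall()))
cn.close()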
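For the key-format question, this local sanity check (using the cryptography package, not part of the SSAS setup; path and passphrase are placeholders) at least confirms which PEM container the key is in and that the passphrase opens it:

# key_format_check.py - confirm the private key is a PEM the passphrase opens and
# show which container it uses (Snowflake key-pair auth expects a PKCS#8 PEM).
from cryptography.hazmat.primitives import serialization

key_path = r"<path-to-private-key>"
passphrase = b"<private-key-passphrase>"

with open(key_path, "rb") as f:
    pem = f.read()

# "BEGIN ENCRYPTED PRIVATE KEY" = encrypted PKCS#8, "BEGIN PRIVATE KEY" = unencrypted
# PKCS#8, "BEGIN RSA PRIVATE KEY" = PKCS#1 (not what the Snowflake docs describe).
print("PEM header:", pem.splitlines()[0].decode())

# Raises if the file is not a readable PEM private key or the passphrase is wrong.
key = serialization.load_pem_private_key(pem, password=passphrase)
print("loaded OK:", type(key).__name__)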
What would help diagnose
- A sample working SSAS Tabular connection string to Snowflake (including any critical ODBC parameters).
- Guidance on enabling detailed SSAS processing logs and ODBC driver tracing for Snowflake to confirm query execution and row retrieval (a Snowflake-side QUERY_HISTORY check is sketched below).
- Known compatibility notes for SSAS 2019 + Snowflake ODBC via MSDASQL.
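On the diagnostics point, besides SSAS flight recorder/Profiler traces and ODBC driver tracing, one check I can do from the Snowflake side is to read QUERY_HISTORY right after kicking off a Process in SSMS, to see whether any query arrived from the service user at all and how many rows it produced (assuming my role can see that user's history; user name is a placeholder):

# query_history_probe.py - after starting a Process in SSMS, list the most recent
# Snowflake queries for the SSAS service user and how many rows each produced.
import pyodbc

conn_str = "...same ODBC connection string as above..."
cn = pyodbc.connect(conn_str, autocommit=True)
cur = cn.cursor()
cur.execute("""
    SELECT start_time, user_name, warehouse_name, execution_status,
           rows_produced, LEFT(query_text, 120) AS query_snippet
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 200))
    WHERE user_name = '<ssas-service-user>'
    ORDER BY start_time DESC
""")
for row in cur.fetchall():
    print(row)
cn.close()

If no query shows up during processing, SSAS never issued it (pointing at the MSDASQL layer); if a query shows up with rows_produced > 0, Snowflake is returning rows and they are being lost on the SSAS ingestion side.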
Expected vs Actual
- Expected: Processing runs the query against Snowflake and loads N rows into the table partition.
- Actual: Processing reports “Success,” but 0 rows are loaded. Query works elsewhere.