Snowflake
Connect your Varicent ELT and Snowflake accounts to import your data.
Before you begin
Before you can create a connection in Varicent ELT to import data from Snowflake, you need a Snowflake account.
Supported data sources
This connector can import data from the databases that your Snowflake role can access within your Snowflake account.
Using OAuth authentication
If you want to use the OAuth authentication method for connecting Varicent ELT to your Snowflake account, you must first create a security integration object.
Important
Only users with the ACCOUNTADMIN role can create a security integration object. Snowflake does not allow the following roles to log in through OAuth: ACCOUNTADMIN, SECURITYADMIN, and ORGADMIN.
1. Open your Snowflake instance.
2. In Snowflake, enter the following query to create the security integration object:
CREATE OR REPLACE SECURITY INTEGRATION <name> TYPE=OAUTH OAUTH_CLIENT=CUSTOM OAUTH_REDIRECT_URI='<redirect URI>' OAUTH_CLIENT_TYPE='CONFIDENTIAL' OAUTH_REFRESH_TOKEN_VALIDITY = <1 ~ 90 days in seconds> OAUTH_ISSUE_REFRESH_TOKENS=TRUE ENABLED=TRUE;
Replace the following variables with the correct information for your instance:
Table 20. Snowflake OAuth variables

Variable | Description
---|---
Name | Enter the name of the security integration.
Redirect URI | Enter the import redirect URI of the Varicent ELT application. For NA: https://app.symon.ai/importRedirect. For EU: https://eu.symon.ai/importRedirect
OAUTH_REFRESH_TOKEN_VALIDITY | Enter an integer that specifies how long the refresh token is valid, in seconds. This value is used to expire the refresh token periodically.
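For example, the allowed range of 1 to 90 days works out to 86,400 to 7,776,000 seconds; a quick conversion check:

```sql
-- Seconds in 1 day and in 90 days (the range allowed for OAUTH_REFRESH_TOKEN_VALIDITY)
SELECT 1  * 24 * 60 * 60 AS min_validity_seconds,  -- 86400
       90 * 24 * 60 * 60 AS max_validity_seconds;  -- 7776000
```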
Learn more about creating a security integration.
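As a filled-in sketch of the CREATE SECURITY INTEGRATION statement, the example below uses a hypothetical integration name, the NA redirect URI, and the 90-day maximum refresh token validity; substitute your own values:

```sql
-- Hypothetical example values; replace with your own.
CREATE OR REPLACE SECURITY INTEGRATION varicent_elt_oauth
  TYPE = OAUTH
  OAUTH_CLIENT = CUSTOM
  OAUTH_REDIRECT_URI = 'https://app.symon.ai/importRedirect'  -- NA redirect URI
  OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
  OAUTH_REFRESH_TOKEN_VALIDITY = 7776000  -- 90 days in seconds
  OAUTH_ISSUE_REFRESH_TOKENS = TRUE
  ENABLED = TRUE;
```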
Caution
Re-running the query with the same security integration name replaces the existing security object and rotates the existing client ID and client secret. Any connections tied to the security integration object become invalid, and you must edit the Snowflake connection tied to the security integration with the new client ID and client secret pair.
3. Enter the following query to get the client ID and secret:
SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('<security integration name in capital letters>');
This query returns values for OAUTH_CLIENT_ID, OAUTH_CLIENT_SECRET, and OAUTH_CLIENT_SECRET_2. You can use either OAUTH_CLIENT_SECRET or OAUTH_CLIENT_SECRET_2 as your client secret.
Note
If the refresh token expires, you must edit the connection and reauthorize. The Client ID and Client Secret can be retrieved with the query in Step 3.
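Because SYSTEM$SHOW_OAUTH_CLIENT_SECRETS returns its result as a single JSON string, you can also pull out the individual fields directly in SQL. A sketch, assuming a hypothetical integration named VARICENT_ELT_OAUTH:

```sql
-- Extract the client ID and one of the client secrets from the JSON result.
-- VARICENT_ELT_OAUTH is a hypothetical name; use your integration name in capital letters.
SELECT
  PARSE_JSON(SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('VARICENT_ELT_OAUTH')):OAUTH_CLIENT_ID::STRING     AS client_id,
  PARSE_JSON(SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('VARICENT_ELT_OAUTH')):OAUTH_CLIENT_SECRET::STRING AS client_secret;
```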
Import
With the correct permissions in place, you can now import your data into Varicent ELT.
1. In Varicent ELT, go to the Data tab.
2. Click Import data.
3. In the Third party sources section, click Snowflake.
4. Click New connection.
5. On the Connect to your Database page, enter the following information to connect:
Table 21. Snowflake

Field | Description
---|---
Connector name | The name of the connection.
Snowflake URL | The URL to connect to Snowflake. You can use the account URL copied from the Snowflake app, or you can use https://<org_name>-<account_name>.snowflakecomputing.com. Note: org_name is the name of your Snowflake organization; account_name is the unique name of your account within your organization.
Snowflake role | The role in Snowflake, such as <user>_role. Note: The Snowflake role is an optional parameter. If you don't provide the role, the connection uses the default role.
Warehouse | The name of the Snowflake warehouse your data runs out of.
Database name | The name of the Snowflake database.
Authentication method | Select either the Basic standard authentication method or the OAuth authentication method.
Basic standard authentication method | Enter your Snowflake username, such as admin, and password.
OAuth authentication method | Enter your Snowflake OAuth Client ID and Client Secret to connect to your Snowflake connector. Note: Before connecting with OAuth, set up your Snowflake environment with the security integration object.
Use Storage integration | Select the checkbox to use storage integration. For more information, see Step 7.
6. Click Next.
7. If you want to use storage integration, you must create or replace your storage integration in Snowflake. For more information, see: Snowflake storage integration.
8. On the Select data page, in the Directory field, select the directory for the data set. Click Next.
9. For high-performance data imports, switch the toggle on to import using S3 unload.
Note
Using S3 unload to import data may incur data transfer costs. To review the costs, see the Data Transfer section of the Snowflake pricing guide.
10. On the Import data page, review your data and select the data to obfuscate (if any). Click Next.
11. On the Filter data page, optionally click Filter data to refine your data. Filter the rows in your data by selecting the column to filter, the filter category, and the value or text. Click Import data to continue.
Note
For more information about how to configure data filters, see: Filter data.
12. On the Review page, review the data sets that you are importing.
13. Click Import data.
The data is available on the My Data page.
Snowflake storage integration
If you enabled storage integration during configuration, you must create or replace the integration within Snowflake.
Important
Only users with the ACCOUNTADMIN role, or a role with the global CREATE INTEGRATION privilege, can create a storage integration object. For more information on Snowflake storage integration, see: Configuring a Snowflake storage integration to access Amazon S3.
1. Open your Snowflake instance.
2. In Snowflake, enter and run the following query to create the storage integration object:
CREATE STORAGE INTEGRATION <integration_name> TYPE = EXTERNAL_STAGE STORAGE_PROVIDER = 'S3' ENABLED = TRUE STORAGE_AWS_ROLE_ARN = '<iam_role_arn>' STORAGE_ALLOWED_LOCATIONS = ('<protocol>://<bucket>/<path>/', '<protocol>://<bucket>/<path>/');
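As an illustration, the statement above filled in with hypothetical values (a made-up integration name, IAM role ARN, and s3:// bucket path); substitute your own:

```sql
-- Hypothetical example values; replace with your own role ARN and bucket paths.
CREATE STORAGE INTEGRATION varicent_s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/varicent-elt-access'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/exports/');
```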
3. Run the following query to grant the USAGE privilege on the storage integration to the Snowflake role used for this connection:
GRANT USAGE ON INTEGRATION <integration_name> TO ROLE <role_name>;
4. Run the following query to retrieve the STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID:
DESCRIBE INTEGRATION <integration_name>;
5. In Varicent ELT, on the Connect to your Database page, enter your information in the following input fields:
Table 22. Snowflake input table fields

Field | Description
---|---
Principal | Enter the STORAGE_AWS_IAM_USER_ARN value.
External ID | Enter the STORAGE_AWS_EXTERNAL_ID value.
Storage integration name | Enter the name of your storage integration.
Usage examples
After creating the connection, you can view a list of available tables under your database based on the role. You can search for and select the specific table data that you want to analyze, complete the configuration steps, and start the import.
When the import completes, your data is available in Varicent ELT, where you can view it and start analyzing it in a pipe.
Security and privacy
We use two authentication practices to connect with Snowflake:
Basic standard username and password authentication, with passwords stored securely in encrypted form.
OAuth authentication with Client ID and Client Secret.
Error handling
The following errors may appear during the import process:
Error | Description | Solution |
---|---|---|
| There is an incorrect warehouse name. | To fix this error, enter the correct warehouse name. |
| There is an incorrect database name. | To fix this error, enter the correct database name. |
| This error occurs if a | To fix this error, see your Snowflake account and make the appropriate updates. |
| This error occurs during Snowflake SQL compilation, when the SQL script tries to access tables that don't exist or that it doesn't have access to. | To fix this error, see your Snowflake account and make the appropriate updates. |
| This error occurs when the imported data has too many columns. This is not a performance issue on the Varicent ELT side; it is caused by the Snowflake Python connector (maintained by Snowflake) pulling in too much data in parallel, which results in a memory issue. Based on testing results, Varicent ELT sets a limit of 150 columns. | To fix this error, see your Snowflake account and make the appropriate updates. |
| This error occurs when a direct URL is passed. Snowflake doesn't allow a direct URL to be passed. | To fix this error, use storage integration, which defines the locations that data can be passed to. |