This section describes the fields required to configure a Snowflake connection in the Data Integrity Suite.
| Field | Description |
|---|---|
| Name | Specifies the name of the data connection. This is a mandatory field. Example: SalesDataConnection |
| Description | Allows you to describe the purpose of this data connection. Example: Connection to the sales data warehouse for quarterly reporting. |
| Agent to access data | Select the agent that connects to your database from the dropdown. The agent provides secure access to data within your network. |
| Account | The unique identifier for your Snowflake account. It can be your organization name or a specific account ID needed to connect to Snowflake. This is a mandatory field, and only the <orgname>-<account_name> format is valid in the Data Integrity Suite. Example: Corp-data_warehouse. Note: When entering the account, exclude https:// or http:// from the URL. |
| Role | Specifies the Snowflake role for the connection, which defines access to databases, schemas, and tables. The role must have read and write permissions to work with the Data Integrity Suite. This is a mandatory field. Note: When integrating the Data Integrity Suite with Snowflake, read and write permissions are required only if the workspace includes Quality. If the connection is used solely for data cataloging, these permissions are not needed. |
| Warehouse | Specifies the name of the Snowflake warehouse to be used for query execution. This is a mandatory field. Example: compute_wh. For recommendations on warehouse sizing for Data Quality rules, refer to Recommendation for warehouse sizing for Snowflake datasource. |
| Database | Specifies the name of the database to connect to on the designated host server. This is a mandatory field. Example: sales_db |
| Advanced Options | Displayed only if your workspace is enabled with the Advanced connectivity bundle. For advanced configuration, refer to Advanced connectivity bundle. |
Note: During configuration, the Data Integrity Suite creates an internal stage in Snowflake and stores Java files in it for Data Quality pipeline execution.
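
If you want to sanity-check the account identifier, role, warehouse, and database values before entering them in the connection form, a minimal sketch using the snowflake-connector-python package is shown below. This example is not part of the Data Integrity Suite configuration, and all identifiers used (Corp-data_warehouse, dis_service_user, dis_role, compute_wh, sales_db) are illustrative placeholders.

```python
# Minimal sketch: verify Snowflake connection values outside the
# Data Integrity Suite using the snowflake-connector-python package.
import snowflake.connector

conn = snowflake.connector.connect(
    account="Corp-data_warehouse",  # <orgname>-<account_name> format, no https://
    user="dis_service_user",        # hypothetical service user
    password="********",
    role="dis_role",                # role with read/write if the workspace includes Quality
    warehouse="compute_wh",
    database="sales_db",
)

try:
    cur = conn.cursor()
    # Confirm the session picked up the intended role, warehouse, and database.
    cur.execute("SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE(), CURRENT_DATABASE()")
    print(cur.fetchone())

    # After the Data Integrity Suite completes configuration, the internal
    # stage it creates should appear in this listing.
    cur.execute("SHOW STAGES IN DATABASE sales_db")
    for row in cur.fetchall():
        print(row[1])  # stage name is the second column of SHOW STAGES output
finally:
    conn.close()
```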