Installing Unravel for Snowflake
To install Unravel for Snowflake, complete the prerequisites and then follow the steps below to install and configure Unravel.
Unravel for Snowflake collects metadata through either a Real-time connection or a Snapshot connection to the database. Complete the prerequisites specific to your preferred connection type (Real-time or Snapshot) before proceeding with the installation, and configure Unravel for Snowflake according to the connection method you choose.
Download Unravel onto the VM instance that you have created.
You can install Unravel on Snowflake using the Interactive Precheck method or manually.
The setup command does the following:
Runs Precheck automatically to detect issues that could prevent a successful installation and suggests how to resolve them. Refer to Precheck filters for the expected value for each filter.
Lets you pass extra parameters to integrate the database of your choice.
The setup command lets you use the managed database shipped with Unravel or an external database. When you run the setup command without additional parameters, the Unravel-managed PostgreSQL database is used. Otherwise, you can specify any of the following databases, which are supported by Unravel, with the setup command:
MySQL (Unravel managed as well as external MySQL database)
MariaDB (Unravel managed as well as external MariaDB database)
PostgreSQL (External PostgreSQL)
Refer to Integrate database for details.
Lets you specify a separate path for the data directory other than the default path.
The Unravel data and configurations are located in the data directory. By default, the installer maintains the data directory under <Unravel installation directory>/data. You can change the data directory's default location by passing additional parameters with the setup command.
To install Unravel with the setup command, do the following:
Switch to the Unravel user.
su - <unravel user>
Notice
The Unravel user who owns the installation directory should run the setup command to install Unravel.
Run the setup command with any of the following databases (PostgreSQL, MySQL, MariaDB). Refer to setup options for all the additional parameters that you can pass with the setup command.
Run the setup command:
<unravel_installation_directory>/unravel/versions/<Unravel version>/setup --enable-snowflake
Precheck runs automatically when you run the setup command. Refer to Precheck filters for the expected value for each filter. Also, refer to the Precheck sample.
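For example, assuming Unravel is extracted under /opt/unravel and the version directory is 4.7.9.0 (both illustrative; use your actual paths), the command would look like this:
/opt/unravel/versions/4.7.9.0/setup --enable-snowflake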
Tip
Run --help with the setup command, or with any combination of setup options, for complete usage details.
<unravel_installation_directory>/unravel/versions/<Unravel version>/setup --help
Optionally, if you want to provide a different data directory, you can pass an extra parameter (--data-directory) with the setup command as shown below:
<unravel_installation_directory>/unravel/versions/<Unravel version>/setup --enable-snowflake --data-directory /the/data/directory
Similarly, you can configure separate directories for other Unravel directories. Contact support for assistance.
Refer to setup options for all the additional parameters that can be passed with the setup command.
Set the path of a license file.
<Unravel installation directory>/unravel/manager config license set <license filename>
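For example, assuming the license file was saved as /home/unravel/unravel-license.yaml (an illustrative path) and Unravel is installed under /opt/unravel:
/opt/unravel/manager config license set /home/unravel/unravel-license.yaml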
This command takes a filename as input and performs the following actions:
Reads the license file path and the license file
The license YAML file contains product licensing information, license validity and expiration date, and the licensed number of clusters and nodes.
Verifies whether it is a valid license
Adds the com.unraveldata.license.file property to the unravel.properties file. For information, see License property.
Note
If you do not provide the license filename, the manager config license set command prompts for the license information. You can copy the content of the license file.
Sample content of the license file:
##### BEGIN UNRAVEL LICENSE
Licensee                 : ACME Disintegrating Pistol Manufacturing
Valid from               : 2022-12-16 00:00:00 UTC
Expire after             : 2023-10-16 23:59:00 UTC
License type             : Enterprise
Licensed number of nodes : 1000000
Signature                : c2Uvb2JqLnRhcmdldC92OF9pbml0aWFsaXplcnMvZ2VuL3RvcnF1ZS
Revision                 : 1
##### END UNRAVEL LICENSE #####
Apply the changes and restart.
<unravel_installation_directory>/unravel/manager config apply
<unravel_installation_directory>/unravel/manager start
Check the status of services.
<unravel_installation_directory>/unravel/manager report
The following service statuses are reported:
OK: Service is up and running.
Not Monitored: Service is not running. (Has stopped or has failed to start)
Initializing: Services are starting up.
Does not exist: The process unexpectedly disappeared. Restarts will be attempted 10 times.
You can also get the status and information for a specific service. Run the manager report command as follows:
<unravel_installation_directory>/unravel/manager report <service>  ## For example: /opt/unravel/manager report auto_action
Access the Unravel UI using this link: http://<FQDN>:3000.
The Precheck output displays the issues that prevent a successful installation and provides suggestions to resolve them. You must resolve each of the issues before proceeding. See Precheck filters.
After the prechecks are resolved, you must re-login or reload the shell to execute the setup command again.
Note
In certain situations, you can skip the precheck using the setup --skip-precheck command.
For example:
/opt/unravel/versions/<Unravel version>/setup --skip-precheck
You can also skip specific checks that you know can fail. For example, to skip the Check limits and Disk freespace checks, use the check names shown in parentheses for those options (check_limits and check_freespace), prefix each with ~, and run the setup command as follows:
setup --filter-precheck ~check_limits,~check_freespace
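For example, a sketch combining the Snowflake setup with these skipped checks (paths are illustrative):
/opt/unravel/versions/<Unravel version>/setup --enable-snowflake --filter-precheck ~check_limits,~check_freespace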
Tip
Run --help with the setup command, or with any combination of setup options, for complete usage details.
<unravel_installation_directory>/unravel/versions/<Unravel version>/setup --help
/opt/unravel/versions/abcd.1004/setup
2021-04-05 15:51:30 Sending logs to: /tmp/unravel-setup-20210405-155130.log
2021-04-05 15:51:30 Running preinstallation check...
2021-04-05 15:51:31 Gathering information ......... Ok
2021-04-05 15:51:51 Running checks ......... Ok
--------------------------------------------------------------------------------
system
  Check limits       : PASSED
  Clock sync         : PASSED
  CPU requirement    : PASSED, Available cores: 8 cores
  Disk access        : PASSED, /opt/unravel/versions/develop.1004/healthcheck/healthcheck/plugins/system is writable
  Disk freespace     : PASSED, 229 GB of free disk space is available for precheck dir.
  Kerberos tools     : PASSED
  Memory requirement : PASSED, Available memory: 79 GB
  Network ports      : PASSED
  OS libraries       : PASSED
  OS release         : PASSED, OS release version: centos 7.6
  OS settings        : PASSED
  SELinux            : PASSED
--------------------------------------------------------------------------------
Healthcheck report bundle: /tmp/healthcheck-20210405155130-xyz.unraveldata.com.tar.gz
2021-04-05 15:51:53 Prepare to install with: /opt/unravel/versions/abcd.1004/installer/installer/../installer/conf/presets/default.yaml
2021-04-05 15:51:57 Sending logs to: /opt/unravel/logs/setup.log
2021-04-05 15:51:57 Instantiating templates ......... Ok
2021-04-05 15:52:05 Creating parcels ......... Ok
2021-04-05 15:52:20 Installing sensors file ......... Ok
2021-04-05 15:52:20 Installing pgsql connector ... Ok
2021-04-05 15:52:22 Starting service monitor ... Ok
2021-04-05 15:52:27 Request start for elasticsearch_1 .... Ok
2021-04-05 15:52:27 Waiting for elasticsearch_1 for 120 sec ......... Ok
2021-04-05 15:52:35 Request start for zookeeper .... Ok
2021-04-05 15:52:35 Request start for kafka .... Ok
2021-04-05 15:52:35 Waiting for kafka for 120 sec ...... Ok
2021-04-05 15:52:37 Waiting for kafka to be alive for 120 sec ..... Ok
2021-04-05 15:52:42 Initializing pgsql ... Ok
2021-04-05 15:52:46 Request start for pgsql .... Ok
2021-04-05 15:52:46 Waiting for pgsql for 120 sec ..... Ok
2021-04-05 15:52:47 Creating database schema ......... Ok
2021-04-05 15:52:50 Generating hashes .... Ok
2021-04-05 15:52:52 Loading elasticsearch templates ...... Ok
2021-04-05 15:52:55 Creating kafka topics ......... Ok
2021-04-05 15:53:36 Creating schema objects ......... Ok
2021-04-05 15:54:03 Request stop ......... Ok
2021-04-05 15:54:16 Done
[unravel@xyz ~]$
The configuration of Unravel for Snowflake varies based on the connection method that you have chosen for metadata collection. Configure Unravel for Snowflake according to your connection method.
With the Real-time connection, you can directly connect to your Snowflake database and, after configuring your Snowflake account details, get live data insights. The real-time connection allows you to access all Unravel features at once, perform real-time monitoring, receive notification alerts, and observe live insights without any requirement for account admin access.
Also see Choosing a connection to Upload Snowflake data.
The following steps are involved in setting up an Unravel account.
Access the Unravel UI using the login credentials you received in the email when you signed up for the Unravel Snowflake Standard (Free) account. The data fields are empty when you access Unravel UI for the first time.
In the upper right corner of the UI, select Snowflake configuration.
Select the Real-time option.
Enter the following details:
| Field | Description |
|---|---|
| Snowflake account | |
| User | Enter the username to access the Snowflake account. |
| Password | Enter the password to access the Snowflake account. |
| Account | Specify the Snowflake account from which you want to get live data for monitoring. |
| Role | Specify a role associated with the user account for Unravel to access the correct Snowflake data warehouse. |
| Parameters | |
| Database name | Specify the Snowflake database where Unravel can access the metadata. |
| Schema name | Specify the Snowflake schema in your database. |
| Warehouse name | Specify a warehouse for Unravel to run queries on your metadata (monitor and telemetry data). The minimum warehouse size required by Unravel is X-small. |
| Look back | Set how many days of metadata should be processed from when this configuration is saved. You can set the look-back period to a maximum of 180 days. |
| Set Snowflake credit cost | |
| Credit cost | Set the USD cost of your Snowflake credits. |
Click Save to save the settings. After you save the settings, Unravel connects to the configured Snowflake account and starts polling the data. New data can be seen on the Queries and Warehouse pages of the Unravel UI.
The duration of the entire data loading process depends on both the volume of data and the specified look-back period.
With the Snapshot connection, you can manually download the metadata or telemetry data from your Snowflake account and upload this metadata to the Unravel Snowflake account. The Snapshot connection allows you to unlock Unravel features with one-time data access without any serverless cost or the need for account admin access from the Unravel Snowflake account. However, real-time monitoring is not available with this connection.
The download and upload data tasks can be executed using scripts. After uploading the data, you access the Unravel Snowflake account, detect the metadata, and start the data ingestion. The data for the last 14 days gets ingested into the Unravel Snowflake account.
Also see Choosing a connection to Upload Snowflake data.
The following steps are involved in setting up an Unravel account.
The Snowflake metadata can be uploaded using one of the following methods:
snowflake-data-loader tool: This tool can be used on Linux, macOS, and Windows. You can execute this tool to retrieve account usage information from your Snowflake account and upload it to the Unravel Snowflake account.
Download and upload scripts: You can use the download scripts to download metadata from your Snowflake account and then run the upload scripts to upload the metadata to the Unravel Snowflake account.
This tool enables you to retrieve account usage information from one Snowflake account and upload it to a second account.
This tool does the following:
Creates a stage in the source Snowflake account.
Saves source Snowflake account usage to stage.
Downloads account usage information from the source Snowflake account to the local machine.
Creates a stage in the target Snowflake account.
Uploads the account usage information to the target Snowflake account.
The source account user must have the following permissions (a sketch of the corresponding grant statements follows this list):
Access to the SNOWFLAKE.ACCOUNT_USAGE and SNOWFLAKE.INFORMATION_SCHEMA schemas
Create Stage permission
Create File Format permission
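The following SnowSQL sketch shows one way an administrator might grant these permissions. The role name UNRAVEL_READER and the database and schema names are illustrative placeholders, not values required by the tool; adapt them to your environment.
# Run as a Snowflake admin; all object names below are placeholders.
snowsql -a <source_account> -u <admin_user> -r ACCOUNTADMIN \
  -q "GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE UNRAVEL_READER"
snowsql -a <source_account> -u <admin_user> -r ACCOUNTADMIN \
  -q "GRANT CREATE STAGE ON SCHEMA MY_DB.MY_SCHEMA TO ROLE UNRAVEL_READER"
snowsql -a <source_account> -u <admin_user> -r ACCOUNTADMIN \
  -q "GRANT CREATE FILE FORMAT ON SCHEMA MY_DB.MY_SCHEMA TO ROLE UNRAVEL_READER"
The first grant gives the role read access to the SNOWFLAKE.ACCOUNT_USAGE views; the other two allow it to create the stage and file format used for the transfer.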
Use a command line to run the snowflake-data-loader tool. You must pass the following arguments from the command line; some are mandatory and some are optional.
Mandatory arguments

| Argument | Description |
|---|---|
| --source_user | Your source Snowflake account username. |
| --source_password | Your source Snowflake account password. If source_login_method is password, this argument is required. |
| --private_key_path | The path to your private key file. If source_login_method or target_login_method is keypair, the key will be used for both source and target accounts. |
| --source_account | Your source Snowflake account ID. |
| --source_warehouse | The name of the source warehouse from where you want to retrieve details. |
| --source_database | The name of the source account database where the stage is created. |
| --source_schema | The name of the source account schema where the stage is created. |
| --source_role | The name of the source account role. |
| --target_user | Your target Snowflake account username. |
| --target_password | Your target Snowflake account password. |
| --target_account | Your target Snowflake account ID. |
| --target_warehouse | The name of the target warehouse from where you want to retrieve details. |
| --target_database | The name of the target database. |
| --target_schema | The name of the target schema. |
| --target_role | The name of the target role. |

Optional arguments

| Argument | Description |
|---|---|
| --source_login_method | The login method for the source account. Possible options are password (default), oauth, sso, okta, or keypair. |
| --target_login_method | The login method for the target account. Possible options are password (default), oauth, sso, okta, or keypair. |
| | The private link for the source account, for example: testaccount.us-east-1.privatelink.snowflakecomputing.com. |
| | The private link for the target account, for example: testaccount.us-east-1.privatelink.snowflakecomputing.com. |
| | The okta URL for the source account, for example: https://testaccount.okta.com. |
| | The okta URL for the target account, for example: https://testaccount.okta.com. |
| | Your source Snowflake account MFA password. |
| | Your target Snowflake account MFA password. |
| --stage | The name of the stage. The default is unravel_stage. |
| | The directory to save output files. The default is the current directory. |
| | The name of the file format. The default is |
| | Prints debug messages when set. |
| | This flag saves all queries as SQL files instead of running them. |
| | This will skip the local temporary file cleanup process. |
| | The number of days to look back for account usage information. Default is 15 days. |
If any of the required arguments are missing, you will be prompted to enter them.
The script also replaces - with _ in the value of the --stage argument.
You can run the snowflake-data-loader tool on any of the following operating systems:
On Linux, download the latest release of the snowflake-data-loader tool from the following location:
https://github.com/unraveldata-org/snowflake-data-loader/releases
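Release asset names vary by platform and version. As a rough sketch, after downloading and extracting the Linux binary you typically need to mark it executable before running it:
chmod +x ./snowflake-data-loader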
Using a command line, execute the snowflake-data-loader script with the required arguments as follows:
./snowflake-data-loader \
  --source_login_method keypair \
  --target_login_method password \
  --source_user <source_user> \
  --private_key_path <private_key_path> \
  --source_account <source_account> \
  --source_warehouse <source_warehouse> \
  --source_database <source_database> \
  --source_schema <source_schema> \
  --source_role <source_role> \
  --target_user <target_user> \
  --target_password <target_password> \
  --target_account <target_account> \
  --target_warehouse <target_warehouse> \
  --target_database <target_database> \
  --target_schema <target_schema> \
  --target_role <target_role>
For example:
./snowflake-data-loader --source_user sf-user --source_account xyz12345.us-east-1 --source_warehouse unraveldata --source_database unraveldb --source_schema unravelschema --source_role SYSADMIN --target_user 6087e389979f870926a1dbf8d41d52b8 --target_password Cl9ATHlRf6mh --target_account ytb00868 --target_warehouse USER_WH --target_database TRIAL_DB --target_schema unravel_qayt00yvh1gru1p_sf --target_role 47572BE3A1393DDCE452900795978313
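The command above uses --source_login_method keypair. If key-pair authentication is not yet configured for the source user, the following is a minimal sketch of the standard Snowflake key-pair setup; the file names are illustrative, and a Snowflake administrator must attach the public key to the user.
# Generate an unencrypted PKCS#8 private key and its matching public key (illustrative file names).
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
# An administrator assigns the public key (contents of rsa_key.pub without the PEM header/footer) to the user.
snowsql -a <source_account> -u <admin_user> -r ACCOUNTADMIN \
  -q "ALTER USER <source_user> SET RSA_PUBLIC_KEY='MIIBIjANBgkq...'"
You can then pass rsa_key.p8 as the value of --private_key_path.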
On macOS, download the latest release of the snowflake-data-loader tool from the following location:
https://github.com/unraveldata-org/snowflake-data-loader/releases
Note
On macOS, you may be prompted to trust the binary. Run the following command to trust the binary:
xattr -d com.apple.quarantine <path_to_the_binary_directory>/snowflake-data-loader
Using a command line, execute the snowflake-data-loader script with the required arguments as follows:
./snowflake-data-loader \
  --source_login_method keypair \
  --target_login_method password \
  --source_user <source_user> \
  --private_key_path <private_key_path> \
  --source_account <source_account> \
  --source_warehouse <source_warehouse> \
  --source_database <source_database> \
  --source_schema <source_schema> \
  --source_role <source_role> \
  --target_user <target_user> \
  --target_password <target_password> \
  --target_account <target_account> \
  --target_warehouse <target_warehouse> \
  --target_database <target_database> \
  --target_schema <target_schema> \
  --target_role <target_role>
For example:
./snowflake-data-loader --source_user sf-user --source_account xyz12345.us-east-1 --source_warehouse unraveldata --source_database unraveldb --source_schema unravelschema --source_role SYSADMIN --target_user 6087e389979f870926a1dbf8d41d52b8 --target_password Cl9ATHlRf6mh --target_account ytb00868 --target_warehouse USER_WH --target_database TRIAL_DB --target_schema unravel_qayt00yvh1gru1p_sf --target_role 47572BE3A1393DDCE452900795978313
On Windows, download the latest release of the snowflake-data-loader tool from the following location:
https://github.com/unraveldata-org/snowflake-data-loader/releases
Sign in to Windows with a password. Using a command line, execute the snowflake-data-loader script, with the required arguments from the list, as follows:
snowflake-data-loader.exe \
  --source_user <source_user> \
  --source_password <source_password> \
  --source_account <source_account> \
  --source_warehouse <source_warehouse> \
  --source_database <source_database> \
  --source_schema <source_schema> \
  --source_role <source_role> \
  --target_user <target_user> \
  --target_password <target_password> \
  --target_account <target_account> \
  --target_warehouse <target_warehouse> \
  --target_database <target_database> \
  --target_schema <target_schema>
For example:
snowflake-data-loader.exe --source_user sf-user --source_account xyz1234.us-east-1 --source_warehouse UNRAVELDATA --source_database SEMI_STRUCTURED_DB --source_schema SEMI_STRUCTURED_SCHEMA --source_role sysadmin --target_user 3e5968c0661f5c37b51aca02bcf827e1 --target_password BNcc5LIIISjc --target_account ytb00868 --target_warehouse USER_WH --target_database TRIAL_DB --target_schema unravel_foqt3sagsf1k1q8_sf --target_role 47572BE3A1393DDCE452900795978313
The following scripts let you retrieve account usage information from your Snowflake account and upload it to the Unravel Snowflake account:
snowsql_download_data.sql
snowsql_show_wareshouses.sql
snowflake_query.py
snowsql_upload_data.sql
The Snowflake metadata or telemetry data, warehouse data, and warehouse parameters data are downloaded/uploaded with these scripts.
Ensure you have the following before you start downloading the Snowflake metadata:
An account with access to create a stage and a file format.
An account user with permission to access the account_usage schema and all warehouses.
Unravel provides the following download scripts that you can use to download Snowflake metadata, warehouse data, and warehouse parameters data.
snowsql_download_data.sql: Downloads Snowflake metadata or telemetry data.
snowsql_show_wareshouses.sql: Downloads Snowflake warehouse data.
snowflake_query.py: Downloads Snowflake warehouse parameters.
Unravel provides the following upload scripts to upload the downloaded Snowflake metadata to the Unravel Snowflake account:
snowsql_upload_data.sql
Do the following to download Snowflake metadata:
Download the download/upload scripts from this location.
Using SnowSQL, connect to the Snowflake account from which you want to download the Snowflake metadata. This account must have privileges to create a stage and a file format.
Execute the snowsql_download_data.sql script with the required arguments to download the metadata from the Snowflake account_usage views. Refer to the following list of arguments for more details:
snowsql -f /opt/script/snowsql_download_data.sql -d ${db} -s ${schema} -r ${role} -a ${account} -u ${user} -w ${warehouse} -o variable_substitution=true -o log_file=${script output file path/filename} --variable path=${local/path/to store/downloaded metadata} --variable stage_name=unravel_stage_name --variable file_format=unravel_file_format
For example:
snowsql -f /opt/script/snowsql_download_data.sql -d sf_source_database -s sf_schema -r sysadmin -a sf_account -u sf_user -w sf_warehouse -o variable_substitution=true -o log_file=/opt/script/snowsql_download_data.log --variable path=/opt/download_path/ --variable stage_name=unravel_stage --variable file_format=unravel_file_format
| Parameter | Description |
|---|---|
| -o variable_substitution | Enable the variable substitution switch in the script. Some variables used in the script must be passed from the CLI. Set this to true. |
| --variable path | Specify the local path to store the downloaded Snowflake system metadata. |
| --variable stage_name | Specify the stage name, which is used to keep the temporary files for download and upload. |
| --variable file_format | Specify the file format name, which is used by the upload/download scripts. |
| -f | Specify the file name of the script that is executed. |
| -d | Specify the database used for the script execution. |
| -s | Specify the name of the schema used for the script execution. |
| -r | Specify the role of the user who executes the script. |
| -a | Specify the Snowflake account that is used for the script execution. |
| -u | Specify the username of the Snowflake user who executes the script. |
| -w | Specify the Snowflake warehouse. |
| -o log_file | Specify the path to the log file that is generated when you execute the script. |
| -o | Provide the output-related arguments to get the logs in the specified path and format. |
After the script is executed, the Snowflake metadata gets downloaded to the specified location.
Execute the snowsql_show_wareshouses.sql script with the required arguments to download the warehouse data. Refer to the following list of arguments for more details:
snowsql -f /opt/script/snowsql_show_wareshouses.sql -d ${db} -s ${schema} -r ${role} -a ${account} -u ${user} -w ${warehouse} -o output_format=csv -o output_file=${path}/warehouses.csv -o variable_substitution=true
For example:
snowsql -f /opt/script/snowsql_show_wareshouses.sql -d database -s database-schema -r sysadmin -a useraccount -u user1 -w warehouse1 -o output_format=csv -o output_file=/opt/unravel/warehouses.csv -o variable_substitution=true
| Parameter | Description |
|---|---|
| -o variable_substitution | Enable the variable substitution switch in the script. Some variables used in the script must be passed from the CLI. Set this to true. |
| -f | Specify the file name of the script that is executed. |
| -d | Specify the database used for the script execution. |
| -u | Specify the username of the Snowflake user who executes the script. |
| -s | Specify the name of the schema used for the script execution. |
| -r | Specify the role of the user who executes the script. |
| -a | Specify the Snowflake account that is used for the script execution. |
| -o | Provide the output-related arguments to get the logs in the specified path and format. |
| -o output_file | Specify the path to the output file generated on script execution. Note: The output_file path should be the same as the path specified in the snowsql_download_data.sql command. |
| -o output_format | Specify the output file format on script execution. This is in CSV format. |
After the script is executed, the warehouse data gets downloaded in CSV format at the specified output location.
Execute the snowflake_query.py script with the required arguments to download the warehouse parameters data. Refer to the following list of arguments for more details:
python3 /opt/script/snowflake_query.py --user '${user}' --password '${password}' --account '${account}' --warehouse '${warehouse}' --database '${db}' --schema '${schema}' --out '/opt/unravel' --role ${role}
For example:
python3 /opt/script/snowflake_query.py --user 'sf_user' --password 'sf_password' --account 'sf_account' --warehouse 'sf_warehouse' --database 'sf_source_database' --schema 'sf_schema' --out '/opt/download_path' --role sysadmin
| Parameter | Description |
|---|---|
| --user | Specify the name of the Snowflake user for the script execution. |
| --password | Specify the Snowflake user account password. |
| --schema | Specify the name of the schema that must be used for the script execution. |
| --role | Specify the role of the user. |
| --account | Specify the Snowflake account that is used for the script execution. |
| --out | Specify the path to the output folder. |
| --database | Specify the database used for the script execution. |
After the script is executed, the warehouse parameters data gets downloaded in CSV format at the specified output location.
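The three download steps above can be chained in a small wrapper script. The following is a minimal bash sketch, assuming the scripts live under /opt/script and using illustrative connection values; SNOWSQL_PWD is the standard environment variable SnowSQL reads for the password.
#!/usr/bin/env bash
# Minimal sketch: run all three Unravel download steps with shared connection settings.
set -euo pipefail

SF_ACCOUNT=xyz12345.us-east-1      # illustrative values; replace with your own
SF_USER=sf_user
SF_ROLE=sysadmin
SF_WAREHOUSE=sf_warehouse
SF_DB=sf_source_database
SF_SCHEMA=sf_schema
OUT=/opt/download_path
export SNOWSQL_PWD='<sf_password>'  # SnowSQL reads the password from this variable

# 1. Download Snowflake metadata (telemetry) from the account_usage views.
snowsql -f /opt/script/snowsql_download_data.sql -d "$SF_DB" -s "$SF_SCHEMA" -r "$SF_ROLE" -a "$SF_ACCOUNT" -u "$SF_USER" -w "$SF_WAREHOUSE" \
  -o variable_substitution=true -o log_file="$OUT/snowsql_download_data.log" \
  --variable path="$OUT/" --variable stage_name=unravel_stage --variable file_format=unravel_file_format

# 2. Download warehouse data as CSV (same output path as step 1).
snowsql -f /opt/script/snowsql_show_wareshouses.sql -d "$SF_DB" -s "$SF_SCHEMA" -r "$SF_ROLE" -a "$SF_ACCOUNT" -u "$SF_USER" -w "$SF_WAREHOUSE" \
  -o output_format=csv -o output_file="$OUT/warehouses.csv" -o variable_substitution=true

# 3. Download warehouse parameters.
python3 /opt/script/snowflake_query.py --user "$SF_USER" --password "$SNOWSQL_PWD" --account "$SF_ACCOUNT" \
  --warehouse "$SF_WAREHOUSE" --database "$SF_DB" --schema "$SF_SCHEMA" --out "$OUT" --role "$SF_ROLE"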
Unravel provides upload scripts that you can use to upload the downloaded Snowflake metadata, warehouse data, and warehouse parameters data to the Unravel Snowflake account. The upload scripts should be executed in SnowSQL with an account that has access to the following:
Creating a stage in the specified Snowflake database and schema.
Creating a file format in the specified Snowflake database and schema.
Do the following to upload the Snowflake metadata:
Download the snowsql_upload_data.sql from this location:
https://github.com/unraveldata-org/snowflake-data-loader/tree/main/script
From the same location, execute the create table script: prepare_schema.sql. After the stored procedure is created, execute the stored procedure with the following command:
call prepare_replication_schema(DbName, SchemaName)
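For example, a minimal SnowSQL sketch of these two steps against the Unravel-provided target account (the account, user, role, schema, and warehouse values are illustrative, and argument quoting may differ depending on how the procedure is defined):
snowsql -a <target_account> -u <target_user> -r <target_role> -w USER_WH -d TRIAL_DB -s <target_schema> \
  -f /opt/script/prepare_schema.sql
snowsql -a <target_account> -u <target_user> -r <target_role> -w USER_WH -d TRIAL_DB -s <target_schema> \
  -q "call prepare_replication_schema('TRIAL_DB', '<target_schema>')"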
Execute the snowsql_upload_data.sql script to upload the metadata to Unravel with the required arguments. Refer to the following list of arguments for more details:
snowsql -f /opt/script/snowsql_upload_data.sql -d ${db} -s ${schema} -r ${role} -a ${account} -u ${user} -w ${warehouse} -o variable_substitution=true -o log_file=/opt/script/snowsql_upload_data.log --variable path=${path} --variable stage_name=unravel_stage_upload --variable file_format=unravel_file_format_upload
For example:
snowsql -f /opt/script/snowsql_upload_data.sql -d unravel_sf_database -s unravel_sf_schema -r unravel_sf_role -a unravel_sf_account -u unravel_sf_user -w unravel_sf_warehouse -o variable_substitution=true -o log_file=/opt/script/snowsql_upload_data.log --variable path=/opt/download_path/ --variable stage_name=unravel_stage_upload --variable file_format=unravel_file_format_upload
| Parameter | Description |
|---|---|
| -o variable_substitution | Enable the variable substitution switch in the script. Some variables used in the script must be passed from the CLI. Set this to true. |
| -o log_file | Specify the name of the log file that is generated on command execution. |
| --variable path | Specify the local path where the Snowflake system metadata was downloaded. |
| --variable stage_name | Specify the stage name, which is used to keep the temporary files for upload. |
| --variable file_format | Specify the file format name, which is used by the upload scripts. |
| -f | Specify the script file that you want to execute. |
| -d | Specify the Unravel-provided Trial Snowflake Account database name. |
| -u | Specify the Unravel-provided Trial Snowflake Account username. |
| -s | Specify the Unravel-provided Trial Snowflake Account schema name. |
| -r | Specify the Unravel-provided Trial Snowflake Account role. |
| -a | Specify the Unravel-provided Trial Snowflake Account name. |
| -o | Provide the output-related arguments to get the logs in the specified path and format. |
After the script is executed, the Snowflake metadata gets uploaded in the specified format to the selected location.
The following configurations must be set in the Unravel Snowflake account. For this, you must sign in to Unravel using the login credentials you have received in the email.
Access the Unravel UI using the login credentials you received in the email when you created the Unravel Snowflake free account. The data fields are empty when you access Unravel UI for the first time.
In the upper right corner of the UI, select Snowflake configuration.
Enter the following details:
| Field | Description |
|---|---|
| Snowflake account | |
| User | Enter the username to access the Snowflake account. |
| Password | Enter the password to access the Snowflake account. |
| Account | Specify the Snowflake account from which you want to get live data for monitoring. |
| Role | Specify a role associated with the user account for Unravel to access the correct Snowflake data warehouse. |
| Parameters | |
| Database name | Specify the Snowflake database where Unravel can access the metadata. |
| Schema name | Specify the Snowflake schema in your database. |
| Warehouse name | Specify a warehouse for Unravel to run queries on your metadata (monitor and telemetry data). The minimum warehouse size required by Unravel is X-small. |
| Look back | Set how many days of metadata should be processed from when this configuration is saved. You can set the look-back period to a maximum of 180 days. |
| Set Snowflake credit cost | |
| Credit cost | Set the USD cost of your Snowflake credits. |
Click the Detect Metadata button to check the connection and table metadata access. After a successful connection and metadata access, you can start the data ingestion.
Click Start data ingestion. The data ingestion process begins. The data for up to 14 days gets ingested into the Unravel Snowflake account.
Click the Monitor data ingestion link. You can monitor the data ingestion status from the Data ingestion dashboard. All the dashboards will be visible under the Dashboard tab.