Upgrade the Unravel init script and remote sensors

Unravel version 4.7.9.8-hotfix (build 10806) and later introduces changes to the init script and remote sensors that are not backward compatible. A one-time manual action is required to upgrade the init script and remote sensors together so that the sensors attach correctly to your Spark application. Follow the steps below to upgrade your environment.

Prerequisites

Before you begin the upgrade, ensure you have the following:

  • The update package: Locate the new version of the init script and the remote sensor package provided for your build.

  • Account permissions: Ensure you have administrative access to the Databricks workspace, specifically the ability to manage Metastore settings if you are using Unity Catalog Volumes.

  • Build verification: Confirm that your Unravel environment is being updated to version 4.7.9.8-hotfix (build 10806) or later.

Step 1: Secure the sensor package

The sensor package now includes a SHA-256 checksum manifest (sensor_package.sha256) and a detached signature (sensor_package.sha256.sig). The init script verifies these files to ensure the package is authentic. To enable this verification, you must use certificate fingerprint pinning:

  1. On the Unravel core node, run the following command to generate the SHA-256 fingerprint:

    openssl x509 -in /opt/unravel/data/certificates/internal_cert.pem -noout -fingerprint -sha256
  2. Copy the printed fingerprint.

  3. In your init script, locate the PAYLOAD_SIGNER_CERT_FPS variable and paste the fingerprint as its value.
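To illustrate how pinning and the checksum manifest work together, here is a minimal sketch of a verification step. This is not the shipped init script: the function name `verify_checksums` and the demo files are illustrative, and the demo builds its own throwaway manifest so it can run anywhere `sha256sum` is available.

```shell
#!/bin/sh
# Sketch only: verify every file listed in a SHA-256 checksum manifest,
# the same kind of check the init script performs on sensor_package.sha256.
set -e

verify_checksums() {
  # Fail if any file listed in the manifest does not match its recorded hash.
  ( cd "$1" && sha256sum --check --status sensor_package.sha256 )
}

# Demo: build a tiny "package" plus manifest in a temp dir, then verify it.
tmp=$(mktemp -d)
echo "sensor payload" > "$tmp/unravel_sensor.jar"
( cd "$tmp" && sha256sum unravel_sensor.jar > sensor_package.sha256 )

if verify_checksums "$tmp"; then
  echo "checksum verification passed"
fi
rm -rf "$tmp"
```

If a file is modified after the manifest is generated, `sha256sum --check` fails and the function returns nonzero, which is how tampering is detected before the sensors are installed.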

Step 2: Configure the sensor source directory

You must define the location of your sensor files. In the init script, set the UNRAVEL_SOURCE_DIR variable to one of the following paths:

  • Option 1: Unity Catalog Volume (Recommended)

    readonly UNRAVEL_SOURCE_DIR="/Volumes/main/unravel_sensors/unravel"
  • Option 2: DBFS (Backward Compatibility)

    readonly UNRAVEL_SOURCE_DIR="/dbfs/databricks/unravel"

    Note

    For the highest level of security, store the init script in a separate Unity Catalog Volume with restrictive permissions (ideally writable only by administrators). This prevents users from bypassing integrity checks.
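The two options above can also be combined defensively. The sketch below (assumed logic, not the shipped script; the function name `pick_source_dir` is illustrative) prefers the Unity Catalog Volume path when it exists and falls back to the DBFS path otherwise:

```shell
#!/bin/sh
# Sketch only: choose the sensor source directory, preferring the Unity
# Catalog Volume and falling back to DBFS when the Volume is not mounted.
pick_source_dir() {
  uc_path="$1"   # e.g. /Volumes/main/unravel_sensors/unravel
  dbfs_path="$2" # e.g. /dbfs/databricks/unravel
  if [ -d "$uc_path" ]; then
    echo "$uc_path"
  else
    echo "$dbfs_path"
  fi
}

# Demo: a temporary directory stands in for the mounted Volume.
tmp=$(mktemp -d)
pick_source_dir "$tmp" "/dbfs/databricks/unravel"             # prints the temp dir
pick_source_dir "/no/such/volume" "/dbfs/databricks/unravel"  # prints the DBFS path
rm -rf "$tmp"
```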

Step 3: Update the Metastore allowlist

If you use Unity Catalog Volumes, Databricks requires you to allow-list the script path before it can be used on clusters.

  1. In Databricks, go to Catalog Explorer > Metastore > Allowed JARs/Init Scripts.

  2. Select Add, and then enter the full Volume path for your init script.

    Example: /Volumes/main/unravel_init_script/unravel/install-unravel.sh

    Note

    You must allow the init script in the Metastore for sensors installed from a Unity Catalog Volume. Without this configuration, Databricks blocks the init script from running on clusters even if file permissions are correct.
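If you prefer to script this step rather than use Catalog Explorer, Databricks also exposes an artifact-allowlists REST API for init scripts. The sketch below is an assumption-laden example, not an official snippet: `DATABRICKS_HOST` and `DATABRICKS_TOKEN` are placeholder environment variables, and the call only fires when both are set.

```shell
#!/bin/sh
# Sketch: add an init-script path to the Unity Catalog artifact allowlist
# via PUT /api/2.1/unity-catalog/artifact-allowlists/INIT_SCRIPT.
# The volume path below mirrors the example in this guide.
SCRIPT_PATH="/Volumes/main/unravel_init_script/unravel/install-unravel.sh"
PAYLOAD=$(printf '{"artifact_matchers":[{"artifact":"%s","match_type":"PREFIX_MATCH"}]}' "$SCRIPT_PATH")

# Only attempt the API call when workspace credentials are configured.
if [ -n "$DATABRICKS_HOST" ] && [ -n "$DATABRICKS_TOKEN" ]; then
  curl -s -X PUT \
    -H "Authorization: Bearer $DATABRICKS_TOKEN" \
    -d "$PAYLOAD" \
    "$DATABRICKS_HOST/api/2.1/unity-catalog/artifact-allowlists/INIT_SCRIPT"
fi
echo "$PAYLOAD"
```

Note that a PUT to this endpoint replaces the existing allowlist, so in practice you would fetch the current matchers first and append to them.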

Improved sensor configuration validation

The init script (version 2.1 and later) uses internal variables to manage compatibility and validation automatically:

  • INIT_SCRIPT_VERSION: Identifies the current script version.

  • MIN_AGENT_PACK_VERSION and MAX_AGENT_PACK_VERSION: Enforce a supported version range for the agent pack to prevent silent mismatches.

Validation results:
  • Success: If the configuration is valid and all mandatory options are present, the sensors attach to the Spark application.

  • Failure: If validation fails or versions are incompatible, the sensors do not attach, and the script logs a clear error message.

    Note

    A sensor validation failure does not impact cluster initialization; the cluster will still start normally.
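The version-range check described above can be sketched as follows. This is illustrative only: the variable names mirror the documentation, but the comparison logic (natural version ordering via `sort -V`) and the version values are assumptions.

```shell
#!/bin/sh
# Sketch only: check that an agent pack version falls inside the
# supported [MIN, MAX] range; values below are placeholders.
INIT_SCRIPT_VERSION="2.1"
MIN_AGENT_PACK_VERSION="4.7.9.8"
MAX_AGENT_PACK_VERSION="4.7.9.99"

version_in_range() {
  # Succeeds when $1 sorts between the min and max versions (inclusive),
  # using sort -V for natural version ordering.
  lowest=$(printf '%s\n%s\n' "$MIN_AGENT_PACK_VERSION" "$1" | sort -V | head -n1)
  highest=$(printf '%s\n%s\n' "$MAX_AGENT_PACK_VERSION" "$1" | sort -V | tail -n1)
  [ "$lowest" = "$MIN_AGENT_PACK_VERSION" ] && [ "$highest" = "$MAX_AGENT_PACK_VERSION" ]
}

if version_in_range "4.7.9.10"; then
  echo "agent pack supported: sensors will attach"
else
  # Per the behavior above: log the error but let the cluster start normally.
  echo "ERROR: agent pack version out of range; skipping sensor attach"
fi
```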