

Databricks FAQ

  1. Sign in to Unravel.

  2. Click Manager > Workspace and check if the corresponding Databricks workspace is shown in the Workspace list.

  1. Check if Spark conf, logging, and init scripts are present.

    For the Spark conf, check that every property is correct.

  2. Refer to Unravel > Workspace Manager > Cluster Configurations.

  1. Check if the contents of the Spark conf, logging, and init scripts are correct.

  2. Refer to Unravel > Workspace Manager > Cluster Configurations.

  1. Refer to Unravel > Workspace Manager > Cluster Configurations.

  2. Check if spark.unravel.server.hostport is a valid address.

  1. Check if the host/IP is accessible from the Databricks notebook.

  2. Check if port 4043 is open for outgoing traffic on Databricks.

    nc -zv <unravel-host> 4043
  3. Check if port 4043 is open for incoming traffic on Unravel.

    <see administrator>
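The connectivity checks in steps 1–3 can be wrapped in a small helper so they are easy to repeat from either side. This is a hypothetical sketch, not part of Unravel: `check_port` is an invented name, the host is a placeholder, and `nc` (netcat) is assumed to be installed.

```shell
# check_port: probe outbound TCP connectivity to a host:port pair.
# Hypothetical helper -- not part of Unravel; requires netcat (nc).
check_port() {
    host="$1"; port="$2"
    if [ -z "$host" ] || [ -z "$port" ]; then
        echo "usage: check_port <host> <port>" >&2
        return 2
    fi
    # -z: scan without sending data; -w 5: five-second timeout
    if nc -z -w 5 "$host" "$port" 2>/dev/null; then
        echo "OK: $host:$port is reachable"
    else
        echo "FAIL: $host:$port is not reachable" >&2
        return 1
    fi
}

# Example (placeholder host): check_port <unravel-host> 4043
```

Run it from a Databricks notebook (via `%sh`) to test the outgoing direction, and from any other host toward the Unravel server to test the incoming direction.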

Check if port 443 is open for outgoing traffic on Unravel:

curl -X GET -H "Authorization: Bearer <token-here>" 'https://<instance-name-here>/api/2.0/dbfs/list?path=dbfs:/'
  1. Access the file in Unravel and get the workspace token.

  2. Run the following to check if the token is valid and works:

    curl -X GET -H "Authorization: Bearer <token>" 'https://<instance-name>/api/2.0/dbfs/list?path=dbfs:/'
  1. Run the following to get the workspace token from the file on DBFS.

    dbfs cat  

    Following is a sample of the output:

    #Thu Sep 30 20:06:42 UTC 2021
  2. Check if the token is valid using the following command:

    curl -X GET -H "Authorization: Bearer <token>" 'https://<instance-name>/api/2.0/dbfs/list?path=dbfs:/'
  3. If the token is invalid, you can regenerate the token and update the workspace from Unravel UI > Manage > Workspace.
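The token checks above can also be expressed as a sketch that inspects the HTTP status code rather than the response body, which makes scripted checks simpler. `check_token` is an invented helper name; the instance and token are placeholders, and `curl` is assumed to be available.

```shell
# check_token: ask the DBFS list endpoint whether a token is accepted.
# Hypothetical helper; pass the workspace instance and the token.
check_token() {
    instance="$1"; token="$2"
    # -w '%{http_code}' prints only the status code; 000 means no connection
    status=$(curl -s -o /dev/null --max-time 10 -w '%{http_code}' \
        -H "Authorization: Bearer $token" \
        "https://$instance/api/2.0/dbfs/list?path=dbfs:/")
    if [ "$status" = "200" ]; then
        echo "token valid"
    else
        echo "token invalid or workspace unreachable (HTTP $status)" >&2
        return 1
    fi
}

# Example (placeholders): check_token <instance-name> <token>
```

A 403 status typically indicates a revoked or expired token, while 000 points at a network problem rather than the token itself.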

You can register a workspace in Unravel from the command line with the manager command.

  1. Stop Unravel.

    <Unravel installation directory>/unravel/manager stop
  2. Switch to the Unravel user.

  3. Add the workspace details using the manager command as follows from the Unravel installation directory:

    source <path-to-python3-virtual environment-dir>/bin/activate
    <Unravel_installation_directory>/unravel/manager config databricks add --id <workspace-id> --name <workspace-name> --instance <workspace-instance> --access-token <workspace-token> --tier <tier_option>
    ##For example:
    /opt/unravel/manager config databricks add --id 0000000000000000 --name myworkspacename --instance --access-token xxxx --tier premium
  4. Apply the changes.

    <Unravel installation directory>/unravel/manager config apply
  5. Start Unravel.

    <Unravel installation directory>/unravel/manager start
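The five steps above can be collected into one script for review. This is a dry-run sketch: the `run` wrapper only prints each command, `/opt/unravel` and every workspace value are assumptions to replace with your own, and switching to the Unravel user plus activating the Python 3 virtual environment remain manual prerequisites.

```shell
# Dry-run sketch of the registration sequence. Change run() to execute
# "$@" (or remove the 'run' prefix) to perform the real steps.
run() { echo "+ $*"; }

UNRAVEL_HOME=/opt/unravel            # assumed installation directory
WORKSPACE_ID=0000000000000000        # placeholder workspace id
WORKSPACE_NAME=myworkspacename       # placeholder name
WORKSPACE_INSTANCE="<workspace-instance>"
WORKSPACE_TOKEN="<workspace-token>"
TIER=premium

run "$UNRAVEL_HOME/manager" stop
run "$UNRAVEL_HOME/manager" config databricks add \
    --id "$WORKSPACE_ID" --name "$WORKSPACE_NAME" \
    --instance "$WORKSPACE_INSTANCE" --access-token "$WORKSPACE_TOKEN" \
    --tier "$TIER"
run "$UNRAVEL_HOME/manager" config apply
run "$UNRAVEL_HOME/manager" start
```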
  1. On the Databricks console, go to Workspace > Settings > Admin Console > Global init scripts tab.

  2. Click + Add.

  3. Open the following file in an editor and copy its contents into the Add Script text box:

    dbfs cat  
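As an alternative to pasting the script into the Admin Console, Databricks also exposes a Global Init Scripts REST API, which expects the script body base64-encoded. This is a hedged sketch: the local file path, instance, and token are placeholders, and the stand-in script content below is only for illustration, not the real Unravel init script.

```shell
# Encode a local copy of the init script for the Global Init Scripts API.
INIT_SCRIPT=/tmp/unravel_init.sh                            # assumed local copy
printf '#!/bin/sh\necho unravel init\n' > "$INIT_SCRIPT"    # stand-in content
SCRIPT_B64=$(base64 < "$INIT_SCRIPT" | tr -d '\n')

# Register it (placeholders -- fill in and uncomment to run):
# curl -X POST -H "Authorization: Bearer <token>" \
#      "https://<instance-name>/api/2.0/global-init-scripts" \
#      -d "{\"name\": \"unravel-init\", \"script\": \"$SCRIPT_B64\", \"enabled\": true}"
echo "encoded ${#SCRIPT_B64} base64 characters"
```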

The cluster init script applies the Unravel configuration to each cluster. To set up cluster init scripts from the cluster UI, do the following:

  1. Go to the Unravel UI and click Manage > Workspaces > Cluster configuration to get the configuration details.

  2. Follow the instructions and update each cluster (Automated/Interactive) that you want to monitor with Unravel.


To add Unravel configurations to job clusters via the API, use the following JSON format (the // comments are illustrative only; remove them before sending):

    "settings": {
        "new_cluster": {
            "spark_conf": {
                // Note: If extraJavaOptions is already in use, prepend the Unravel values. Also, for Databricks Runtime with spark 2.x.x, replace "spark-3.0" with "spark-2.4"
                "spark.executor.extraJavaOptions": "-javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=executor,libs=spark-3.0",
                "spark.driver.extraJavaOptions": "-javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=driver,script=StreamingProbe.btclass,libs=spark-3.0",
                // rest of your spark properties here ...
            "init_scripts": [
                    "dbfs": {
                        "destination": "dbfs:/databricks/unravel/unravel-db-sensor-archive/dbin/"
                // rest of your init scripts here ...
            "cluster_log_conf": {
                "dbfs": {
                    "destination": "dbfs:/cluster-logs"
            // rest of your cluster properties here ...
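Because the fragment above uses // comments for readability, a real Jobs API payload must be plain JSON. The sketch below writes a minimal, comment-free payload (with assumed placeholder `spark_version` and `node_type_id` values) and sanity-checks it locally before it is sent anywhere.

```shell
# Write a complete new_cluster payload with the Unravel settings merged in,
# then verify it parses as valid JSON. Placeholders must be filled in
# before the payload is actually submitted to the Jobs API.
cat > /tmp/unravel_job_settings.json <<'EOF'
{
  "settings": {
    "new_cluster": {
      "spark_version": "<runtime-version>",
      "node_type_id": "<node-type>",
      "spark_conf": {
        "spark.executor.extraJavaOptions": "-javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=executor,libs=spark-3.0",
        "spark.driver.extraJavaOptions": "-javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=driver,script=StreamingProbe.btclass,libs=spark-3.0"
      },
      "init_scripts": [
        { "dbfs": { "destination": "dbfs:/databricks/unravel/unravel-db-sensor-archive/dbin/" } }
      ],
      "cluster_log_conf": {
        "dbfs": { "destination": "dbfs:/cluster-logs" }
      }
    }
  }
}
EOF
python3 -m json.tool /tmp/unravel_job_settings.json >/dev/null && echo "settings JSON is valid"
```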

Follow the instructions in this file to update instances and prices.

  1. On Microsoft Azure, go to the resource group where the Unravel marketplace app is deployed and click Settings > Deployments.

  2. Locate the deployment created by the marketplace, which appears similar to unravel-data.unravel-databricks-app.

  3. Remove the resources highlighted in the following image: