This tab is visible only for Databricks clusters.

This section provides instructions for adding and setting up your workspace.

  1. On the Unravel UI, click Workspaces. The Workspaces manager page is displayed.

  2. Click Add Workspaces. The Add Workspace dialog is displayed.

  3. Enter the following details and click Add.

    Workspace Id: The Databricks workspace ID.

    Workspace Name: The Databricks workspace name.

    Instance (Region) URL: The regional URL where the Databricks workspace is deployed.

    Subscription: Select a subscription option: Standard, Premium, Enterprise (AWS), or Dedicated (AWS).

    Token: A personal access token used to authenticate to and access the Databricks REST APIs. Refer to Authentication using Databricks personal access tokens to create personal access tokens.
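If you run Azure Databricks, the workspace ID also appears inside the regional URL itself, which can help you cross-check the values entered above. A minimal sketch, assuming the standard `adb-<workspace-id>.<n>.azuredatabricks.net` URL format; the helper name is ours, and on AWS the ID is not part of the URL and must be read from the workspace settings instead:

```python
import re

def workspace_id_from_url(instance_url: str) -> str:
    """Extract the numeric workspace ID from an Azure Databricks
    regional URL of the form https://adb-<id>.<n>.azuredatabricks.net.
    Hypothetical helper for cross-checking the Add Workspace fields."""
    match = re.search(r"adb-(\d+)\.\d+\.azuredatabricks\.net", instance_url)
    if not match:
        raise ValueError(f"not an Azure Databricks regional URL: {instance_url}")
    return match.group(1)

print(workspace_id_from_url("https://adb-1234567890123456.7.azuredatabricks.net"))
# → 1234567890123456
```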

  4. From the Workspace manager, click Configure Cluster. The Setup Databricks with Unravel page is displayed.

    You must update the following settings under Advanced options for every cluster (Automated/Interactive) in your workspace. Use the configurations in the Spark 2.4.x and below tab or the Spark 3.0.x and above tab, whichever is applicable.

    1. Spark Config

      Copy the following snippet to Spark > Spark Conf. Replace <Unravel DNS or IP Address>.

      spark.eventLog.enabled true
      spark.eventLog.dir dbfs:/databricks/unravel/eventLogs/
      spark.unravel.server.hostport <Unravel DNS or IP Address>:4043
      spark.executor.extraJavaOptions -javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=executor,libs=spark-version (2.4/3.0)
      spark.driver.extraJavaOptions -javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=driver,script=StreamingProbe.btclass,libs=spark-version (2.4/3.0)

      For spark-submit jobs, click Configure spark-submit and copy the following snippet into the Set Parameters > Parameters text box as spark-submit parameters. Replace <Unravel DNS or IP Address>.

      "--conf", "spark.eventLog.enabled=true",
      "--conf", "spark.eventLog.dir=dbfs:/databricks/unravel/eventLogs/",
      "--conf", "spark.unravel.server.hostport=<Unravel DNS or IP Address>:4043",
      "--conf", "spark.executor.extraJavaOptions= -javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=executor,libs=spark-version (2.4/3.0)",
      "--conf", "spark.driver.extraJavaOptions= -javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=driver,script=StreamingProbe.btclass,libs=spark-version (2.4/3.0)"
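If you prefer to apply these settings programmatically (for example through the Databricks Clusters or Jobs REST APIs) rather than pasting them into each cluster's UI, the same configuration can be built in code. A sketch under stated assumptions: the helper names are ours, and we assume the `libs=spark-version (2.4/3.0)` placeholder resolves to a value such as `spark-2.4` or `spark-3.0`.

```python
# Hypothetical helpers that build the Unravel Spark settings shown above.
AGENT_JAR = "/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar"

def unravel_spark_conf(unravel_host: str, spark_libs: str = "spark-3.0") -> dict:
    """Return the cluster-level Spark conf from step 1 as a dict,
    e.g. for the `spark_conf` field of the Clusters REST API."""
    return {
        "spark.eventLog.enabled": "true",
        "spark.eventLog.dir": "dbfs:/databricks/unravel/eventLogs/",
        "spark.unravel.server.hostport": f"{unravel_host}:4043",
        "spark.executor.extraJavaOptions":
            f"-javaagent:{AGENT_JAR}=config=executor,libs={spark_libs}",
        "spark.driver.extraJavaOptions":
            f"-javaagent:{AGENT_JAR}=config=driver,"
            f"script=StreamingProbe.btclass,libs={spark_libs}",
    }

def spark_submit_params(unravel_host: str, spark_libs: str = "spark-3.0") -> list:
    """Flatten the same settings into '--conf' spark-submit parameters."""
    params = []
    for key, value in unravel_spark_conf(unravel_host, spark_libs).items():
        params += ["--conf", f"{key}={value}"]
    return params
```

Building both forms from one dict keeps the cluster-level conf and the spark-submit parameters from drifting apart.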
    2. Logging

      Select DBFS as the Destination, and copy the following as the Cluster Log Path.
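If you script cluster setup instead of using the UI, the same logging destination is expressed as the `cluster_log_conf` field of the Clusters REST API. A sketch; the path below is a placeholder, since the actual Cluster Log Path comes from the Setup Databricks with Unravel page:

```python
# Clusters API `cluster_log_conf` fragment for a DBFS destination.
# "dbfs:/<cluster-log-path>" is a placeholder, not the real Unravel path.
cluster_log_conf = {
    "dbfs": {
        "destination": "dbfs:/<cluster-log-path>"
    }
}
```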

    3. Init Script

      In the Init Scripts tab, set Destination to DBFS. Copy the following as the Init script path and click Add.
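The init-script step has an equivalent REST shape as well: in the Clusters API, DBFS init scripts are listed under the `init_scripts` field, which is useful when attaching the script to many clusters in bulk (for example via `POST /api/2.0/clusters/edit`). A sketch with a placeholder path, since the actual Init script path comes from the Setup Databricks with Unravel page:

```python
# Clusters API `init_scripts` fragment for a DBFS-hosted init script.
# "dbfs:/<init-script-path>" is a placeholder, not the real Unravel path.
init_scripts = [
    {"dbfs": {"destination": "dbfs:/<init-script-path>"}}
]
```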