v4.7.6.0 Release notes

Software version

Release date: 02/November/2022

See 4.7.6.0 for download information.

Software upgrade support

The following upgrade paths are supported:

  • v4.7.0.x, v4.7.1.x, v4.7.2.x, v4.7.3.x, v4.7.4.x, v4.7.5.x → v4.7.6.0

  • v4.6.1.9 → v4.7.6.0

  • v4.6.1.8 or earlier → v4.6.1.9 → v4.7.6.0

For instructions to upgrade to Unravel v4.6.1.9, see Upgrading Unravel server.

For instructions to upgrade to Unravel v4.7.6.x, see Upgrading Unravel.

For fresh installations, see Installing Unravel.

Sensor upgrade

  • Sensor upgrade is mandatory for on-prem platforms when you upgrade to Unravel v4.7.6.0 from Unravel v4.7.5.0 or lower versions. Refer to Upgrading Sensors.

Certified platforms

The following platforms are tested and certified in this release:

Cloud platform

  • Databricks (Azure, AWS)

Review your platform's compatibility matrix before you install Unravel.

Updates to Unravel's configuration properties

Updates to upgrading Unravel to v4.7.6.0

  • In previous releases, the bootstrap file generated during the interactive precheck installation was placed in the /tmp/unravel-interactive-precheck directory. In this release, for better security, this file is moved to the $HOME directory of the Unravel user.

    After you upgrade to Unravel v4.7.6.0, ensure that you run the following command to delete the unravel-interactive-precheck directory from the /tmp directory.

    rm -rf /tmp/unravel-interactive-precheck
  • If you have created a JSON file to store the API tokens persistently, you must empty this JSON file after the upgrade. Because the JWT secret changes during the upgrade, the tokens generated earlier are no longer valid.

    Caution

    Ensure that the JSON file is NOT placed in the /tmp directory or any other directory accessed by users other than Unravel users. If the JSON file is in a directory accessed by other users, move it to the $HOME directory of the Unravel user.

    Also, after you move the JSON file to the $HOME directory, configure the persistent API authorization tokens and update the new file path. Refer to the User Guide > Manage > API authorization token > Storing persistent API authorization tokens topic for more details.
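
    A minimal sketch of emptying the token file after the upgrade; the file name unravel_api_tokens.json is a hypothetical example, so use the path you configured in the Unravel user's $HOME directory:

    # Remove stale tokens that were signed with the old JWT secret.
    : > "$HOME/unravel_api_tokens.json"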

  • Reset the global init scripts.

New features

  • Structured streaming

    Unravel now supports Apache Spark structured streaming applications. From the Spark structured streaming application details page, you can now monitor the concurrent structured streaming queries and long-running structured streaming jobs.

    You can view structured streaming queries and track statistics. These metrics help you debug anomalies in query processing. After the streaming query execution is complete, you can view event details.

    For information about using the structured streaming feature, see the Spark structured streaming details page in the User Guide.

  • AutoActions (Databricks)

    • An updated user interface is provided for AutoActions, along with improved usability for Databricks clusters.

      For information about using the AutoActions feature for Databricks, see the AutoActions section in the Databricks User Guide.

  • Security

    • The API tokens and the login tokens in Unravel are JSON web tokens (JWT). The JWT secret, which is used to sign these tokens, should be rotated periodically to increase security. You can now configure rotation of the JWT secret to prevent it from being compromised.

      For information, see the Rotating the JSON web token (JWT) secret section in the Configuration Guide.

    • An API token, which is required for interacting with Unravel APIs, can be deleted when you restart Unravel. You can now configure Unravel to store the token persistently in a JSON file.

      For information, see the Storing persistent API authorization tokens section in the User Guide.
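
      For illustration only, a hedged example of passing a stored token to an Unravel REST API; the endpoint path, port, and Bearer header scheme are assumptions and are not confirmed by this document:

      # Hypothetical call; substitute a real Unravel API path and a token from your JSON file.
      curl -H "Authorization: Bearer <api_token>" "https://<unravel-host>:<port>/api/v1/..."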

    • Passwords and tokens pushed by Unravel onto DBFS can be decrypted with an AES key by using an Unravel manager command.

      For information, see the Decrypting passwords/tokens pushed by Unravel to Databricks DBFS section in the Configuration Guide.

    • An interactive password prompt is provided for the manager commands that require passwords from users.

Improvements and enhancements

  • Installation

    • In the Interactive Precheck installation, the default location of the working directory is moved from /tmp to $HOME. (SEC-39)

    • By default, Heap dump on out-of-memory (OOM) is now disabled. You can re-enable it (if needed) by setting the UNRAVEL_DEBUG_HEAPDUMP=1 environment variable. (INSTALL-2796)
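
      A minimal sketch of re-enabling heap dumps, assuming the variable is exported in the shell from which the manager utility is started:

      # Re-enable heap dump on OOM, then start Unravel.
      export UNRAVEL_DEBUG_HEAPDUMP=1
      <Unravel installation directory>/unravel/manager start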

    • Improved responsiveness of the manager utility. Most commands now respond 2 to 5 times faster. (INSTALL-2771)

    • When you upgrade Unravel from v4.7.6.0 onward, the manager activate command picks the most recent build when only the version is provided. (INSTALL-2755)
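
      For example, a sketch of activating by version only; the exact argument form may differ in your environment:

      # Picks the most recent available build for the given version.
      <Unravel installation directory>/unravel/manager activate 4.7.6.0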

    • The port is now optional for the set-lr-endpoint command. Port 4043 or 4443 is used if TLS is enabled and if the TLS-related switches are used. (INSTALL-2590)

  • UI enhancements (DT-1226)

    • Job page:

      • Renamed the Jobs menu to Workflows

      • Renamed the Runs tab to Job Runs

    • Compute page

      • A new Cost filter: You can now filter Databricks clusters by cost (in dollars). (UIX-4607)

      • New Copy and Download icons on the Configuration tab: In the upper-right corner of the Configuration tab, a Copy icon has been added to copy the cluster configuration, and a Download icon has been added to download the configuration in JSON format. (UIX-4813)

    • Budget page

      • The budget categories (Active, Upcoming, and Expired) are now displayed as tabs on the Cost > Budget page.

    • The Copy API Token text has been replaced with Copy Login Token for the feature that copies the login token to the clipboard. (UIX-4903)

  • AutoAction enhancements

    • AutoAction templates (Databricks clusters and Databricks job runs) are added to the templates section with an enhanced UI for Databricks. (AA-496)

    • A blank template option is added to the AutoAction templates for on-prem clusters. (UIX-4952)

    • A plus (+) button is displayed for the trigger condition for Databricks. (UIX-4953)

    • A new com.unraveldata.auto.action.use.cli.enabled configuration property has been provided to enable the use of the CLI (Command Line Interface) for AutoActions. By default, the property is set to true. You can use it for YARN (on-prem) applications. (ASP-1628)
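
      A minimal sketch of changing this property, assuming the manager utility's config properties set syntax; the exact subcommand may differ, so check the Configuration Guide:

      <Unravel installation directory>/unravel/manager config properties set com.unraveldata.auto.action.use.cli.enabled false
      <Unravel installation directory>/unravel/manager config apply --restart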

  • Other enhancements

    • Provided support for Databricks clusters with spark.databricks.acl.dfAclsEnabled. (DT-1011)

    • For Databricks/tagged workflows, the Daggraph tab is hidden; it is displayed only for Oozie workflows. (DT-1210)

Unsupported

  • Unravel does not support Billing for on-prem platforms.

  • On the Data page, File Reports, Small File reports, and file size information are not supported for MapR, EMR, and Dataproc clusters.

  • Impala jobs are not supported on the HDP platform.

  • Monitoring the expiration of SSL certificates and Kerberos principals is not supported in Unravel multi-cluster deployments.

  • Sustained Violation is not supported for Databricks AutoAction.

The following features are not supported for MapR:

  • Impala applications

  • Kerberos

  • The following features are not supported on the Data page:

    • Forecasting

    • Small Files

    • File Reports

  • The following reports are not supported on MapR:

    • File Reports

    • Small Files Report

    • Capacity Forecasting

    • Migration Planning

    The Tuning report is supported only for MR jobs.

  • Migration Planning

  • AutoAction is not supported for Impala applications.

  • Migration

  • Billing

  • Insights Overview

  • Unravel does not support the Insights Overview tab on the UI for the Amazon EMR platform.

  • Migration planning is not supported for the following regions for Azure Data Lake:

    • Germany Central (Sovereign)

    • Germany Northeast (Sovereign)

  • Forecasting and Migration: In a multi-cluster environment, you can configure only a single cluster at a time. Hence, reports are generated only for that single cluster.

  • Migration planning is not supported for MapR.

  • Unravel does not support multi-cluster management of combined on-prem and cloud clusters.

  • Unravel does not support apps that belong to the same pipeline but are sourced from different clusters in a multi-cluster environment. A pipeline can only contain apps that belong to the same cluster.

  • Except for the TopX report, reports are not supported on Databricks and EMR.

  • In Jobs > Sessions, the feature of applying recommendations and running the newly configured app is not supported.

  • Pig and Cascading applications are not supported.

Bug fixes

  • Applications

    • When executing a Spark job that exceeds the maximum result size at the driver, an incorrect value is displayed for the Getting Results metrics on the Stages > Timeline tab as compared to the value in the Spark UI. (ASP-1594)

    • A possible loss of data occurs when multiple threads are started in a single Spark application. (CDI-633)

  • AutoActions

    • When creating AutoActions, if you have specified comma-separated email IDs, an email alert is not sent. (AA-482)

    • The AutoActions page redirects to the Unravel home page. (SUPPORT-1549)

  • Cost

    • Some apps show higher costs on the Compute page while in a running state. (SUPPORT-1502)

  • Databricks

    • Azure Databricks clusters and jobs are randomly missing on the Unravel UI due to Azure Databricks File System (DBFS) mount issues. (DT-1183)

    • DBFS issues occur when running the global init scripts. (CDI-635)

    • The Copy URL option has been removed from the Cost > Chargeback page because the option did not copy any URL. (DT-1198)

    • When the run comes from a nested notebook in Databricks, the value for the Run ID is displayed as 1 on the Jobs > Runs page. As a result, the Start Time is displayed as zero (0). (DT-1284)

  • Data page

    • When a dataset contains more than 1000 tables, the partition query throws an error, and table_worker stops processing the remaining datasets and projects. (DATAPAGE-592)

  • EMR

    • Terminated clusters can appear within the active clusters list on the Active tab of the Clusters page. (EMR-446)

  • Insights

    • The SlowSqlStage event shows wrong data for the Shuffled Data parameter. (INSIGHTS-335)

  • Installation

    • Auto-configuration: Configurations are correctly generated for database access over SSL. (INSTALL-2763)

    • Ansible script fails on Unravel multi-node setup. (INSTALL-2749)

    • The Interactive precheck bootstrap fails when configuring an external MySQL database on HDP. (INSTALL-2730)

  • TopX report

    • The Top-X report fails when there is a ' symbol in the cluster ID. (REPORT-2058)

  • UI

    • Unable to disable the Insights Overview tab using the com.unraveldata.valuedashboard.enabled property. (UIX-5032)

    • On the value dashboard, in the Top Clusters by Cost section, if two workspaces have the same cluster name, then the cost is aggregated into one sum. (UIX-5020)

    • An incorrect cluster count is displayed on the Chargeback page, and the tooltip is missing for cluster names. Sometimes, the tooltip does not show full text for any cluster. (UIX-4938)

    • The projected cost end date points to the first day of next month. (UIX-4931)

  • The Workflow/Job page displays empty Analysis, Resources, Daggraph, and Errors tabs. (DT-1093)

  • Event logs and YARN logs are not loaded for some applications in Google Dataproc clusters. (PG-170)

  • When multiple AutoAction policies are created with overlapping rule sets and scopes, only one of the policies is triggered. (AA-498)

  • App store tasks fail to start with SSL. (APP-614)

    Workaround

    To resolve this issue, do the following:

    1. Stop Unravel.

      <Unravel installation directory>/unravel/manager stop
      
    2. Use an editor to open the <Unravel installation directory>/unravel/data/conf/unravel.yaml file.

    3. In the unravel.yaml file, under the database > advanced > python_flags block, enter the path to the trusted certificates. For example, if Unravel is installed at /opt/unravel, you must edit the unravel.yaml file as follows:

      unravel:
      ...snip...
        database:
      ...snip...
          advanced:
            python_flags:
              ssl_ca: /opt/unravel/data/certificates/trusted_certs.pem
    4. Use the manager utility to upload the certificates.

      <Unravel installation directory>/unravel/manager config tls trust add --pem /path/to/certificate

      For example: /opt/unravel/manager config tls trust add --pem /path/to/certificate

    5. Enable the Truststore.

      <Unravel installation directory>/unravel/manager config tls trust enable
    6. Apply the changes and restart Unravel.

      <Unravel installation directory>/unravel/manager config apply --restart
      
  • Incorrect data is displayed in the Number of Queries KPI/Trend graph on the Overview page. (DATAPAGE-502)

  • Wrong data is displayed for the Number of Partitions Created KPI/trend graph under the Partitions KPIs - Last Day section on the Data page. (DATAPAGE-473)

  • Errors and Logs data are missing in specific scenarios for a failed Spark application, such as an application failing with OutOfMemoryError. (ASP-1624)

  • On the Chargeback page, when you group by clusters, Unravel groups a maximum of 1000 clusters. (SUPPORT-1570)

  • On the Cost > Trends and Cost > Chargeback pages, the tooltip for the Last <number of> days field includes more days than the displayed days. (UIX-5042)

  • When the Interactive cluster is restarted, the cluster count is increased on the Databricks Cost > Trends page. (DT-1275)

  • Duplicate job runs (with the same run IDs) are generated on the Job Runs page. (DT-1190)

  • After navigating from Trends and Chargeback pages with Tag filters, the No data available message is displayed on the Compute page. (DT-1094)

  • Inconsistent data is displayed for the cluster Duration and Start Time on the Compute page. (ASP-1636)

  • The DriverOOME and ExecutorOOME events are not generated for the Databricks notebook task. (DT-533)

  • When a job fails to submit a Spark application, the failed Databricks job is missing from the Unravel UI. (ASP-1427)

  • In Databricks, when a job in a workflow fails and a new job is launched instead of a new attempt, the new job cannot be part of the same workflow. (PG-269)

  • In the Databricks view, the application is shown in a running state, even though the corresponding Spark application is marked as finished. (ASP-1436)

  • Google Cloud Dataproc: Executor Logs are not loaded for Spark applications. (PG-229)

  • The workflow of multiple transient clusters (EMR) is not supported. (ASP-1424)

  • Unable to run Spark applications on all the master nodes after Unravel bootstrap for high availability clusters. (EMR-49)

  • Unravel node fails to send email notifications. (INSTALL-1694)

  • An exception occurs when installing Unravel version 4.7.6.0 with the Azure MySQL database (SSL Enabled). (INSTALL-2799)

  • During precheck and healthcheck, the Hadoop check fails for the MapR cluster. You can ignore the messages. (INSTALL-2603)

  • The Insights Overview tab uses UTC as the timezone, while other pages use local time. Hence, the date and time shown on the Insights Overview tab and the other pages after redirection can differ. (UIX-4176)

  • Kerberos can only be disabled manually from the unravel.yaml file.

     kerberos:
          enabled: False
  • WorkloadFit report

    • A large number of tags can cause the Workload Fit report to fail. (PG-265, CUSTOMER-2084)

    • WorkloadFit report > Heatmap: The job count has data, but Vcore and memory are empty. (MIG-262)

  • Cluster discovery

    • The On-prem Cluster Identity might show an incorrect Spark version on CDH. The report may incorrectly show Spark 1 when Spark 2 is installed on the CDH cluster. (REPORT-1702)

  • TopX report

    • The TopX report email links to the Unravel TopX report instead of showing the report content in the email as in the old reports.

  • Queue analysis:

    • The log file name (unravel_us_1.log) displayed in the error message is incorrect. The correct name of the log file is unravel_sensor.log. (REPORT-1663)

  • Cloud Mapping Per Host report scheduled in v4.6.1.x does not work in v4.7.1.0. Users must schedule a new report. (REPORT-1886)

  • When using PostgreSQL, the % sign is duplicated and displayed in the Workload Fit report > Map to single cluster tab. (MIG-42)

  • A blank page is displayed on the Databricks Run Details page for Spark structured streaming applications. (ASP-1629)

  • If the Spark job is not running for Databricks, the values for the Duration and End time fields are not updated on the Databricks Run Details page. (ASP-1616)

  • You can see a lag for SQL Streaming applications. (PLATFORM-2764)

  • If the customer uses Active Directory for Kerberos and the samAccountName and principal do not match, this can cause errors when accessing HDFS. (DOC-755)

  • For PySpark applications, the processCPUTime and processCPULoad values are not captured properly. (ASP-626)

  • The SQL events generator generates an SQL Like clause event if the query contains a LIKE pattern, even within literals. (TEZLLAP-349)

  • After upgrading from v4.7.1.1 to v4.7.5.0, the Hive jobs running with the Tez application as an execution engine are not linked. (EMR-406)

  • After upgrading to v4.7.1.0, Notebooks do not work. You can configure them separately. (REPORT-1895)

  • After upgrading from v4.6.x to v4.7.1.0, the Tez application details page does not initially show DAG data. The DAG data is visible only after you refresh the page. (ASP-1126)

  • You can access the new user interface (UI) only from Chrome.

  • On the App summary page for Impala, the Query > Operator view is visible only after scrolling down. (UIX-3536)

  • Issue: When you return from the application details > SQL tab > Stage page to the application details > Attempt page, the Duration, Data I/O, and Jobs Count fields are not displayed. (UIX-5048)

    Workaround: You can reload or refresh the application details > Attempt page to restore all the KPIs.

  • Jobs are falsely labeled as a Tez App for Oozie Sqoop and Shell actions. (PLATFORM-2403)

  • The following manager command to generate an API token is deprecated. You can create the API tokens from the Unravel UI (Manage > API tokens).

    manager generate api_token

Support

For support issues, contact Unravel Support.