Databricks Release Notes

v4.7.9.4 Release notes

Release information

  • Release date: 22 February 2024

  • Software: Downloads

  • Configuration properties: Properties

Announcements

Postgres upgrade to version 15.5

The bundled Postgres database has been upgraded to version 15.5. This version supports new installations and upgrades on all platforms. If any database errors are encountered during installation or upgrade, please reach out to Unravel support for assistance.

New features
  • Seamless integration with Azure billing

    Unravel has transitioned from using an approximation algorithm to compute costs for all Databricks entities to integrating with the Azure billing APIs. Starting with this release, this integration provides DBU usage, DBU cost, and VM costs that match the cloud provider's values.

  • New Home page with insights

    A new Home page, introduced in this release, offers detailed insights into your cloud environment.

    • Easily assess your cloud spending, resource utilization, and potential savings opportunities at a glance.

    • With intuitive visuals and actionable data, make informed decisions to optimize your cloud resources effectively.

    • Explore TopX to identify top cost drivers and performance bottlenecks, and delve into optimization strategies to enhance efficiency and drive savings.

  • New Unravel Billing page

    Unravel introduces a new Billing page in this release to support our new pricing model. Previously, Unravel employed a flat pricing model, charging customers a fixed rate across compute types. Recognizing the evolving landscape of compute type usage and the need for greater flexibility and accuracy in billing, Unravel has introduced a new pricing strategy aligned with market trends: customers are now charged a different price for each compute type.

  • UI enhancements in the Cost Explorer page

    Experience more intuitive navigation on the Cost Explorer page, previously known as the Cost page. With renamed pages and refined table headers, finding cost-related insights is easier than ever. Drill down into your cloud spending data and delve deeper into cost allocation and resource utilization.

The following key issues are addressed in the 4.7.9.4 release.

App Store

  • APP-774: External Elasticsearch integration is not supported within the App Store environment.

  • APP-775: Databricks cost anomaly detection did not function as expected when integrated with external Elasticsearch.

Cost

  • UIX-6310: On the Chargeback page, when no tag is provided for an application, NULL is displayed. Upon redirection from the Optimize link for the NULL tag, the Compute page shows all applications for the selected duration instead of specifically displaying applications with no tags.

Compute

  • DT-2094: The cluster ID is displayed instead of the cluster name for certain clusters.

Spark

  • DT-2141: In the Program tab, when you click the line number, the corresponding line of code is not highlighted on the Spark details App Summary page.

  • DT-1404: Jobs created for the PySpark application using User-Defined Functions on a job cluster fail after applying the recommendations for node downsizing.

  • PLATFORM-2764: There is a lag for SQL Streaming applications.

  • UX-632: The timeline histogram does not generate correctly on the Spark application details page.

Workflows

  • PIPELINE-1626, PIPELINE-1946: Certain Azure Databricks jobs are missing and duplicate entries appear in Databricks workflows under specific circumstances.

The following key fixes are planned for upcoming releases to enhance the user experience. While these issues exist, they have no immediate critical impact, and you can continue to use the product with confidence.

Cost

  • UIX-6305: The Others category is displayed twice in the legend when the number of clusters exceeds 1000 on the Chargeback page.

Compute

  • DT-2083: The Total Allocated Key Performance Indicators (KPIs) for Vcore and memory are not visible on the Compute > Trends page.

  • UIX-6321: All jobs in the running status are displayed in the Finished tab under Job Runs instead of only the finished jobs.

Insights

  • DT-2006: Recommendations are provided for a failed pipeline when users run multiple tasks with shared job clusters and one of the tasks fails.

  • DT-2125: The UI shows a cost discrepancy for the Executor Idle time detected insight on Databricks version 14.2 with Photon enabled.

Spark

  • DT-1742: The timezone for the NodeRightSizing insight event is inconsistent on the Spark details page.

  • DT-2029: Applications in a success state may inaccurately display an associated job in a running state instead of transitioning to a failed state.

  • UIX-6523: The Sort by Write feature is currently not functioning as expected on the Spark details page.

SaaS (Free)

  • DT-2037: In the Databricks Standard (free) environment, the User Flow badge obstructs pagination.

UI

  • UIX-6281: The cost comparison for all the instances is not displayed on the Pipeline details page.

Workflows

  • DT-2104: Sorting is incorrect when the list contains strings starting with both uppercase and lowercase letters.

  • Billing

    Some discrepancies may occur in cost calculations due to differences between the user time zone displayed on the Compute page and the UTC-based aggregation on the Billing page. (DT-2350)

  • Compute

    The Jobs by status graphs in the Trends tab display Spark application details and not the job details. Our development team is actively looking into this design limitation, and efforts are underway to address it in future updates. (DT-2008)

  • Data

    If tables are created with the same name, accessed, deleted, and re-created, and if those tables are re-accessed, then their query and app count do not match. (DATAPAGE-502)

  • Home

    Home page does not display alerts on the UI when there is missing ROI data for a single day. (DT-2509)

    Hovering on Total Cost Trend on the Summary tab of the Home page may display inaccurate date information. (DT-2408)

  • Workflows

    An incorrect run count is displayed for a job ID when sorting by run count in the Workflows > Jobs section. Our development team is investigating this discrepancy and actively working to resolve it. (UIX-6526)

Our development team is actively investigating the following known issues and working to resolve them. While these issues exist, they have no immediate critical impact, and you can continue to use the product with confidence. Unless a workaround is noted for an issue, none is currently available.

App Store

  • APP-614: App Store tasks fail to start with SSL enabled on the MySQL database. (See the workaround below.)

Compute

  • PIPELINE-1636: Inconsistent data is displayed for the cluster Duration and Start Time on the Compute page.

Cost

  • UIX-5624: Data is not displayed when you click the Optimize button corresponding to OTHERS for the Cost > Chargeback results shown in the table.

  • DT-1094: The No data available message is displayed on the Compute page after navigating from the Trends and Chargeback pages with Tag filters.

Datapage

  • DATAPAGE-473: For Hive metastore 3.1.0 or earlier versions, the creation time of partitions is not captured if a partition is created dynamically. Therefore, the Last Day KPI for the partition section is not shown in Unravel.

Insights

  • DT-1987: There is a mismatch in the computation of costs for fleet and spot instances in Databricks clusters. This issue arises because the exact node type is unavailable in the cluster info response.

Performance

  • PIPELINE-1926: The Insight Worker daemon is experiencing performance lag, causing delays in processing insights and data analytics tasks.

  • ASI-933: In the Lag setup, the Duration is not updated for running applications. The Duration should be updated every 15 minutes.

  • ASI-936: In the Lag setup, the App Time data is missing in the Timing tab of many applications.

Spark

  • PIPELINE-1616: If the Spark job is not running for Databricks, the values for the Duration and End time fields are not updated on the Databricks Run Details page.

  • DT-2012: Incorrect details are displayed on the AppSummary > Job Run page when a user repairs a previously failed job. The displayed information may not accurately reflect the repaired job's details.

UI

  • PIPELINE-1935: On the Pipeline details page, when you select the data for a specific date, all instances are displayed instead of only the instances within the selected date.

  • UIX-6321: In the Workflow section, jobs running within the selected duration are displayed instead of only the jobs completed within the selected time frame.

  • UIX-6263: The cross button on the Pipeline details page does not close the details page when you click the bars inside the Gantt chart.

Workflows

  • DT-1461, PIPELINE-1939, PIPELINE-1940, DT-1093, UIX-6274, PIPELINE-1924: The UI and data exhibit inconsistencies, including problems with job run details, issues related to multiple workflow runs and UTC timestamps, empty content in workflow job pages, and issues with filter values and duration display.

App Store

App Store tasks fail to start with SSL enabled on the MySQL database. (APP-614)

  1. Stop Unravel.

    <Unravel installation directory>/unravel/manager stop
  2. Use an editor to open the <Installation_directory>/unravel/data/conf/unravel.yaml file.

  3. In the unravel.yaml file, under the database > advanced > python_flags block, enter the path to the trusted certificates. For example, if Unravel is installed at /opt/unravel, you must edit the unravel.yaml file as follows:

    unravel:
      ...snip...
      database:
        ...snip...
        advanced:
          python_flags:
            ssl_ca: /opt/unravel/data/certificates/trusted_certs.pem
  4. Use the manager utility to upload the certificates.

    <Unravel installation directory>/manager config tls trust add --pem /path/to/certificate

    For example: /opt/unravel/manager config tls trust add --pem /path/to/certificate

  5. Enable the Truststore.

    <Unravel installation directory>/manager config tls trust enable
  6. Apply the changes and restart Unravel.

    <Unravel installation directory>/unravel/manager config apply --restart

v4.7.9.3 Release notes

Software version

Release date: January 25, 2024

See v4.7.9.3 for download information.

See also Unity App release notes

Software upgrade support

The following upgrade paths are supported:

  • 4.7.9.2 → 4.7.9.3

  • 4.7.8.0 Hotfix → 4.7.9.3

  • 4.7.8.0 → 4.7.9.3

  • 4.7.x (Databricks) → 4.7.9.3

For instructions to upgrade to Unravel v4.7.9.3, see Upgrade to Unravel 4.7.9.3.

For fresh installations, see Deploy Unravel.

Announcements
  • End of Support Announcement for RHEL 6

    Red Hat Enterprise Linux 6 (RHEL 6) is no longer supported with Unravel. If you are currently using RHEL 6, Unravel recommends that you plan an upgrade to a supported operating system to continue receiving updates and support. Contact support for any further assistance.

  • CPU Speed host metrics collection is not supported

    Starting from the 4.7.9.3 release, we have deprecated the collection of CPU Speed host metrics.

Certified platforms

The following platforms are tested and certified in this release:

  • Databricks (Azure, AWS)

Review your platform's compatibility matrix before you install Unravel.

Updates to Unravel's configuration properties
Updates to upgrading Unravel to v4.7.9.3
  1. Go to {unravel_install_dir}/versions/{unravel_version}/core/etc/dbx/cost

  2. Copy the following files:

    • prices_workload_tier_aws.tsv

    • prices_workload_tier_azure.tsv

  3. Paste the copied files and replace the existing files in the following location, as shown in the example after these steps:

    {unravel_install_dir}/data/conf/cost
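
For example, here is a minimal sketch of steps 1 through 3 as shell commands, using the same placeholders as above (substitute your actual installation directory and version); cp -f overwrites the existing files:

    # Go to the directory that ships the updated pricing files.
    cd {unravel_install_dir}/versions/{unravel_version}/core/etc/dbx/cost

    # Replace the existing files under data/conf/cost with the copies.
    cp -f prices_workload_tier_aws.tsv prices_workload_tier_azure.tsv \
        {unravel_install_dir}/data/conf/cost/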

The insights_upgrade.sh script must be run after the upgrade. This script performs the following tasks:

  • Deletes older RealTimeLightProcessorEvent entries from the database and elasticsearch index.

  • Regenerates new NodeRightSizing events for certain clusters.

  1. Go to {unravel_install_dir}/unravel/services/insights_worker_1_1

  2. Run the insights_upgrade.sh script, as shown in the example below.
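
For example, assuming the same placeholder layout as above, the script can be run like this (invoked with bash in case the execute bit is not set):

    cd {unravel_install_dir}/unravel/services/insights_worker_1_1

    # Deletes older RealTimeLightProcessorEvent entries and regenerates
    # NodeRightSizing events for certain clusters, as described above.
    bash insights_upgrade.sh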

New features
  • Healthcheck ROI report

    A new Healthcheck ROI report is launched and is available as an App Store app. The app provides a comprehensive view of the Databricks environment, focusing on performance, costs, and potential savings. With this app, you can get insights into daily costs, hierarchical cost distribution, user, workspace, cluster, and job metrics. You can identify opportunities for workload optimization, worker resource classification, and migration savings through detailed analytics and recommendations. You can also have a holistic view of cluster metrics, including session costs, wastage analysis, and potential migration savings.

  • Support for Databricks 13.x and above

    Databricks Runtime 13.x and above is supported from this release.

    Note

    Databricks does not provide Ganglia metrics for Databricks Runtime 13 and above. Unravel now gathers all host-level metrics in real time from the /proc filesystem (see the illustrative commands after this list). There might be variations between the metrics-collection approaches of Unravel and Databricks itself.

  • Observability on Databricks SaaS is available with Standard (Free) tier

    Unravel has introduced observability on Databricks SaaS for free. You can now access essential observability features at no cost, allowing you to monitor your Databricks environment without incurring additional charges.
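
As an illustration of the kind of host-level metrics exposed through the /proc filesystem, the following commands read CPU and memory counters on any Linux host. This is generic Linux behavior shown for context, not Unravel's actual collector:

    # Aggregate CPU time counters (user, nice, system, idle, ...) in USER_HZ ticks.
    grep '^cpu ' /proc/stat

    # Total and available memory, in kB.
    grep -E '^(MemTotal|MemAvailable):' /proc/meminfo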

Improvements and enhancements
  • Improved On-demand Insights

    The on-demand Insights feature is now significantly faster, providing users with access to the most recent and relevant insights within 10 seconds, and enabling an intuitive comparison of resources that facilitates quick decision making. This update improves the user experience by streamlining the process of obtaining valuable insights.

  • Python upgrade

    In this release, Python is upgraded to version 3.8.12.

  • Backend updates to improve performance

    This release includes significant backend improvements aimed at enhancing overall system performance. These updates contribute to a more responsive and efficient system, ensuring a smoother experience.

The following key issues are addressed in the 4.7.9.3 release.

App Store

  • IMP-1089: Incorrect duration values are noted for the Interesting Apps data in specific applications.

Compute

  • IMP-1239: The parsing logic for driver host metrics on the Spark Details page of Compute has been modified.

Insights

  • DT-1519: The Nodedownsizing event recommends a $0 cost saving for a successful job.

  • IMP-1217: Streaming applications are incorrectly generating RealtimeLightProcessor insights.

  • IMP-1272: An exception occurs while fetching feature data from the feature store.

Jobs

  • PIPELINE-1982: In a Spark application, there is a discrepancy in the displayed name on the Jobs page.

Kafka

  • CPLANE-2649: The Refresh Kafka command failed to start Zookeeper before initiating Kafka, resulting in an incomplete initialization.

Security

  • CUSTOMER-2584: The bind password is exposed in plain text within the AutoAction (AA) logs.

Sensor

  • CPLANE-3427: A java.lang.NumberFormatException occurs in the Unravel sensor logs.

Workflows

  • CUSTOMER-2544: The sort functionality in the cost filter under the Workflow tab is not functioning as expected.

  • PIPELINE-2021: The cost filter under the Workflow tab is not functioning as expected.

The following key fixes are planned for upcoming releases to enhance the user experience. While these issues exist, they have no immediate critical impact, and you can continue to use the product with confidence.

Cost

  • DT-1879, DT-1871, DT-1853: The following issues are observed on the Budget page:

    • Redirection to the Compute page displays empty data for selected tags.

    • Redirection to the Chargeback page fails to populate scope filters correctly.

    • An incorrect date range is selected when redirecting to other pages from the Budget page.

  • UIX-6305: The Others category is displayed twice in the legend when the number of clusters exceeds 1000 on the Chargeback page.

Compute

  • DT-2094: The cluster ID is displayed instead of the cluster name for certain clusters.

  • DT-2079: The cluster cost displayed does not match the Azure billing report in some scenarios.

  • DT-2083: The Total Allocated Key Performance Indicators (KPIs) for Vcore and memory are not visible on the Compute > Trends page.

  • UIX-6321: All jobs in the running status are displayed in the Finished tab under Job Runs instead of only the finished jobs.

Insights

  • DT-2006: Recommendations are provided for a failed pipeline when users run multiple tasks with shared job clusters and one of the tasks fails.

  • DT-2125: The UI shows a cost discrepancy for the Executor Idle time detected insight on Databricks version 14.2 with Photon enabled.

Reports

  • DT-1841: The TopX Report displays an incorrect count of events.

Spark

  • DT-1742: The timezone for the NodeRightSizing insight event is inconsistent on the Spark details page.

  • DT-2012: Incorrect details are displayed on the AppSummary > Job Run page when a user repairs a previously failed job. The displayed information may not accurately reflect the repaired job's details.

  • DT-2029: Applications in a success state may inaccurately display an associated job in a running state instead of transitioning to a failed state.

  • DT-2141: Clicking the line number in the Program tab does not highlight the corresponding line of code on the Spark details App Summary page.

  • UIX-6523: The Sort by Write feature is currently not functioning as expected on the Spark details page.

SaaS (Free)

  • DT-2037: In the Databricks Standard (free) environment, the User Flow badge obstructs pagination.

Workflows

  • DT-2104: Sorting is incorrect when the list contains strings starting with both uppercase and lowercase letters.

  • Compute

    The Jobs by status graphs in the Trends tab display Spark application details and not the job details. Our development team is actively looking into this design limitation, and efforts are underway to address it in future updates. (DT-2008)

  • Workflows

    An incorrect run count is displayed for a job ID when sorting by run count in the Workflows > Jobs section. Our development team is investigating this discrepancy and actively working to resolve it. (UIX-6526)

Our development team is actively investigating the following known issues and working to resolve them. While these issues exist, they have no immediate critical impact, and you can continue to use the product with confidence. Unless a workaround is noted for an issue, none is currently available.

App Store

  • APP-614: App Store tasks fail to start with SSL enabled on the MySQL database. (See the workaround below.)

Compute

  • PIPELINE-1636: Inconsistent data is displayed for the cluster Duration and Start Time on the Compute page.

Cost

  • UIX-5624: Data is not displayed when you click the Optimize button corresponding to OTHERS for the Cost > Chargeback results shown in the table.

  • DT-1094: The No data available message is displayed on the Compute page after navigating from the Trends and Chargeback pages with Tag filters.

  • UIX-6310: On the Chargeback page, when no tag is provided for an application, NULL is displayed. Upon redirection from the Optimize link for the NULL tag, the Compute page shows all applications for the selected duration instead of specifically displaying applications with no tags.

Datapage

  • DATAPAGE-502: If tables are created with the same name, accessed, deleted, and re-created, and if those tables are re-accessed, then their query and app count do not match.

  • DATAPAGE-740: The query to fetch tableDailyKPIs times out when dealing with a huge table partition of 27 million records. From a threshold perspective, the API has been verified to function without issues for partition sizes up to 18 million.

  • DATAPAGE-473: For Hive metastore 3.1.0 or earlier versions, the creation time of partitions is not captured if a partition is created dynamically. Therefore, the Last Day KPI for the partition section is not shown in Unravel.

Insights

  • DT-1987: There is a mismatch in the computation of costs for fleet and spot instances in Databricks clusters. This issue arises because the exact node type is unavailable in the cluster info response.

  • UIX-5127, INSIGHTS-324, UIX-4176: Link redirection issues, such as incorrect data filters for viewing Top Groups by Cost and Top Clusters by Cost, as well as missing redirection links in the App Acceleration section.

Performance

  • PIPELINE-1926: The Insight Worker daemon is experiencing performance lag, causing delays in processing insights and data analytics tasks.

  • ASI-933: In the Lag setup, the Duration is not updated for running applications. The Duration should be updated every 15 minutes.

  • ASI-936: In the Lag setup, the App Time data is missing in the Timing tab of many applications.

Spark

  • DT-1404: Jobs created for the PySpark application using User-Defined Functions on a job cluster fail after applying the recommendations for node downsizing. (See the workaround below.)

  • PIPELINE-1616: If the Spark job is not running for Databricks, the values for the Duration and End time fields are not updated on the Databricks Run Details page.

  • PLATFORM-2764: You can see a lag for SQL Streaming applications.

  • UX-632: The timeline histogram is not generated correctly on the Spark application details page.

  • PIPELINE-626: For PySpark applications, the processCPUTime and processCPULoad values are not captured properly.

UI

  • UIX-5581: The job run count displayed on the Chargeback page differs from the job count shown on the Workflow page.

  • PIPELINE-1935: On the Pipeline details page, when you select the data for a specific date, all instances are displayed instead of only the instances within the selected date.

  • UIX-6281: The cost comparison for all the instances is not displayed on the Pipeline details page.

  • PIPELINE-1934: On the Pipeline details page, the arrows point to all the runs instead of only to the latest run.

  • UIX-6321: In the Workflow section, jobs running within the selected duration are displayed instead of only the jobs completed within the selected time frame.

  • UIX-6263: The cross button on the Pipeline details page does not close the details page when you click the bars inside the Gantt chart.

  • UIX-3536: In the App summary page for Impala, the Query > Operator view is visible after scrolling down.

Workflows

  • DT-1461, PIPELINE-1939, PIPELINE-1940, DT-1093, UIX-6274, PIPELINE-1924: The UI and data exhibit inconsistencies, including problems with job run details, issues related to multiple workflow runs and UTC timestamps, empty content in workflow job pages, and issues with filter values and duration display.

  • PIPELINE-1626, PIPELINE-1946: The Unravel UI may miss some Azure Databricks jobs and show duplicate entries in Databricks workflows in certain scenarios.

App Store

App Store tasks fail to start with SSL enabled on the MySQL database. (APP-614)

  1. Stop Unravel.

    <Unravel installation directory>/unravel/manager stop
  2. Use an editor to open the <Installation_directory>/unravel/data/conf/unravel.yaml file.

  3. In the unravel.yaml file, under the database > advanced > python_flags block, enter the path to the trusted certificates. For example, if Unravel is installed at /opt/unravel, you must edit the unravel.yaml file as follows:

    unravel:
      ...snip...
      database:
        ...snip...
        advanced:
          python_flags:
            ssl_ca: /opt/unravel/data/certificates/trusted_certs.pem
  4. Use the manager utility to upload the certificates.

    <Unravel installation directory>/manager config tls trust add --pem /path/to/certificate

    For example: /opt/unravel/manager config tls trust add --pem /path/to/certificate

  5. Enable the Truststore.

    <Unravel installation directory>/manager config tls trust enable
  6. Apply the changes and restart Unravel.

    <Unravel installation directory>/unravel/manager config apply --restart
Spark

Jobs created for the PySpark application using User-Defined Functions on a job cluster fail after applying the recommendations for node downsizing. (DT-1404)

  1. In your Databricks workspace, go to Configure Cluster > Advanced Options > Spark config.

  2. Add the following property, set to true, to both the spark.driver.extraJavaOptions and spark.executor.extraJavaOptions Spark configurations:

    • -Dcom.unraveldata.metrics.proctree.enable=true

      For example:

      spark.executor.extraJavaOptions -Dcom.unraveldata.metrics.proctree.enable=true -javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=executor,libs=spark-3.0
      spark.driver.extraJavaOptions -Dcom.unraveldata.metrics.proctree.enable=true -javaagent:/dbfs/databricks/unravel/unravel-agent-pack-bin/btrace-agent.jar=config=driver,script=StreamingProbe.btclass,libs=spark-3.0
  • App-store does not support PostgreSQL over SSL.

  • Sustained Violation, a type of violation that triggers the AutoAction, is not supported in AutoActions for Databricks.

  • Reports other than the TopX report are not supported on Databricks.

Red Hat Enterprise Linux 6 (RHEL 6) is no longer supported with Unravel. If you are currently using RHEL 6, Unravel recommends that you plan an upgrade to a supported operating system to continue receiving updates and support. Contact support for any further assistance.

Starting from the 4.7.9.3 release, the collection of CPU Speed host metrics is deprecated.

Support

For support issues, contact Unravel Support.