

v4.3.1.9 Release notes

Software version

Release Date: 11/05/2018

For details on downloading updates see Downloads.

Certified platforms
  • HDP: On-premise (up to v2.6.3) with Kerberos + SPNEGO enabled.

  • MapR: 6.0.0 with MapR Expansion Packs 4.1.1

  • CDH: On-premise 5.13 & 5.14 (including Hive 2.3.2 and Spark 2.3.0) with Kerberos

Unravel sensor upgrade
  • No

New features
  • None

Improvements and bug fixes
  • Special characters are now allowed in Workflow names. (CUSTOMER-377)

  • Added Hive on Spark Support in Airflow. (PLATFORM-770)

  • Fixes for Airflow Monitoring issues under Workflow. (CUSTOMER-304 & PLATFORM-772)

Bug fixes
  • Airflow WorkFlow

    • Auto action policy now triggers only specified workflows. (CUSTOMER-400)

    • Miscellaneous issues with the Workflow compare graph are fixed. (UIX-1448)

    • Fixes for AutoActions on Airflow Workflows. (PLATFORM-852)

    • MR jobs associated with a workflow were missing the WFI icon/link in the Application tab. (CUSTOMER-274)

    • The Application tab filter for Pending status was irrelevant; a filter for Accepted status was added. (CUSTOMER-278)

Known issues
  • Unravel does not support "Hive-on-Spark" out of the box, but will show all related Hive and Spark jobs triggered as part of the "Hive-on-Spark" job.

  • Some Spark apps are not showing up as part of Airflow workflow. (PLATFORM-780)

  • Out of memory when polling a large number of past Airflow instances. (PLATFORM-913)

  • The Add workflow pop-up hangs for 2000+ workflows. (REPORT-116)

  • Cluster Summary and Cluster compare hang for large datasets. (REPORT-117)

  • Date-time picker is missing for reports. (UIX-1201)

  • RBAC: Issue with view by details in infrastructure. (UIX-942)

  • Workflow Pagination issues. (UIX-1339)

  • The Add workflow pop-up hangs for 2000+ workflows. (PLATFORM-882)

Configuration properties
  • com.unraveldata.airflow.http.max.body.size.byte: Maximum size (bytes) of response from the HTTP calls to Airflow. 0 = unlimited. Default=0.

  • com.unraveldata.airflow.status.timeout.sec: Maximum time (seconds) Unravel waits to hear back from Airflow server for any RUNNING workflow. When the time is exceeded Unravel updates the workflow status to UNKNOWN. Default=3600 (1 hour).

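As a sketch, these properties could be set in Unravel's properties file under /usr/local/unravel/etc/ (the exact filename, e.g. unravel.properties, may vary by install; the values below are illustrative only, the property names are from the notes above):

```
# Illustrative values, not recommendations.
# Cap Airflow HTTP response bodies at 64 MB instead of unlimited (0):
com.unraveldata.airflow.http.max.body.size.byte=67108864
# Mark RUNNING workflows UNKNOWN after 30 minutes without a response
# instead of the default 3600 seconds:
com.unraveldata.airflow.status.timeout.sec=1800
```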

Software upgrade support
  • Support for upgrade from 4.2.6 (4.2.6-1128) or 4.2.7 (4.2.7-1154)

  • Support for upgrade from 4.3.0.X to this release.

    Upgrading from 4.2.6 or 4.2.7 with this RPM causes a background process to run for about 15 minutes, during which data is migrated in the unravel_s_1 daemon. To see the status of this migration, check /usr/local/unravel/install_bin/es_migrate_4.3.1.0_from42.out and /usr/local/unravel/logs/es_migrate_4.3.1.0_from42.log. The migration can be re-run using /usr/local/unravel/install_bin/ (download from here) in the background with nohup. When the migration finishes successfully, disk space can be reclaimed by running /usr/local/unravel/install_bin/ (download here).

    An RPM upgrade triggers a temporary background process so changes can take effect. You can monitor it, or wait for it to finish, with /usr/local/unravel/install_bin/ , which prints DONE when it finishes. It takes anywhere from 2 to 15 minutes, depending on your data size.
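    The notes above say the wait script prints DONE when the background work completes. As an illustrative sketch (this helper and its marker check are not part of the product), you could poll the migration's output file for that marker:

    ```shell
    # Sketch: report whether a migration output file contains the DONE marker.
    # The helper name is illustrative; the DONE marker is from the notes above.
    check_done() {
      # $1: path to the migration .out/.log file
      if grep -q 'DONE' "$1" 2>/dev/null; then
        echo finished
      else
        echo running
      fi
    }

    # Example (path from the release notes):
    # check_done /usr/local/unravel/install_bin/es_migrate_4.3.1.0_from42.out
    ```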

    After upgrade:

    Run /usr/local/unravel/install_bin/await_fixups.sh and wait for it to finish. It can take up to 15 minutes. If the shell session is terminated, the script can be run again at any time.

    There should be 47 tables. Check the table count manually with 'show tables;'. If fewer tables are present, run: sudo /usr/local/unravel/dbin/
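    To check the count from the step above, a small helper can count the rows of 'show tables;' output; in the usage example, the mysql client invocation, user, and database name are assumptions that may differ in your install:

    ```shell
    # Sketch: count tables reported by 'show tables;', skipping the header row.
    count_tables() {
      tail -n +2 | wc -l
    }

    # Example (connection details are assumptions):
    # mysql -u unravel -p -e 'show tables;' unravel | count_tables   # expect 47
    ```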


Upgrade to 4.2.6 or later before upgrading to 4.3.x
Upgrading 4.2.x to 4.3.x: one-time steps that may be required

When upgrading from 4.2.x to 4.3.x, be aware of the run-as user for Unravel server daemons. Unravel Server 4.2.x and earlier runs daemons as two different local users (usually 'unravel' and 'hdfs' or 'mapr'). In 4.3.x, Unravel Server simplifies this to run as the single user 'unravel' by default. During an upgrade from 4.2.x to 4.3.x, all daemons are converted over to user 'unravel'. Other changes that occur because of this:

/srv/unravel/log_hdfs/* logs are moved to /usr/local/unravel/logs

/srv/unravel/log_hdfs/ directory is removed

/srv/unravel/tmp_hdfs/ is no longer needed, so it is removed (/srv/unravel/tmp/ is used instead)

env vars HDFS_KEYTAB_PATH and HDFS_KERBEROS_PRINCIPAL in /usr/local/unravel/etc/ are no longer used

For Kerberos, com.unraveldata.kerberos.principal and com.unraveldata.kerberos.keytab.path are used instead.

After the first 4.3.x RPM upgrade, you MUST evaluate whether the script must be run in order for Unravel Server to load logs via HDFS and to access Kerberos/SPNEGO-protected REST endpoints. If MapReduce or Spark jobs stop loading after the upgrade, that is a strong indication that you need to switch users. The script need only be run once; see Run Unravel Daemons with Custom User for more details. Subsequent upgrades from 4.3.x to 4.3.y do not need to run the script.

Upgrade process

An RPM upgrade triggers a temporary background process so changes can take effect. You can monitor or wait for it with /usr/local/unravel/install_bin/ , which prints "Done" when it finishes. The script waits and prints the ps output of the background process. It might take 2-5 minutes, depending on your data size.

The background fixups will produce log file output in:






For further support, contact Unravel Support.