Kafka security

You can improve the security of a Kafka cluster by having the brokers authenticate connections from clients using either SSL or SASL.

SSL + Kerberos for Kafka clients

Prerequisite: Kafka brokers are configured with SSL and Kerberos. Refer to your Hadoop provider's documentation for configuring SSL and Kerberos for Kafka brokers.

SSL + Kerberos is supported by the new Kafka consumers and producers, and the configuration is the same for both. Replace the placeholder values below with values specific to your environment.

For a single Kafka cluster
  1. Create a file named consumerConfig.properties and add the following properties, using the values from your SSL + Kerberos configuration.

    security.protocol=SASL_SSL
    sasl.mechanism=GSSAPI
    sasl.kerberos.service.name=kafka
    ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1
    ssl.truststore.location=/usr/java/latest/jre/lib/security/jssecacerts
    ssl.truststore.password=changeit
    ssl.keystore.location=/opt/cloudera/security/pki/keystore.jks
    ssl.keystore.password=changeit
    sasl.jaas.config=\
    com.sun.security.auth.module.Krb5LoginModule required \
    useKeyTab=true \
    useTicketCache=true \
    keyTab="/opt/unravel/kafka.keytab" \
    principal="kafka/kafkavm.unraveldata.com@UNRAVELDATA.COM";

    Note

    If only Kerberos is enabled (without SSL), set security.protocol to SASL_PLAINTEXT and remove the ssl.* properties; a Kerberos-only sketch follows this procedure.

  2. Copy/move consumerConfig.properties to /usr/local/unravel/etc.

  3. Edit /usr/local/unravel/unravel.properties. Search for com.unraveldata.ext.kafka.clusters.

    com.unraveldata.ext.kafka.clusters=ClusterName

    Add the following property using the ClusterName from above.

    com.unraveldata.ext.kafka.ClusterName.consumer.config=/usr/local/unravel/etc/consumerConfig.properties
  4. Restart the Kafka monitor daemon unravel_km.

    service unravel_km restart
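
For reference, this is a minimal sketch of consumerConfig.properties for a Kerberos-only cluster (no SSL), as described in the note above. The keytab path and principal are placeholders; use the values for your environment.

    security.protocol=SASL_PLAINTEXT
    sasl.mechanism=GSSAPI
    sasl.kerberos.service.name=kafka
    sasl.jaas.config=\
    com.sun.security.auth.module.Krb5LoginModule required \
    useKeyTab=true \
    useTicketCache=true \
    keyTab="/opt/unravel/kafka.keytab" \
    principal="kafka/kafkavm.unraveldata.com@UNRAVELDATA.COM";
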
For multiple Kafka clusters

Warning

Each cluster must have a separate consumerConfig.properties file.

  1. Open /usr/local/unravel/unravel.properties. Search for com.unraveldata.ext.kafka.clusters.

    The property should be defined as a comma-separated list. If there is only one cluster name, see the single-cluster instructions above.

    com.unraveldata.ext.kafka.clusters=ClusterName1,ClusterName2,ClusterName3
  2. Create a file named consumerConfigClusterName.properties for each cluster and add the following properties, using the values for that cluster.

    security.protocol=SASL_SSL
    sasl.mechanism=GSSAPI
    sasl.kerberos.service.name=kafka
    ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1
    ssl.truststore.location=/usr/java/latest/jre/lib/security/jssecacerts
    ssl.truststore.password=changeit
    ssl.keystore.location=/opt/cloudera/security/pki/keystore.jks
    ssl.keystore.password=changeit
    sasl.jaas.config=\
    com.sun.security.auth.module.Krb5LoginModule required \
    useKeyTab=true \
    useTicketCache=true \
    keyTab="/opt/unravel/kafka.keytab" \
    principal="kafka/kafkavm.unraveldata.com@UNRAVELDATA.COM";

    Note

    If only Kerberos is enabled (without SSL), set security.protocol to SASL_PLAINTEXT and remove the ssl.* properties (see the Kerberos-only sketch in the previous section).

  3. Copy/move each file to /usr/local/unravel/etc.

  4. Edit /usr/local/unravel/unravel.properties. For each cluster, add the following property, as shown in the multi-cluster sketch after this procedure.

    com.unraveldata.ext.kafka.ClusterName.consumer.config=/usr/local/unravel/etc/consumerConfigClusterName.properties
  5. Restart the Kafka monitor daemon unravel_km.

    service unravel_km restart
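
For example, with the three cluster names above, the corresponding entries in unravel.properties would look like the following sketch; the cluster names and file names are placeholders for the names defined in your environment.

    com.unraveldata.ext.kafka.clusters=ClusterName1,ClusterName2,ClusterName3
    com.unraveldata.ext.kafka.ClusterName1.consumer.config=/usr/local/unravel/etc/consumerConfigClusterName1.properties
    com.unraveldata.ext.kafka.ClusterName2.consumer.config=/usr/local/unravel/etc/consumerConfigClusterName2.properties
    com.unraveldata.ext.kafka.ClusterName3.consumer.config=/usr/local/unravel/etc/consumerConfigClusterName3.properties
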
Kafka authorizations

Unravel consumes messages from the topic __consumer_offsets using the consumer group UnravelOffsetConsumer.
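
Once the privileges described below are in place, you can optionally confirm that the group is visible and authorized using the standard Kafka tooling. This is only a sketch: it assumes the Kafka command-line tools are installed on the Unravel host, and broker-host:9093 is a placeholder for one of your brokers (the script name and location vary by distribution).

    # Describe Unravel's offset consumer group, reusing the same security
    # settings that Unravel uses (the consumerConfig.properties created earlier).
    kafka-consumer-groups.sh --bootstrap-server broker-host:9093 \
      --command-config /usr/local/unravel/etc/consumerConfig.properties \
      --describe --group UnravelOffsetConsumer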

Sentry authorization

The following privileges must be granted using Sentry:

HOST=*->CONSUMERGROUP=UnravelOffsetConsumer->action=read
HOST=*->CONSUMERGROUP=UnravelOffsetConsumer->action=write
HOST=*->CONSUMERGROUP=UnravelOffsetConsumer->action=describe
HOST=*->TOPIC=__consumer_offsets->action=read
HOST=*->TOPIC=__consumer_offsets->action=write
HOST=*->TOPIC=__consumer_offsets->action=describe
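
These privileges are normally granted to a Sentry role mapped to the group that Unravel runs as. Purely as a sketch, assuming the kafka-sentry CLI from the Cloudera distribution and a hypothetical role unravel_role mapped to a hypothetical group unravel, the grants could look like this:

    # Hypothetical role and group names; adjust to your environment.
    kafka-sentry -cr -r unravel_role
    kafka-sentry -arg -r unravel_role -g unravel
    # Grant the consumer group and topic privileges listed above.
    kafka-sentry -gpr -r unravel_role -p "HOST=*->CONSUMERGROUP=UnravelOffsetConsumer->action=read"
    kafka-sentry -gpr -r unravel_role -p "HOST=*->CONSUMERGROUP=UnravelOffsetConsumer->action=write"
    kafka-sentry -gpr -r unravel_role -p "HOST=*->CONSUMERGROUP=UnravelOffsetConsumer->action=describe"
    kafka-sentry -gpr -r unravel_role -p "HOST=*->TOPIC=__consumer_offsets->action=read"
    kafka-sentry -gpr -r unravel_role -p "HOST=*->TOPIC=__consumer_offsets->action=write"
    kafka-sentry -gpr -r unravel_role -p "HOST=*->TOPIC=__consumer_offsets->action=describe"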

For further details, see Using Kafka with Sentry Authorization in the Cloudera Distribution of Apache Kafka documentation.

Kafka with Ranger authorization

The following privileges must be granted using Ranger for the topic __consumer_offsets:

Publish
Consume
Describe

For further details, see Security - Create a Kafka Policy in the HDP Security Guide.
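
The policy is typically created in the Ranger Admin UI as described in the guide above. Purely as a sketch, an equivalent policy could also be created through Ranger's public REST API; the service name kafka_service, the user unravel, the admin credentials, and the exact JSON fields are assumptions and may differ by Ranger version.

    # Sketch only: create a Ranger policy for __consumer_offsets via the REST API.
    curl -u admin:admin -H "Content-Type: application/json" \
      -X POST http://ranger-host:6080/service/public/v2/api/policy \
      -d '{
            "service": "kafka_service",
            "name": "unravel_consumer_offsets",
            "resources": { "topic": { "values": ["__consumer_offsets"] } },
            "policyItems": [ {
              "users": ["unravel"],
              "accesses": [ { "type": "publish",  "isAllowed": true },
                            { "type": "consume",  "isAllowed": true },
                            { "type": "describe", "isAllowed": true } ]
            } ]
          }'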

References

For further information, see Chapter 7, Security, in the Apache Kafka documentation.