Apache Kafka

Detailed documentation on the Apache Kafka pubsub component

Component format

To set up Apache Kafka pubsub, create a component of type pubsub.kafka. See this guide on how to create and apply a pubsub configuration.

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: kafka-pubsub
  namespace: default
spec:
  type: pubsub.kafka
  version: v1
  metadata:
    # Kafka broker connection setting
    - name: brokers
      value: "dapr-kafka.myapp.svc.cluster.local:9092"
    - name: authRequired
      value: "true"
    - name: saslUsername
      value: "adminuser"
    - name: saslPassword
      value: "KeFg23!"
    - name: maxMessageBytes
      value: "1024"

Spec metadata fields

Field | Required | Details | Example
brokers | Y | Comma-separated list of Kafka brokers | "localhost:9092, dapr-kafka.myapp.svc.cluster.local:9092"
authRequired | N | Enable authentication on the Kafka broker. Defaults to "false" | "true", "false"
saslUsername | N | Username used for authentication. Only required if authRequired is set to "true" | "adminuser"
saslPassword | N | Password used for authentication. Can be a secretKeyRef to use a secret reference. Only required if authRequired is set to "true" | "KeFg23!"
maxMessageBytes | N | The maximum message size in bytes allowed for a single Kafka message. Defaults to 1024 | 2048
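Because saslPassword accepts a secret reference, the password can be pulled from a configured secret store instead of being inlined in the component. A sketch, assuming a secret named kafka-secrets with key saslPasswordValue in a secret store called local-secret-store (both names are hypothetical):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: kafka-pubsub
  namespace: default
spec:
  type: pubsub.kafka
  version: v1
  metadata:
    - name: brokers
      value: "dapr-kafka.myapp.svc.cluster.local:9092"
    - name: authRequired
      value: "true"
    - name: saslUsername
      value: "adminuser"
    - name: saslPassword
      # Reference a key in a secret rather than a plaintext value
      secretKeyRef:
        name: kafka-secrets
        key: saslPasswordValue
auth:
  secretStore: local-secret-store
```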

Per-call metadata fields

Partition Key

When invoking the Kafka pub/sub, it's possible to provide an optional partition key by using the metadata query parameter in the request URL.

The parameter name is partitionKey.

Example:

curl -X POST http://localhost:3500/v1.0/publish/myKafka/myTopic?metadata.partitionKey=key1 \
  -H "Content-Type: application/json" \
  -d '{
        "data": {
          "message": "Hi"
        }
      }'
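The same request can be issued from application code. A minimal Python sketch using only the standard library, mirroring the curl call above (the pubsub name myKafka, topic myTopic, and default sidecar port 3500 are taken from that example):

```python
import json
import urllib.parse
import urllib.request

def build_publish_request(pubsub: str, topic: str, partition_key: str,
                          payload: dict, port: int = 3500) -> urllib.request.Request:
    """Build a POST request to the Dapr publish endpoint with a partition key."""
    # The partition key is passed via the metadata.partitionKey query parameter.
    query = urllib.parse.urlencode({"metadata.partitionKey": partition_key})
    url = f"http://localhost:{port}/v1.0/publish/{pubsub}/{topic}?{query}"
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_publish_request("myKafka", "myTopic", "key1",
                            {"data": {"message": "Hi"}})
# urllib.request.urlopen(req) would send the request once a Dapr sidecar
# is running on the given port.
```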

Create a Kafka instance


You can run Kafka locally using this Docker image. To run without Docker, see the getting started guide here.


To run Kafka on Kubernetes, you can use any Kafka operator, such as Strimzi.
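With Strimzi installed, a cluster is created by applying a Kafka custom resource. A minimal sketch (the cluster name dapr-kafka and replica counts are illustrative; ephemeral storage is for testing only):

```yaml
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: dapr-kafka
spec:
  kafka:
    replicas: 3
    listeners:
      # Plain internal listener on the port used in the component example
      - name: plain
        port: 9092
        type: internal
        tls: false
    storage:
      type: ephemeral
  zookeeper:
    replicas: 3
    storage:
      type: ephemeral
```

Once the cluster is ready, the brokers field of the component would point at the bootstrap service Strimzi creates for the cluster.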