
How to Deploy Kafka on Docker

Learn how to deploy Apache Kafka on Docker with this step-by-step guide. Simplify your Kafka setup with Docker containers.


What is Kafka?

Apache Kafka is a distributed event streaming platform. Its uses revolve around real-time data streaming, event-driven architectures, data integration, and stream processing, and its scalability, fault tolerance, and durability make it a go-to solution for modern data-driven applications. Whether you’re building real-time analytics, microservices, or IoT systems, Kafka provides a robust foundation for handling high-volume, high-velocity data streams.

Prerequisites

  • Docker
  • Docker Compose
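
You can quickly confirm both are available before starting (any reasonably recent version should work with this setup):

docker --version
docker-compose --version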

Steps to Deploy Kafka on Docker

Create a kafka folder and move into it

mkdir kafka && cd kafka

Create docker-compose.yaml

Create a docker-compose.yaml file that defines the whole stack. It includes:

  • Zookeeper (needed for Kafka)
  • Kafka (for event streaming)
  • Kafka UI (optional, for easy monitoring of Kafka topics)
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    container_name: zookeeper
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:latest
    container_name: kafka
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Two listeners: PLAINTEXT for clients inside the Docker network (kafka:29092),
      # PLAINTEXT_HOST for clients on the host machine (localhost:9092)
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:29092,PLAINTEXT_HOST://0.0.0.0:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  kafka-ui:
    image: provectuslabs/kafka-ui:latest
    container_name: kafka-ui
    depends_on:
      - kafka
    ports:
      - "8080:8080"
    environment:
      KAFKA_CLUSTERS_0_NAME: local
      KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: kafka:29092

Starting the Kafka Cluster

docker-compose up -d
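
To confirm everything came up, you can check the container status and tail the broker logs:

docker-compose ps
docker-compose logs -f kafka

Once the broker is ready, the Kafka UI from the compose file is reachable at http://localhost:8080.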

Testing Kafka

  1. Creating a Kafka Topic: Once the services are running, create a new topic named my-first-topic (verification commands follow after these steps):

    docker-compose exec kafka kafka-topics --create --topic my-first-topic --partitions 1 --replication-factor 1 --if-not-exists --bootstrap-server localhost:9092
    
  2. Producing Messages: Start a producer to send messages to my-first-topic:
    
    docker-compose exec kafka kafka-console-producer --topic my-first-topic --bootstrap-server localhost:9092
    

    Type your messages and press Enter to send each one.

  3. Consuming Messages: Start a consumer to read messages from my-first-topic:
    
    docker-compose exec kafka kafka-console-consumer --topic my-first-topic --from-beginning --bootstrap-server localhost:9092
    

    This command displays all messages from the beginning of the topic.
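
As a follow-up to step 1, you can also list the topics the broker knows about and inspect the new topic’s partition and replication details:

docker-compose exec kafka kafka-topics --list --bootstrap-server localhost:9092
docker-compose exec kafka kafka-topics --describe --topic my-first-topic --bootstrap-server localhost:9092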

Clean Up

When you’re done, you can stop and remove the Docker containers:

docker-compose down
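
If you just want to pause the stack without removing the containers, the standard stop/start commands work as well:

docker-compose stop
docker-compose start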

What’s Next?

  • Add a database for persistent data
This post is licensed under CC BY 4.0 by the author.