Apache Kafka is a distributed streaming platform that enables you to build real-time streaming data pipelines and applications. Setting up Kafka can be complex, but Docker Compose simplifies the process by defining and running multi-container Docker applications. This guide provides a step-by-step approach to creating a Kafka topic using Docker Compose, making it accessible for developers and DevOps professionals alike.

Prerequisites

Before diving into the creation process, ensure you have the following prerequisites installed on your system:

  • Docker: Provides the ability to create, deploy, and run applications by using containers.
  • Docker Compose: A tool for defining and running multi-container Docker applications.

Step 1: Create a Docker Compose File

The first step involves creating a docker-compose.yml file. This file defines the Kafka and Zookeeper services necessary for your Kafka instance to run. Zookeeper is a centralized service for maintaining configuration information, naming, providing distributed synchronization, and providing group services.


version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
    networks:
      - kafka-net
  kafka:
    image: wurstmeister/kafka
    container_name: kafka
    ports:
      - "9093:9093"
    environment:
      KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9092,OUTSIDE://localhost:9093
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_LISTENERS: INSIDE://0.0.0.0:9092,OUTSIDE://0.0.0.0:9093
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_CREATE_TOPICS: "YourTopicName:1:1"
    networks:
      - kafka-net
networks:
  kafka-net:
    driver: bridge

Replace YourTopicName with the desired name for your Kafka topic. The KAFKA_CREATE_TOPICS environment variable format is TopicName:NumberOfPartitions:ReplicationFactor.
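The colon-separated format can be illustrated with a quick shell snippet; the topic name orders and its counts below are hypothetical example values, not part of the setup above:

```shell
# Split a KAFKA_CREATE_TOPICS entry into its three fields.
# "orders:3:1" is a hypothetical example value.
spec="orders:3:1"
IFS=':' read -r topic partitions replication <<< "$spec"
echo "topic=$topic partitions=$partitions replication=$replication"
# prints: topic=orders partitions=3 replication=1
```

The wurstmeister image also accepts several topics in one comma-separated list, e.g. "orders:3:1,payments:1:1".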

Step 2: Running Docker Compose

Navigate to the directory containing your docker-compose.yml file and run the following command in your terminal:

docker-compose up -d

This command will download the necessary Docker images for Kafka and Zookeeper, and then start the containers in detached mode.
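The broker can take a few seconds to become reachable after the containers start. One way to wait for it is a small readiness check; this is a sketch assuming bash (`wait_for_port` is a hypothetical helper, not part of Kafka or Docker):

```shell
# Hypothetical helper: poll until a TCP port accepts connections, or give up.
wait_for_port() {
  local host=$1 port=$2 tries=${3:-30} i
  for ((i = 0; i < tries; i++)); do
    # /dev/tcp is a bash feature: opening it attempts a TCP connection
    (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null && return 0
    sleep 1
  done
  return 1
}

# e.g. wait_for_port localhost 9093 && echo "Kafka is reachable"
```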

Step 3: Verifying the Topic Creation

To verify that your Kafka topic has been created, list the topics with the kafka-topics.sh command-line tool that ships with Kafka:

docker-compose exec kafka kafka-topics.sh --list --bootstrap-server localhost:9092

You should see YourTopicName listed among the topics.

Step 4: Producing and Consuming Messages

To further test your setup, you can produce and consume messages with the Kafka console producer and consumer scripts.

Producing Messages:

With the containers running, start a console producer from your project directory:

docker-compose exec kafka kafka-console-producer.sh --broker-list localhost:9092 --topic YourTopicName 

After executing the command, type messages into the console; each line is sent as a separate message when you press Enter. Press Ctrl+C (or Ctrl+D) to exit the producer.

Consuming Messages:

Open another terminal session, access the Kafka container again, and run:

docker-compose exec kafka kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic YourTopicName --from-beginning 

You should see the messages you produced earlier.

Step 5: Creating Additional Kafka Topics (Optional)

By default, the Kafka container creates the topics defined in the KAFKA_CREATE_TOPICS variable of your docker-compose.yml file, but you can also create new topics manually with the following command:

docker-compose exec kafka kafka-topics.sh --create --topic NewTopicName --partitions 1 --replication-factor 1 --bootstrap-server kafka:9092 

Replace “NewTopicName” with your new topic name. The command above creates a topic with a single partition and a single replica, connecting to the Kafka broker on port 9092.
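To see why the partition count matters: Kafka routes each keyed message to a partition by hashing the key modulo the number of partitions (Kafka itself uses murmur2; the cksum-based sketch below only illustrates the modulo idea and is not Kafka's actual algorithm):

```shell
partitions=3
for key in user-1 user-2 user-3; do
  # cksum stands in for Kafka's murmur2 hash in this illustration
  sum=$(printf '%s' "$key" | cksum | cut -d' ' -f1)
  echo "$key -> partition $(( sum % partitions ))"
done
```

The same key always lands on the same partition, which is what preserves per-key message ordering.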

Next, list the topics to verify your topic is created:

docker-compose exec kafka kafka-topics.sh --list --bootstrap-server localhost:9092

This will list all topics, including the one created above.

NOTE: Due to limitations in metric names, topics whose names differ only by a period (‘.’) versus an underscore (‘_’) can collide. To avoid issues, it is best to use one of the two characters in topic names, but not both.
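The collision arises because Kafka substitutes periods with underscores when constructing metric names. A shell sketch of the effect (page.views and page_views are hypothetical topic names):

```shell
t1="page.views"
t2="page_views"
m1="${t1//./_}"   # metric-name form of t1: '.' becomes '_'
m2="${t2//./_}"   # metric-name form of t2: already all underscores
[ "$m1" = "$m2" ] && echo "collision: both map to $m1"
# prints: collision: both map to page_views
```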

Conclusion

You’ve now successfully created a Kafka topic using Docker Compose and verified its functionality by producing and consuming messages. This setup not only simplifies the process of managing Kafka but also provides a scalable and easily reproducible environment for your streaming applications. Whether you’re developing locally or deploying in a production environment, Docker Compose with Kafka offers a powerful toolset to streamline your data streaming pipelines.
