Apache Kafka has client libraries for multiple programming languages, so developers can build Kafka producers and consumers in their preferred language. This flexibility makes it straightforward to integrate Kafka into a wide range of application ecosystems. In this article, we walk through creating Kafka producers and consumers in different programming languages, with code samples, reference links, and resources to guide you through the implementation.

Creating Kafka Producers:

  1. Java:
    Java is the primary language for Kafka, and the Kafka Java API provides robust support for creating producers. Here’s a code snippet to create a Kafka producer in Java:
Java
import org.apache.kafka.clients.producer.*;

import java.util.Properties;

public class KafkaProducerExample {
    public static void main(String[] args) {
        // Connection details and serializers for record keys and values
        Properties properties = new Properties();
        properties.put("bootstrap.servers", "localhost:9092");
        properties.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        properties.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<>(properties);

        String topic = "my_topic";
        String message = "Hello, Kafka!";

        // Build the record, send it, and close the producer (close() flushes pending sends)
        ProducerRecord<String, String> record = new ProducerRecord<>(topic, message);
        producer.send(record);
        producer.close();
    }
}
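In the example above, send() is asynchronous: it hands the record to the producer's background sender thread and returns immediately, so a broker-side failure would go unnoticed. If you need confirmation of delivery, you can pass a callback to send(). The sketch below is a minimal illustration of that pattern using the same configuration as above; the class name, record key, and message text are placeholders chosen for this example.
Java
import org.apache.kafka.clients.producer.*;

import java.util.Properties;

public class KafkaProducerCallbackExample {
    public static void main(String[] args) {
        // Same configuration as the producer example above
        Properties properties = new Properties();
        properties.put("bootstrap.servers", "localhost:9092");
        properties.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        properties.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<>(properties);

        // A keyed record: records that share a key always go to the same partition
        ProducerRecord<String, String> record =
                new ProducerRecord<>("my_topic", "greeting", "Hello again, Kafka!");

        // The callback runs when the broker acknowledges the record or the send fails
        producer.send(record, (metadata, exception) -> {
            if (exception != null) {
                System.err.println("Send failed: " + exception.getMessage());
            } else {
                System.out.println("Delivered to partition " + metadata.partition()
                        + " at offset " + metadata.offset());
            }
        });

        producer.flush();
        producer.close();
    }
}
Because the callback receives the RecordMetadata, it can log the partition and offset the broker assigned to the record, and giving the record a key keeps all records with that key in the same partition, preserving their relative order.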
  2. Python:
    Python has become increasingly popular for Kafka development. The confluent-kafka-python library provides an efficient Kafka client for Python. Here’s an example of creating a Kafka producer in Python:
Python
from confluent_kafka import Producer

# Connect to the local broker
producer = Producer({'bootstrap.servers': 'localhost:9092'})

topic = "my_topic"
message = "Hello, Kafka!"

# produce() enqueues the message asynchronously; flush() blocks until delivery completes
producer.produce(topic, value=message)
producer.flush()

Creating Kafka Consumers:

  1. Java:
    In Java, creating Kafka consumers is straightforward using the Kafka Java API. Here’s an example of creating a Kafka consumer in Java:
Java
import org.apache.kafka.clients.consumer.*;

import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;

public class KafkaConsumerExample {
    public static void main(String[] args) {
        // Connection details, deserializers, and the consumer group id
        Properties properties = new Properties();
        properties.put("bootstrap.servers", "localhost:9092");
        properties.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("group.id", "my_consumer_group");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(properties);

        String topic = "my_topic";
        consumer.subscribe(Arrays.asList(topic));

        // Poll for new records in a loop and print each message value
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));

            for (ConsumerRecord<String, String> record : records) {
                System.out.println("Received message: " + record.value());
            }
        }
    }
}
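One detail the loop above leaves out is shutdown: the loop never exits, so consumer.close() is never called and the group has to wait for a session timeout before rebalancing. A common pattern is to call consumer.wakeup() from a shutdown hook, catch the resulting WakeupException in the polling thread, and close the consumer in a finally block. The sketch below illustrates that pattern with the same configuration as the example above; the class name is a placeholder.
Java
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.common.errors.WakeupException;

import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;

public class KafkaConsumerShutdownExample {
    public static void main(String[] args) {
        Properties properties = new Properties();
        properties.put("bootstrap.servers", "localhost:9092");
        properties.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("group.id", "my_consumer_group");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(properties);
        consumer.subscribe(Arrays.asList("my_topic"));

        // wakeup() makes a blocked poll() throw WakeupException, letting the loop exit cleanly
        final Thread mainThread = Thread.currentThread();
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            consumer.wakeup();
            try {
                mainThread.join();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }));

        try {
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("Received message: " + record.value());
                }
            }
        } catch (WakeupException e) {
            // Expected during shutdown; nothing else to do
        } finally {
            consumer.close();
        }
    }
}
wakeup() is the only KafkaConsumer method that is safe to call from another thread, which is why the shutdown hook uses it instead of calling close() directly.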
  2. Python:
    Python also provides support for Kafka consumers using the confluent-kafka-python library. Here’s an example of creating a Kafka consumer in Python:
Python
from confluent_kafka import Consumer, KafkaError

# Connection details, consumer group id, and where to start when there is no committed offset
consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'my_consumer_group',
    'auto.offset.reset': 'earliest'
})

topic = "my_topic"
consumer.subscribe([topic])

while True:
    # poll() returns None if no message arrives within the timeout (in seconds)
    message = consumer.poll(1.0)

    if message is None:
        continue

    if message.error():
        # Reaching the end of a partition is not a fatal error; keep polling
        if message.error().code() == KafkaError._PARTITION_EOF:
            continue
        else:
            print("Error: {}".format(message.error().str()))
            break

    print("Received message: {}".format(message.value().decode('utf-8')))

consumer.close()

Reference Link: Apache Kafka Documentation – https://kafka.apache.org/documentation/

Helpful Video: “Apache Kafka Producer and Consumer using Java” by Simplilearn – https://www.youtube.com/watch?v=Rt1pT6e-aKQ

Conclusion:

Creating Kafka producers and consumers in different programming languages allows developers to integrate Kafka seamlessly into their application ecosystems. Java remains the primary language for Kafka development, offering robust support through the Kafka Java API. Python has also gained popularity, providing efficient Kafka clients such as confluent-kafka-python.

In this article, we provided code samples for creating Kafka producers and consumers in both Java and Python. The provided reference link to the official Kafka documentation and the suggested video resource offer further guidance for implementation. By leveraging the power of Kafka producers and consumers, developers can build scalable and real-time data streaming applications, unlocking the full potential of Apache Kafka.