Console Producer and Consumer Basics, no (de)serializers

Question:

What is the simplest way to write messages to and read messages from Kafka?

Example use case:

You're excited to get started with Kafka, and you'd like to produce and consume some basic messages quickly. In this tutorial, we'll show you how to produce and consume messages from the command line, with no code!

Short Answer

With Confluent Cloud, you can use the Confluent Cloud CLI to produce and consume messages.

Producer:

ccloud kafka topic produce order-detail --value-format avro --schema order-detail-schema.json

Consumer:

ccloud kafka topic consume order-detail --value-format avro

Try it

1. Initialize the project

To get started, make a new directory anywhere you’d like for this project:

mkdir ccloud-produce-consume && cd ccloud-produce-consume

2. Sign up for Confluent Cloud and provision resources

Sign up for Confluent Cloud, a fully managed Apache Kafka service. Then provision your resources:

  1. After you log in to Confluent Cloud, click on Add cloud environment and name the environment learn-kafka. Using a new environment keeps your learning resources separate from your other Confluent Cloud resources.

  2. From the Billing & payment section in the Menu, apply the promo code CC100KTS to receive an additional $100 of free usage on Confluent Cloud.

  3. Click on LEARN and follow the instructions to launch a Kafka cluster and to enable Schema Registry. (If you prefer the command line, a CLI sketch of the equivalent steps follows this list.)
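
The following is an illustrative sketch of provisioning from the command line instead, assuming ccloud CLI v1; the cluster name, cloud provider, and region are example values, and the exact flags may differ by CLI version (check ccloud kafka cluster create --help). Schema Registry is still enabled from the UI as described above.

# Log in with your Confluent Cloud credentials
ccloud login

# Create the learn-kafka environment and make it active (its ID is printed on creation)
ccloud environment create learn-kafka
ccloud environment use <environment-id>

# Create a cluster; the name, cloud provider, and region here are examples
ccloud kafka cluster create learn-kafka-cluster --cloud aws --region us-west-2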

3. Download and set up the Confluent Cloud CLI

Instructions for installing the Confluent Cloud CLI and configuring it for your Confluent Cloud environment are available from within the Confluent Cloud UI. Navigate to your Kafka cluster, click on the CLI and tools section, and run through the steps in the CCloud CLI tab.
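
For reference, the configuration typically boils down to something like the following sketch (assuming ccloud CLI v1; the resource IDs and API key are placeholders, and the steps shown in the UI are authoritative):

# Authenticate to Confluent Cloud
ccloud login

# Point the CLI at the environment and cluster you created (IDs are placeholders)
ccloud environment use <environment-id>
ccloud kafka cluster use <cluster-id>

# Create an API key for the cluster and tell the CLI to use it
ccloud api-key create --resource <cluster-id>
ccloud api-key use <api-key> --resource <cluster-id>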

4. Create the Kafka topic

Create a Kafka topic called order-detail in Confluent Cloud.

ccloud kafka topic create order-detail

This should yield the following output:

Created topic "order-detail".
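
To confirm the topic was created, you can list the topics in your cluster or describe the new one (the exact output format may vary by CLI version):

ccloud kafka topic list
ccloud kafka topic describe order-detail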

5. Create a schema for your records

We are going to use the Confluent Cloud managed Schema Registry to control our record format. The first step is to create a schema definition that we will use when producing new records.

Create the following order-detail-schema.json file:

{
    "type": "record",
    "namespace": "io.confluent.tutorial",
    "name": "OrderDetail",
    "fields": [
        {"name": "number", "type": "long", "doc": "The order number."},
        {"name": "date", "type": {"type": "int", "logicalType": "date"}, "doc": "The date the order was submitted."},
        {"name": "shipping_address", "type": "string", "doc": "The shipping address."},
        {"name": "subtotal", "type": "double", "doc": "The amount without shipping cost and tax."},
        {"name": "shipping_cost", "type": "double", "doc": "The shipping cost."},
        {"name": "tax", "type": "double", "doc": "The applicable tax."},
        {"name": "grand_total", "type": "double", "doc": "The order grand total."}
    ]
}
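
Before moving on, it can be worth sanity-checking that the file is well-formed JSON. One quick way, assuming you have jq installed, is to pretty-print it; jq exits with an error if the JSON is malformed:

jq . order-detail-schema.json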

6. Start a console consumer

Next, let’s open up a consumer to read records from the new topic.

From the same terminal you used to create the topic above, run the following command to start a console consumer with the ccloud CLI:

ccloud kafka topic consume order-detail --value-format avro

You will be prompted for the Confluent Cloud Schema Registry credentials, as shown below. Enter the values you received when you enabled Schema Registry in the Confluent Cloud UI.

Enter your Schema Registry API key:
Enter your Schema Registry API secret:

The consumer will start up and block, waiting for records; you won’t see any output until after the next step.
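
Note that by default the consumer only reads records produced after it starts. If you later want to re-read everything already in the topic, the ccloud CLI has a from-beginning flag, used roughly as follows (verify the flag name with ccloud kafka topic consume --help):

ccloud kafka topic consume -b order-detail --value-format avro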

7. Produce events to the Kafka topic

Now we are going to produce records to our new topic using the schema we created a few steps back. Open a second terminal window and start the producer:

ccloud kafka topic produce order-detail --value-format avro --schema order-detail-schema.json

The producer will print some startup information and then wait for your input.

Successfully registered schema with ID 100001
Starting Kafka Producer. ^C or ^D to exit

Below are example records in JSON format, with each line representing a single record. We are producing records in Avro format; the producer reads each line as JSON and converts it to Avro according to the order-detail-schema.json schema before sending it to Kafka.

Copy each line and paste it into the producer terminal, pressing Enter after each one to produce a new record.

{"number":1,"date":18500,"shipping_address":"ABC Sesame Street,Wichita, KS. 12345","subtotal":110.00,"tax":10.00,"grand_total":120.00,"shipping_cost":0.00}
{"number":2,"date":18501,"shipping_address":"123 Cross Street,Irving, CA. 12345","subtotal":5.00,"tax":0.53,"grand_total":6.53,"shipping_cost":1.00}
{"number":3,"date":18502,"shipping_address":"5014  Pinnickinick Street, Portland, WA. 97205","subtotal":93.45,"tax":9.34,"grand_total":102.79,"shipping_cost":0.00}
{"number":4,"date":18503,"shipping_address":"4082 Elmwood Avenue, Tempe, AX. 85281","subtotal":50.00,"tax":1.00,"grand_total":51.00,"shipping_cost":0.00}
{"number":5,"date":18504,"shipping_address":"123 Cross Street,Irving, CA. 12345","subtotal":33.00,"tax":3.33,"grand_total":38.33,"shipping_cost":2.00}

As you produce records, you can observe them in the consumer terminal.
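
While the exact formatting can vary by CLI version, you should see each record echoed in the consumer terminal as a JSON object, along these lines:

{"number":1,"date":18500,"shipping_address":"ABC Sesame Street, Wichita, KS. 12345","subtotal":110.0,"tax":10.0,"grand_total":120.0,"shipping_cost":0.0}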

8. Tear down Confluent Cloud resources

You may want to try another Kafka tutorial next, but if you don’t plan on doing other tutorials, use the Confluent Cloud UI or CLI to destroy all the resources you created. Verify they are destroyed to avoid unexpected charges.
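
As a rough sketch, a CLI teardown could look like the following (assuming ccloud CLI v1; the cluster and environment IDs are placeholders you can find with the corresponding list commands):

# Delete the tutorial topic, then the cluster and environment created for this tutorial
ccloud kafka topic delete order-detail
ccloud kafka cluster delete <cluster-id>
ccloud environment delete <environment-id>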