Kafka Streams Quick Start for Confluent Cloud¶
Confluent for VS Code provides project scaffolding for many different Apache Kafka® clients, including Kafka Streams. The generated project has everything you need to compile and run a simple Kafka Streams application that you can extend with your code.
This guide shows you how to build a Kafka Streams application that connects to a Kafka cluster. You’ll learn how to:
- Create a Kafka Streams project using Confluent for VS Code
- Process streaming data with Kafka Streams operations
- Run your application in a Docker container
Confluent for VS Code generates a project for a Kafka Streams application that consumes messages from an input topic and produces messages to an output topic by using the following code.
```java
builder.stream(INPUT_TOPIC, Consumed.with(stringSerde, stringSerde))
    .peek((k, v) -> LOG.info("Received raw event: {}", v))
    .mapValues(value -> generateEnrichedEvent())
    .peek((k, v) -> LOG.info("Generated enriched event: {}", v))
    .to(OUTPUT_TOPIC, Produced.with(stringSerde, stringSerde));
```
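For context, the generated project wires a topology like the one above into a running `KafkaStreams` instance. The following is a hedged sketch of that wiring, not the exact generated code: the class name, the `toUpperCase` stand-in for `generateEnrichedEvent()`, and the inline configuration are assumptions (the real project loads `bootstrap.servers` and security settings from `config.properties` and `.env`).

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class KafkaStreamsSketch {
    static final String INPUT_TOPIC = "input_topic";
    static final String OUTPUT_TOPIC = "output_topic";

    public static void main(String[] args) {
        Properties props = new Properties();
        // application.id names this Streams app and its underlying consumer group.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams-simple-example");
        // In the generated project this value comes from config.properties,
        // along with the SASL settings required by Confluent Cloud.
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG,
                  "pkc-abc123.<cloud-provider-region>.<cloud-provider>.confluent.cloud:9092");

        Serde<String> stringSerde = Serdes.String();
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream(INPUT_TOPIC, Consumed.with(stringSerde, stringSerde))
               .mapValues(value -> value.toUpperCase()) // stand-in for generateEnrichedEvent()
               .to(OUTPUT_TOPIC, Produced.with(stringSerde, stringSerde));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        // Close the topology cleanly on Ctrl+C.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}
```

The important structural point is the split: `StreamsBuilder` describes *what* to compute, and `KafkaStreams` owns the threads that actually run it against the cluster.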
Tip
In this guide, you run shell commands for tasks like starting Docker containers, and you can run these commands in VS Code’s Terminal pane. For more information, see Terminal Basics.
Prerequisites¶
- Confluent for VS Code: Follow the steps in Installation.
- Docker installed and running in your development environment.
- A Kafka cluster running in Confluent Cloud.
  - Kafka bootstrap server `host:port`, for example, `pkc-abc123.<cloud-provider-region>.<cloud-provider>.confluent.cloud:9092`, which you can get from the Cluster Settings page in Cloud Console. For more information, see How do I view cluster details with Cloud Console?.
  - Kafka cluster API key and secret, which you can get from the Cluster Overview > API Keys page in Cloud Console.
Step 1: Create the Kafka Streams project¶
Create the Kafka Streams project by using the Kafka Streams Application template and filling in a form with the required parameters.
Open the template in VS Code directly¶
To go directly to the Kafka Streams Application template in VS Code, click this button:
Open template in VS Code

The Kafka Streams Application form opens.
Skip the manual steps and proceed to Step 2: Fill in the template form.
Open the template in VS Code manually¶
Follow these steps to open the Kafka Streams Application template manually.
Open VS Code.
In the Activity Bar, click the Confluent icon. If you have many extensions installed, you may need to click … to access Additional Views and select Confluent from the context menu.
In the extension’s Side Bar, locate the Support section and click Generate Project from Template.
The palette opens and shows a list of available project templates.
Click Kafka Streams Application.
The Kafka Streams Application template opens.
Step 2: Fill in the template form¶
The project needs a few parameters to connect with your Kafka cluster.
- In the Kafka Streams Application form, provide the following values.
  - Kafka Bootstrap Server: Enter the `host:port` string from the Cluster Settings page in Cloud Console.
  - Kafka Cluster API Key: Enter the Kafka cluster API key.
  - Kafka Cluster API Secret: Enter the Kafka cluster API secret.
  - Input Topic: The name of a topic that the Kafka Streams application consumes messages from. Enter input_topic. You create this topic in a later step.
  - Output Topic: The name of a topic that the Kafka Streams application produces messages to. Enter output_topic. You create this topic in a later step.
- Click Generate & Save, and in the save dialog, navigate to the directory in your development environment where you want to save the project files and click Save to directory.
Confluent for VS Code generates the project files.
- The Kafka Streams code is saved in the `src/main/java/examples` directory, in a file named `KafkaStreamsApplication.java`.
- A `docker-compose.yml` file declares how to build the Kafka Streams code.
- Configuration settings, like `bootstrap.servers`, are saved in a file named `config.properties`.
- Secrets, like the Kafka cluster API key, are saved in a file named `.env`.
- A `README.md` file has instructions for compiling and running the project.
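As an illustration of how those files divide up, a `config.properties` for a Confluent Cloud cluster typically looks roughly like the following. This is a hedged sketch, not the generated file verbatim: the exact keys and how the `.env` secrets are substituted in are assumptions, so check the files in your own project.

```properties
# config.properties -- non-secret client settings
bootstrap.servers=pkc-abc123.<cloud-provider-region>.<cloud-provider>.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
```

The API key and secret themselves stay in `.env`, which keeps credentials out of version control while `docker compose` injects them into the container as environment variables.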
Step 3: Connect to Confluent Cloud¶
In the extension’s Side Bar, click Sign in to Confluent Cloud.
In the dialog that appears, click Allow.
A browser window opens to the Confluent Cloud login page.
Enter your Confluent Cloud credentials, and click Log in.
When you’re authenticated, you’re redirected back to VS Code, and your Confluent Cloud resources are displayed in the extension’s Side Bar.
Step 4: Create topics¶
Confluent for VS Code enables creating Kafka topics easily within VS Code.
In the extension’s Side Bar, open Confluent Cloud in the Resources section and click your Kafka cluster.
The Topics section refreshes, and the cluster’s topics are listed.
In the Topics section, click + to create a new topic.
The palette opens with a text box for entering the topic name.
In the palette, enter input_topic. Press ENTER to confirm the default settings for the partition count and replication factor properties.
The new topic appears in the Topics section.
Repeat the previous steps for another new topic named output_topic.
Step 5: Compile and run the project¶
Your Kafka Streams project is ready to build and run in a Docker container.
In your terminal, navigate to the directory where you saved the project.
The Confluent for VS Code extension saves the project files in a subdirectory named `kafka-streams-simple-example`. Run the following command to navigate to this directory.

cd kafka-streams-simple-example
Run the following command to build and run the Kafka Streams application.
docker compose up --build
Docker downloads the required images and starts a container that compiles the project.
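If you want to see what drives this step, the generated `docker-compose.yml` is typically along these lines. This is a hedged sketch under stated assumptions: the service name and the exact options are guesses, so treat the generated file as authoritative.

```yaml
services:
  kafka-streams-app:
    build: .             # compile the project using the Dockerfile in this directory
    env_file: .env       # inject the Kafka API key and secret as environment variables
    restart: on-failure  # restart the application if it exits with an error
```

Because the build happens inside the container, you don't need a local JDK or Maven installation; Docker is the only toolchain requirement.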
Step 6: Produce messages to the input topic¶
Confluent for VS Code enables producing messages to Kafka topics from within VS Code.
In this step, you create a file that has an example message that you send to the input topic.
Copy the following example message into a file named `test-message.json` and save the file.

```json
{
  "headers": [
    { "key": "task.generation", "value": "350" },
    { "key": "task.id", "value": "0" },
    { "key": "current.iteration", "value": "39067914" }
  ],
  "key": 39067914,
  "value": {
    "id": "123e4567-e89b-12d3-a456-426614174000",
    "timestamp": 1638360000000,
    "customer": {
      "name": "John Smith",
      "email": "john.smith@example.com",
      "address": "123 Main St, Suite 456, Anytown, ST 12345",
      "phone": "(555) 123-4567"
    },
    "order": {
      "orderId": "AB123456",
      "product": "Ergonomic Steel Keyboard",
      "price": "199.99",
      "department": "Electronics",
      "quantity": 2
    }
  }
}
```
In the extension’s Side Bar, hover over input_topic and click Send Message(s) to Topic.
The palette opens with a textbox for entering the path to the message file.
In the palette, navigate to the `test-message.json` file, click Select a File, and click OK.

A notification reports that you have successfully produced a message to input_topic.
In the Side Bar, hover over input_topic and click View Messages.
The message viewer opens and shows the message you sent in the previous step.
In the Side Bar, hover over output_topic and click View Messages.
The message viewer opens and shows the message that was produced by the Kafka Streams application in response to the message sent to input_topic.
Step 7: Clean up¶
To clean up your development environment, stop the Docker container that’s running the Kafka Streams application.
In the terminal, run the following command to stop the Kafka Streams application.
docker compose down
Your output should resemble:
✔ Container kafka-streams-app Removed
Note
This website includes content developed at the Apache Software Foundation under the terms of the Apache License v2.