As a developer of applications and services, you can connect Node.js applications to Kafka instances in OpenShift Streams for Apache Kafka. Node.js is a server-side JavaScript runtime designed for building scalable network applications. Its event-driven, non-blocking I/O model enables efficient applications.
In this quick start, you’ll use the Streams for Apache Kafka web console to collect connection information for a Kafka instance. Then you’ll manually configure a connection from an example Node.js application to the Kafka instance and start producing and consuming messages.
Note: When you’ve completed this quick start and understand the required connection configuration for a Kafka instance, you can use the OpenShift Application Services command-line interface (CLI) to generate this type of configuration in a more automated way. To learn more, see Connecting client applications to OpenShift Application Services using the rhoas CLI.
- You have a running Kafka instance in Streams for Apache Kafka (see Getting started with OpenShift Streams for Apache Kafka).
- You have a command-line terminal application.
- Git is installed.
- You have an IDE such as IntelliJ IDEA, Eclipse, or Visual Studio Code.
- Node.js 14 or later is installed.
Note: The example Node.js application in this quick start uses the KafkaJS client by default. If you want to use the node-rdkafka client, you must install some development tools locally on your computer, or use a container runtime such as Podman or Docker to run a specified container image and configure a development environment. To learn more, see the documentation for the example Node.js application.
For this quick start, you’ll use sample code from the Nodeshift Application Starters reactive-example repository in GitHub. After you understand the concepts and tasks in this quick start, you can use your own Node.js applications with OpenShift Streams for Apache Kafka in the same way.
- On the command line, clone the Nodeshift Application Starters reactive-example repository from GitHub.
$ git clone https://github.com/nodeshift-starters/reactive-example.git
- In your IDE, open the reactive-example directory of the repository that you cloned.
To enable your Node.js application to access a Kafka instance, you must configure a connection by specifying the following details:
- The bootstrap server endpoint for your Kafka instance
- The generated credentials for your OpenShift Streams for Apache Kafka service account
- The Simple Authentication and Security Layer (SASL) mechanism that the client will use to authenticate with the Kafka instance
In this task, you’ll create a new configuration file called rhoas.env. In the file, you’ll set the required bootstrap server and client credentials as environment variables.
- You have the bootstrap server endpoint for your Kafka instance. To get the server endpoint, select your Kafka instance in the Streams for Apache Kafka web console, select the options icon (three vertical dots), and click Connection.
- You have the generated credentials for your service account. To reset the credentials, use the Service Accounts page in the Application Services section of the Red Hat Hybrid Cloud Console.
- In your IDE, create a new file. Save the file with the name rhoas.env, at the root level of the reactive-example directory for the cloned repository.
- In the rhoas.env file, set the SASL authentication mechanism and the Kafka instance client credentials as shown in the following configuration. Replace the client ID and client secret values with your own credential information. The configuration uses SASL/OAUTHBEARER authentication, which is the recommended authentication mechanism to use in Streams for Apache Kafka.
Setting environment variables in the rhoas.env file
KAFKA_HOST=<bootstrap_server>
RHOAS_SERVICE_ACCOUNT_CLIENT_ID=<client_id>
RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET=<client_secret>
KAFKA_SASL_MECHANISM=oauthbearer
RHOAS_TOKEN_ENDPOINT_URL=https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token
- Save the rhoas.env file.
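The example application processes this file for you when it runs. If you’re connecting your own Node.js application instead, the following sketch shows one way these variables might map onto a KafkaJS client configuration. It is illustrative only, not code from the example repository: it assumes the dotenv and kafkajs packages are installed, and the fetchAccessToken helper is a hypothetical name that assumes Node.js 18 or later for the built-in fetch API.
Sketch of a KafkaJS client configured from the rhoas.env variables
require('dotenv').config({ path: 'rhoas.env' }); // load rhoas.env into process.env
const { Kafka } = require('kafkajs');

// Hypothetical helper: obtain an access token from the configured token
// endpoint using an OAuth 2.0 client credentials grant.
async function fetchAccessToken() {
  const response = await fetch(process.env.RHOAS_TOKEN_ENDPOINT_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      grant_type: 'client_credentials',
      client_id: process.env.RHOAS_SERVICE_ACCOUNT_CLIENT_ID,
      client_secret: process.env.RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET
    })
  });
  const { access_token } = await response.json();
  return access_token;
}

// SASL/OAUTHBEARER connection over TLS to the bootstrap server.
const kafka = new Kafka({
  brokers: [process.env.KAFKA_HOST],
  ssl: true,
  sasl: {
    mechanism: 'oauthbearer',
    oauthBearerProvider: async () => ({ value: await fetchAccessToken() })
  }
});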
The Node.js application in this quick start uses a Kafka topic called countries to produce and consume messages. In this task, you’ll create the countries topic in your Kafka instance.
- You have a running Kafka instance in OpenShift Streams for Apache Kafka.
- In the Streams for Apache Kafka web console, select Kafka Instances and then click the name of the Kafka instance that you want to add a topic to.
- Select the Topics tab.
- Click Create topic and follow the guided steps to define the topic details. You must specify the following topic properties:
  - Topic name: For this quick start, enter countries as the topic name.
  - Partitions: Set the number of partitions for the topic. For this quick start, set the value to 1.
  - Message retention: Set the message retention time and size. For this quick start, set the retention time to A week and the retention size to Unlimited.
  - Replicas: For this release of Streams for Apache Kafka, the replica values are preconfigured. The number of partition replicas for the topic is set to 3, and the minimum number of follower replicas that must be in sync with a partition leader is set to 2. For a trial Kafka instance, the number of replicas and the minimum in-sync replica factor are both set to 1.
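The web console steps are all that this quick start requires. If you’d rather script the topic creation, a minimal sketch using the KafkaJS admin client might look like the following. The sketch is illustrative, not part of the quick start: it reuses the kafka instance from the earlier configuration sketch, mirrors the console settings (retention.ms of one week, retention.bytes of -1 for unlimited), and leaves the replication settings to the service’s preconfigured defaults.
Creating the countries topic programmatically (illustrative sketch)
const admin = kafka.admin();

async function createCountriesTopic() {
  await admin.connect();
  await admin.createTopics({
    topics: [
      {
        topic: 'countries',
        numPartitions: 1, // one partition, as in the console steps
        configEntries: [
          { name: 'retention.ms', value: '604800000' }, // one week
          { name: 'retention.bytes', value: '-1' }      // unlimited size
        ]
      }
    ]
  });
  await admin.disconnect();
}

createCountriesTopic().catch(console.error);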
After you complete the setup, the new topic appears on the Topics page. You can now run the Node.js application to start producing and consuming messages.
- Verify that the countries topic is listed on the Topics page.
After you configure your Node.js application to connect to a Kafka instance, and you create the required Kafka topic, you’re ready to run the application.
In this task, you’ll run the following components of the Node.js application (a condensed sketch of their behavior follows this list):
- A producer-backend component that generates random country names and sends these names to the Kafka topic
- A consumer-backend component that consumes the country names from the Kafka topic
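As a rough mental model, the two components behave like the following condensed KafkaJS sketch. The sketch is illustrative rather than the repository’s actual code, and it assumes the kafka instance from the earlier configuration sketch.
Condensed sketch of the producer and consumer behavior
const producer = kafka.producer();
const consumer = kafka.consumer({ groupId: 'consumer-test' });

// Producer side: send one country name to the countries topic.
async function produceCountry(name) {
  await producer.connect();
  await producer.send({ topic: 'countries', messages: [{ value: name }] });
}

// Consumer side: log each country name received from the topic.
async function consumeCountries() {
  await consumer.connect();
  // KafkaJS 2.x form; KafkaJS 1.x uses { topic: 'countries' } instead.
  await consumer.subscribe({ topics: ['countries'], fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ message }) => console.log(message.value.toString())
  });
}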
- You’ve configured the Node.js example application to connect to a Kafka instance.
- You’ve created the countries topic.
- You’ve set permissions for your service account to produce and consume messages in the countries topic. For the Node.js application in this example, the consumer group you must specify in your permissions is called consumer-test. To learn how to configure access permissions for a Kafka instance, see Managing account access in OpenShift Streams for Apache Kafka.
- On the command line, navigate to the reactive-example directory of the repository that you cloned.
$ cd reactive-example
- Navigate to the directory for the consumer component. Use Node Package Manager (npm) to install the dependencies for this component.
Installing dependencies for the consumer component
$ cd consumer-backend
$ npm install
- Run the consumer component.
$ node consumer.js
You see the Node.js application run and connect to the Kafka instance. However, because you haven’t yet run the producer component, the consumer has no country names to display.
If the application fails to run, review the error log in the command-line window and address any problems. Also, review the steps in this quick start to ensure that the application and Kafka topic are configured correctly.
- Open a second command-line window or tab.
- On the second command line, navigate to the reactive-example directory of the repository that you cloned.
$ cd reactive-example
- Navigate to the directory for the producer component. Use Node Package Manager to install the dependencies for this component.
Installing dependencies for the producer component
$ cd producer-backend
$ npm install
- Run the producer component.
$ node producer.js
When the producer component runs, you see output like that shown in the following example:
Example output from the producer component
Ghana
Réunion
Guatemala
Luxembourg
Mayotte
Syria
United Kingdom
Bolivia
Haiti
As shown in the example, the producer component runs and generates messages that represent country names.
- Switch back to the first command-line window.
You now see that the consumer component displays the same country names generated by the producer, and in the same order, as shown in the following example:
Example output from the consumer component
Ghana
Réunion
Guatemala
Luxembourg
Mayotte
Syria
United Kingdom
Bolivia
Haiti
The output from both components confirms that they successfully connected to the Kafka instance. The components are using the Kafka topic that you created to produce and consume messages.
Note: You can also use the OpenShift Streams for Apache Kafka web console to browse messages in the Kafka topic. For more information, see Browsing messages in the OpenShift Streams for Apache Kafka web console.
- In your IDE, in the producer-backend directory of the repository that you cloned, open the producer.js file. Observe that the producer component is configured to process environment variables from the rhoas.env file that you created. The component uses the bootstrap server endpoint and client credentials stored in this file to connect to the Kafka instance.
- In the consumer-backend directory, open the consumer.js file. Observe that the consumer component is also configured to process environment variables from the rhoas.env file that you created.