Integrating Confluent Cloud with MuleSoft using Kafka Connector with SASL/Plain Configuration – Mule 4
- February 01, 2024
In a real-world scenario, a system must serve both message publishers and receivers, enabling messages to be sent and received through various channels. This can be achieved with the Publisher-Subscriber model. Publish-Subscribe (Pub-Sub) is an architectural design pattern used in distributed systems to enable asynchronous communication between components or services.
In this article, we'll build a workflow that publishes data to a specific topic, along with a subscriber flow that listens for those published messages. Once a message is successfully consumed and processed, the subscriber acknowledges receipt.
What is the Pub-Sub model?
Pub/Sub is an event-driven architecture that offers a framework for message exchange between components responsible for creating and sending messages, known as Producers or Publishers, and components that receive and consume messages, known as Consumers or Subscribers. In this model, Publishers send messages to a designated Topic (which acts as a messaging channel), and Subscribers must subscribe to the relevant Topic to access and read the published messages.
What is Kafka?
Kafka is a distributed system renowned for high throughput and exceptional performance. This open-source platform combines a Pub-Sub messaging model with a robust message queue, making it adept at moving vast volumes of streaming data efficiently from source to destination.
Requirements
- Anypoint Studio
- Apache Kafka Connector (version 4.7.x)
- An active Confluent Cloud account
Confluent Cloud setup
Ensure you have an active Confluent Cloud account; for this demonstration, we'll use a free trial account from Confluent. After creating the account, create a Cluster within Confluent Cloud, then create the Topics your flows will use (a scripted alternative is sketched after the steps below):
- Select cluster type, appropriate zone and name the cluster.
- Create a topic.
- Name the topic and select the number of partitions.
- Create an API key, which is required to connect Mule to the Confluent server.
- Copy the API key credentials, or you can download them as a file.
- Retrieve the Cluster endpoint details from the cluster settings. (The previously downloaded file also contains bootstrap server details.)
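If you prefer scripting this setup to clicking through the Confluent Cloud UI, topics can also be created programmatically. The sketch below uses Kafka's Java AdminClient; the bootstrap server, API key, secret, and topic name are placeholders for the values from the steps above.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopicSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholders: substitute your cluster endpoint and API key credentials
        props.put("bootstrap.servers", "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");

        try (AdminClient admin = AdminClient.create(props)) {
            // Topic name and partition count; Confluent Cloud uses a replication factor of 3
            NewTopic topic = new NewTopic("demo-topic", 6, (short) 3);
            admin.createTopics(List.of(topic)).all().get(); // block until the topic exists
        }
    }
}
```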
Mule Application setup
Now, create the project in Anypoint Studio to test the connectivity with Confluent Cloud using the credentials we generated.
- Create a simple Mule flow that will publish messages on a topic.
- Create a subscriber flow that will consume events from the topic; if a message is processed successfully, a commit is sent to acknowledge it back to the listener.
Producer configuration
- Change the connection dropdown to Producer SASL/Plain Connection.
- Set the Bootstrap server.
- Provide the username and password of the API key created for the Kafka cluster.
- Under the Security tab, set the endpoint identification algorithm to HTTPS, and either provide a TLS configuration or mark the connection as insecure.
- Test the connection.
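Under the hood, the fields above map onto standard Kafka client properties. As a point of reference, here is a minimal sketch of the same SASL/Plain producer settings using the plain Apache Kafka Java client (kafka-clients) rather than the Mule connector; the endpoint and credentials are placeholders.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerConfigSketch {
    static KafkaProducer<String, String> buildProducer() {
        Properties props = new Properties();
        // Bootstrap server from the cluster settings (placeholder)
        props.put("bootstrap.servers", "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092");
        // SASL/PLAIN over TLS: the API key is the username, the API secret is the password
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        // Endpoint identification algorithm, as set under the Security tab
        props.put("ssl.endpoint.identification.algorithm", "https");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        return new KafkaProducer<>(props);
    }
}
```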
Consumer configuration
- Change the connection dropdown to Consumer SASL/Plain Connection.
- Set the Bootstrap server.
- Add the Group ID, if one was created (optional).
- Under the Topics section, add assignments: the topic to listen to and the partition number.
- Provide the username and password of the API key created for the Kafka cluster.
- Under the Security tab, set the endpoint identification algorithm to HTTPS and provide a TLS configuration.
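The consumer side maps the same way. Below is a minimal sketch with the plain Java client (again, not the Mule connector; the group ID, topic, and partition are placeholders), including the explicit topic/partition assignment that corresponds to the Assignments section above.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ConsumerConfigSketch {
    static KafkaConsumer<String, String> buildConsumer() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092");
        props.put("group.id", "mule-demo-group"); // optional consumer group (placeholder)
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("ssl.endpoint.identification.algorithm", "https");
        props.put("enable.auto.commit", "false"); // manual acknowledgment: commit explicitly
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        // Equivalent of the Assignments section: listen to one topic/partition pair
        consumer.assign(List.of(new TopicPartition("demo-topic", 0)));
        return consumer;
    }
}
```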
Publish Connector
- Select ‘Producer Configuration’ from the dropdown.
- Add the Topic Name on which you want to publish an event.
- Choose the partition number (optional field, default value = 1).
- Enter the Key (optional) and Message.
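For comparison, here is the same operation with the plain Java client, reusing the hypothetical ProducerConfigSketch from earlier; topic, partition, optional key, and message line up one-to-one with the connector fields.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class PublishSketch {
    public static void main(String[] args) throws Exception {
        try (KafkaProducer<String, String> producer = ProducerConfigSketch.buildProducer()) {
            // Topic, partition, optional key, and the message payload
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("demo-topic", 0, "order-123", "{\"status\":\"CREATED\"}");
            RecordMetadata meta = producer.send(record).get(); // wait for the broker's ack
            System.out.printf("Published to %s-%d at offset %d%n",
                    meta.topic(), meta.partition(), meta.offset());
        }
    }
}
```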
Message Listener
- Add Poll timeout if required.
- Choose the Acknowledgment mode:
- Manual: After successfully consuming data, users must commit the event explicitly through the Commit operation, passing the ‘Consumer commit key’ that is available in the message attributes after the listener receives the event.
- Auto: Kafka messages get committed automatically if the Mule flow finishes successfully.
- Immediate: Once messages are received by the Mule flow, Mule immediately commits the messages without waiting for the flow to complete.
- DUPS_OK: Commits automatically but asynchronously, which may lead to duplicate records.
- Specify the number of parallel consumers.
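To make the acknowledgment modes concrete, here is a minimal sketch of the Manual behavior with the plain Java client, reusing the hypothetical ConsumerConfigSketch above: offsets are committed only after the records have been processed, which is what the Mule Commit operation does with the consumer commit key.

```java
import java.time.Duration;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ListenerSketch {
    public static void main(String[] args) {
        try (KafkaConsumer<String, String> consumer = ConsumerConfigSketch.buildConsumer()) {
            while (true) {
                // Poll timeout, as on the Message Listener
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("Consumed %s-%d at offset %d: %s%n",
                            record.topic(), record.partition(), record.offset(), record.value());
                }
                if (!records.isEmpty()) {
                    consumer.commitSync(); // manual acknowledgment: commit only after processing
                }
            }
        }
    }
}
```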
Once all the setup is complete, your flows are now prepared to perform Publish-Subscribe (Pub/Sub) operations.
To initiate the publishing process, use any REST client to trigger the publish flow with the desired payload. The flow publishes the data to the configured topic, and as soon as the event reaches the topic, the Kafka Message Listener picks it up and processes it.
References
- https://docs.mulesoft.com/kafka-connector/4.7/kafka-connector-reference
- https://docs.mulesoft.com/kafka-connector/4.7/kafka-connector-upgrade-migrate
— By Nilesh Dhongade