Introduction

This document provides a detailed guide on setting up the Kafka target connector for the DataBrew project. The Kafka connector facilitates efficient and reliable streaming of processed data into your Kafka topics, ensuring that your data pipeline’s output is seamlessly integrated into your event streaming platform.

The Kafka connector is currently available as a destination (target) connector only. This means you can write data to Kafka from DataBrew, but you cannot stream data from Kafka into DataBrew.

Requirements

Before setting up the Kafka target connector, ensure you meet the following requirements:

  • Access to your Kafka cluster.
  • Necessary permissions to produce messages to the Kafka topics (a quick way to verify this is sketched after this list).
  • Understanding of your Kafka cluster’s configuration, including topic setup and partitioning strategy.
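
If you want to confirm broker access and produce permissions up front, a minimal sanity check like the one below can help. It assumes the confluent-kafka Python client; the broker addresses, credentials, and topic name are placeholders for your own cluster details, not DataBrew-specific values.

  # Minimal produce-permission check (placeholder brokers, credentials, and topic).
  from confluent_kafka import Producer

  conf = {
      "bootstrap.servers": "broker1:9092,broker2:9092",  # placeholder brokers
      "security.protocol": "SASL_SSL",                    # adjust to your cluster's auth setup
      "sasl.mechanisms": "PLAIN",
      "sasl.username": "<username>",
      "sasl.password": "<password>",
  }
  producer = Producer(conf)

  def on_delivery(err, msg):
      # err is set when the broker rejects the message, e.g. due to missing ACLs
      if err is not None:
          print(f"Delivery failed: {err}")
      else:
          print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

  producer.produce("test-topic", value=b"connectivity check", callback=on_delivery)
  producer.flush(10)  # wait up to 10 seconds for delivery reports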

Preparing Your Kafka Cluster

To ensure smooth integration:

  • Topic Configuration: Ensure that the topics you intend to produce messages to are properly configured in your Kafka cluster.
  • Access Control: Verify that the DataBrew project’s user account or service has the required permissions to produce messages to those topics and, if needed, to create them. A short topic-creation sketch follows this list.
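
If the target topic does not exist yet, it can be created ahead of time. The sketch below uses the confluent-kafka AdminClient; the broker address, topic name, partition count, and replication factor are illustrative assumptions to be replaced with values that match your cluster's partitioning strategy and durability requirements.

  # Pre-create the target topic (placeholder broker, topic name, and sizing).
  from confluent_kafka.admin import AdminClient, NewTopic

  admin = AdminClient({"bootstrap.servers": "broker1:9092"})  # placeholder broker

  # Example sizing only: 3 partitions, replication factor 3.
  new_topic = NewTopic("databrew-output", num_partitions=3, replication_factor=3)

  for topic, future in admin.create_topics([new_topic]).items():
      try:
          future.result()  # raises if creation failed, e.g. insufficient ACLs
          print(f"Topic {topic} created")
      except Exception as exc:
          print(f"Failed to create topic {topic}: {exc}")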

Cloud Setup

This section guides you through setting up the Kafka target connector in the cloud for the DataBrew project.

Setting Up in the Cloud

  1. Access DataBrew Cloud Platform: Navigate to DataBrew Cloud App.
  2. Create a New Target Connector Instance: Follow these steps…
    • Step 1: Choose ‘Kafka’ from the list of available target connectors.
    • Step 2: Provide the necessary connection details, including your Kafka brokers, authentication details, and target topic.
    • Step 3: Configure additional settings like partitioning strategy, key serialization, and value serialization as required (an illustrative producer sketch follows these steps).
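
To illustrate what the Step 3 settings correspond to on the Kafka side, the sketch below produces a JSON-encoded record with a UTF-8 string key. The broker address, topic, and record fields are placeholders, not DataBrew configuration fields; with a key present and no explicit partition, Kafka's default partitioner hashes the key, so records with the same key land on the same partition.

  # Keyed, JSON-serialized produce (placeholder broker, topic, and record).
  import json
  from confluent_kafka import Producer

  producer = Producer({"bootstrap.servers": "broker1:9092"})  # placeholder broker

  record = {"order_id": 42, "status": "processed"}
  producer.produce(
      "databrew-output",                         # target topic from Step 2
      key="order-42".encode("utf-8"),            # key serialization: UTF-8 string
      value=json.dumps(record).encode("utf-8"),  # value serialization: JSON
  )
  producer.flush(10)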
