Kafka Component Release Notes

This page lists the main features added to the Kafka component.

Feature Highlights

Version 2.0.2

Synchronous sending mode

Previously, the Kafka component sent messages asynchronously, without waiting for an acknowledgment from the Kafka broker (fire and forget).

This allowed better performance when sending large volumes of messages, but prevented the proper computation of message statuses and statistics.

A new option has been added to choose how messages are sent, making it possible to send messages synchronously as well.

  • Synchronous mode is now the default; it provides better management and properly tracks message statuses and statistics.

  • Asynchronous mode remains available if you need better performance and do not require statistics about the messages sent.
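Conceptually, the difference between the two modes can be sketched as follows. This is an illustrative Python simulation, not the component's actual implementation; `broker_ack` is a hypothetical stand-in for the Kafka broker's acknowledgment:

```python
def broker_ack(message):
    """Stand-in for the Kafka broker: returns True when the message is accepted."""
    return message is not None

def send_async(messages):
    """Fire and forget: hand every message off without waiting for an answer.
    Fast, but no per-message status or statistics are available."""
    for message in messages:
        broker_ack(message)  # result is ignored
    return None  # nothing to report

def send_sync(messages):
    """Synchronous: wait for the broker's answer after each message,
    so statuses and statistics can be computed."""
    stats = {"sent": 0, "failed": 0}
    for message in messages:
        if broker_ack(message):
            stats["sent"] += 1
        else:
            stats["failed"] += 1
    return stats
```

The trade-off is visible in the return values: the asynchronous path has nothing to report, while the synchronous path can account for every message.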

Defining the writing mode in Metadata

A new attribute is available in Metadata to choose the writing mode.

This allows defining a default value for the Mappings and Processes that use this Metadata.

Note that you can define a default value in Metadata and override it in Mappings easily, as described in the next section.

Defining the writing mode in Mapping

The writing mode can also be customized in Mappings, on the Kafka Integration Templates.

The value defined on the Template overrides the default value defined in Metadata.

If no value is defined, the Metadata default value is used.
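The override rule amounts to a simple precedence: the value set on the Template wins when defined, otherwise the Metadata default applies. A minimal sketch (the function and parameter names are illustrative, not the product's API):

```python
def effective_writing_mode(metadata_default, template_value=None):
    """Return the writing mode to use: the value set on the Template
    overrides the default defined in the Metadata."""
    return template_value if template_value is not None else metadata_default

# Only the Metadata default is set: it is used as-is.
effective_writing_mode("synchronous")
# The Template defines a value: it overrides the Metadata default.
effective_writing_mode("synchronous", "asynchronous")
```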

INTEGRATION Kafka To File Template

New Batch Size parameter

A new "Batch Size" parameter defines how many messages must be read from the topic before writing to the file.

When reading, messages are written to the file every "n" messages received, "n" being the batch size.

When you define a large timeout so that the Mapping reads data continuously, make sure to set this parameter to an appropriate value.

For example, set it to "1" if you want each message written to the file as soon as it is read.
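The batching behavior described above can be sketched as follows (an illustrative Python sketch, assuming messages arrive as a simple sequence and `write` flushes one batch to the file; the actual Template parameter only sets the batch size):

```python
def write_in_batches(messages, batch_size, write):
    """Buffer incoming messages and flush them via `write` every
    `batch_size` messages; any final partial batch is also flushed."""
    buffer = []
    for message in messages:
        buffer.append(message)
        if len(buffer) == batch_size:
            write(buffer)
            buffer = []
    if buffer:  # flush the remainder, e.g. when the read times out
        write(buffer)
```

With a batch size of 1, every message is flushed individually as soon as it is read; larger values trade per-message latency for fewer writes.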

Change Data Capture (CDC)

Multiple improvements have been performed to homogenize the usage of Change Data Capture (CDC) in the various Components.

Parameters have been homogenized, so that all Templates now share the same CDC parameters and the same feature support.

Multiple fixes have also been performed to correct CDC issues. Refer to the changelog for the exact list of changes.

Minor improvements and fixed issues

This version also contains some other minor improvements and fixed issues, which can be found in the complete changelog.

Version 2.0.1

Sample project

The component example project can now be imported directly in the "New" menu of the Project Explorer.

Change Log

Version 2023.1.16

Bug fixes

  • DI-9664: Updated Apache ZooKeeper third-party libraries.

Version 2023.1.13

Feature improvements

  • DI-9620: Added Semarchy Data Intelligence harvesting features.

Version 2023.1.9

Bug fixes

  • DI-8824: Updated Apache ZooKeeper third-party libraries.

Version 2023.1.8

Bug fixes

  • DI-8759: Updated Apache Avro third-party libraries.

  • DI-8830: Updated JSON third-party libraries.

  • DI-8861: Temporary tables are not cleaned up correctly at the end of a process.

Version 2023.1.5 (Component Pack)

Bug fixes

  • DI-8220: Third-party library upgrade.

  • DI-7989: Third-party library upgrade.

  • DI-7908: Third-party library upgrade.

Version 2023.1.2 (Component Pack)

Bug fixes

  • DI-6324: When using Kafka structured as source, the Load Kafka Structure to RDBMS template does not create the Load_Temporal table and an insert error occurs at the integration step.

Version 2023.1.0 (Component Pack)

Feature improvements

  • DI-4661: The Create indexes on Target Table parameter has been added to the INTEGRATION Kafka to Rdbms template.

  • DI-6300: Kafka 3.2 is now supported.

Bug fixes

  • DI-6298: Multiple third-party libraries upgrade.

  • DI-6316: SnakeYAML - Third-party library upgrade.

  • DI-6520: Jackson - Third-party library upgrade.

Version 5.3.8 (Component Pack)

Bug fixes

  • DI-6304: When using Kafka structured as source and an RDBMS table as stage target, the load template fails to load data into the stage as well as into the temporary load tables.

Version 3.0.0 (Component Pack)

Feature improvements

  • DI-3701: Allow Components to contribute to Designer monitored statistics

  • DI-4053: Query Editor menu renamed to "Launch Query Editor"

  • DI-4508: Update Components and Designer to take into account dedicated license permissions

  • DI-4813: Rebranding: Drivers classes and URLs

  • DI-4962: Improved component dependencies and requirements management

Version 2.0.2 (Kafka Component)

Feature improvements

  • DI-1298: INTEGRATION Kafka to File Template - Addition of the "Batch Size" parameter to define how many messages must be read from the topic before writing to the file

  • DI-1940: Support synchronous mode when sending messages (new attribute and parameter in Metadata and Templates are available to define the write mode)

  • DI-1909: Templates updated - New Parameters 'Unlock Cdc Table' and 'Lock Cdc Table' to configure the behaviour of CDC tables locking

Bug fixes

  • DI-1913: An error was thrown when trying to connect to a secure Kafka server, such as "KafkaException: javax.security.auth.login.LoginException: unable to find LoginModule class: org.apache.kafka.common.security.plain.PlainLoginModule"

  • DI-2162: The Component was not working on Topics having a dot in the name

  • DI-2658: Kafka Metadata - 'connect to database' context menu was not working

  • DI-1908: Templates updated - The 'Cdc Subscriber' parameter was ignored in some Templates on Lock / Unlock CDC steps

  • DI-1907: Templates updated - The 'Cdc Subscriber' parameter was ignored in some Templates when querying the source data

Version 2.1.0 (Kafka Component)

Feature improvements

  • DI-3713: Internal change on how some libraries are built to ease maintenance