
Overview

Centralized Management

Kafka Connection stores topic access information for Integration Flow or Connector steps in a single, central location, making it reusable.

Standard Configuration

Manages Kafka producer/consumer properties through propertiesMap, keeping configuration standardized across the messaging infrastructure.

Security

When Enable Secure is activated, SSL/TLS parameters are enforced, ensuring that critical data traffic is encrypted.

Test Function

The Test Connection function validates connection parameters before deployment, so configuration errors are caught earlier.

Connection Initiation

When a Kafka Message Queue connection is requested from within an Integration Flow or Connector, the system reads the configured connection parameters.

Connection Pool Management

The Kafka producer/consumer pool reuses existing connections or opens new ones, using values such as max.block.ms and reconnect.backoff.max.ms defined in propertiesMap.

Authentication

If Enable Secure is active, mutual certificate-based authentication is applied; optionally, SASL/SCRAM credentials are also read from propertiesMap.

Data Communication

Messages are sent to or consumed from the topic via the Kafka wire protocol over TCP; serializer settings are taken from the key.serializer and value.serializer fields in propertiesMap.
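
For illustration, here is a minimal Java sketch of what a propertiesMap with these fields resolves to in standard Kafka client terms; the broker address, topic, key, and value are hypothetical placeholders, not Apinizer internals.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaSendSketch {
    public static void main(String[] args) {
        // Values below mirror what a Connection's propertiesMap typically holds.
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092"); // hypothetical broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The topic comes from the Connection's topicName; key/value are sample data.
            producer.send(new ProducerRecord<>("audit.events.v1", "order-42", "created"));
        } // close() flushes any buffered messages
    }
}
```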

Connection Management

After the operation is completed, the connection returns to the pool; connections whose enable toggle is deactivated cannot be selected by any Flow.

Error Management

In case of a connection error, timeout, or authentication failure, retries are performed according to the retry.backoff.ms and max.block.ms values, and the result is forwarded to the Apinizer messaging service.

Integration Flow Messaging

Targeting common Kafka topics in “Send Message” or “Consume Message” steps within an Integration Flow

Connector Data Collection

Sharing centralized Kafka configuration in connector-based data collection processes

Scheduled Job Telemetry

Sending telemetry or logs to Kafka at certain intervals in Scheduled Jobs

Technical Features and Capabilities

Topic-Based Messaging

Thanks to the mandatory topicName field, each Connection targets a specific topic, which prevents selecting the wrong topic in Integration Flow steps.

Dynamic Kafka PropertiesMap

Desired Kafka client parameters are stored as key-value pairs in propertiesMap; string, integer, and secret values are distinguished by their MapValue types.

Built-in Name Uniqueness Check

A debounce mechanism in the UI prevents creating a Connection with a duplicate name and warns about conflicts early.

Environment-Based Configuration

Ability to define separate connection parameters for each environment (Development, Test, Production).

Enable/Disable Control

Activates or deactivates the Connection via the enable/disable toggle. In the passive state the connection cannot be used, but its configuration is preserved.

SASL/SCRAM Support

SASL mechanisms are added to propertiesMap, enabling secure authentication scenarios without code changes.

Keystore-Truststore Management

When Enable Secure is enabled, both a KeyStore and a TrustStore can be selected, and if necessary, new records can be created on the spot in Secret Manager.

Dynamic Protocol Selection

Multiple SSLContext protocols (TLSv1.2, TLSv1.3, etc.) can be selected with MultiSelect and stored on the same Connection.
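
As a hedged sketch (not Apinizer's internal implementation), the Enable Secure and Protocol Types selections correspond to standard Kafka SSL client properties like the following; the keystore/truststore paths and environment variable names are placeholders.

```java
import java.util.Properties;

public class KafkaSslSketch {
    static Properties sslProps() {
        Properties props = new Properties();
        props.put("security.protocol", "SSL");
        props.put("ssl.enabled.protocols", "TLSv1.2,TLSv1.3"); // Protocol Types selection
        // Placeholder paths standing in for the KeyStoreId / TrustStoreId resources:
        props.put("ssl.keystore.location", "/etc/pki/ks-prod-clients.jks");
        props.put("ssl.keystore.password",
                System.getenv().getOrDefault("KS_PASSWORD", "changeit")); // never hardcode
        props.put("ssl.truststore.location", "/etc/pki/ts-shared-root.jks");
        props.put("ssl.truststore.password",
                System.getenv().getOrDefault("TS_PASSWORD", "changeit"));
        return props;
    }
}
```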

Connection Test Feature

Ability to validate connection parameters before saving with the “Test Connection” button.

Export/Import Feature

Exporting Connection configuration as a ZIP file. Importing to different environments (Development, Test, Production). Version control and backup capability.

Connection Monitoring

Monitoring connection health, pool status, and performance metrics.

Connection Parameters

Name

Description: Connection name (must be unique)
Example Value: Production_KafkaTopic01
Notes: Must not start with a space; special characters should not be used

Environment

Description: Environment ID where the connection will be deployed
Example Value: Development
Notes: Changing environment provides parametric management within the same Connection

Topic Name

Description: Target topic name on Kafka
Example Value: audit.events.v1
Notes: The same topic should be used in producer and consumer steps

Properties Map

Description: Key-value pairs for Kafka client (bootstrap.servers, etc.)
Example Value: bootstrap.servers=broker1:9092
Notes: At least one record must exist; MapValue types must be selected correctly

Description

Description: Description about the Connection purpose
Default Value: (Empty)
Recommended Value: Prod audit topic publish connection

Enable Secure

Description: Enables SSL/TLS usage
Default Value: false
Recommended Value: true (Production)

Protocol Types

Description: Allowed SSLContext protocols when Enable Secure is enabled
Default Value: (Empty)
Recommended Value: TLSv1.2, TLSv1.3

KeyStoreId

Description: Keystore resource containing client certificate
Default Value: (Empty)
Recommended Value: ks-prod-clients

TrustStoreId

Description: Truststore resource verifying broker certificate
Default Value: (Empty)
Recommended Value: ts-shared-root

Deploy To Worker

Description: Whether the connection is deployed to workers; defaults to true in connectionConfigKafka.model.ts
Default Value: true
Recommended Value: true

Connection Timeout

Description: Maximum wait time for connection establishment
Default: 3000
Min: 1000 | Max: 60000
Unit: milliseconds

Request Timeout

Description: Maximum wait time for request response
Default: 3000
Min: 1000 | Max: 120000
Unit: milliseconds

Pool Size

Description: Maximum number of connections in Connection pool (client instance)
Default: 10
Min: 1 | Max: 200
Unit: count

Retry Backoff

Description: retry.backoff.ms value, wait between retries
Default: 3000
Min: 100 | Max: 10000
Unit: milliseconds
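
To make the mapping concrete, the following sketch shows how the defaults above could appear as standard Kafka client properties. The exact keys Apinizer sets internally may differ; socket.connection.setup.timeout.ms is offered only as the closest standard analogue of Connection Timeout, and the max.block.ms value is illustrative.

```java
import java.util.Properties;

public class KafkaTimeoutSketch {
    static Properties timeoutProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092");          // Properties Map example value
        props.put("socket.connection.setup.timeout.ms", "3000"); // Connection Timeout analogue (assumption)
        props.put("request.timeout.ms", "3000");                 // Request Timeout default
        props.put("retry.backoff.ms", "3000");                   // Retry Backoff default
        props.put("max.block.ms", "3000");                       // illustrative; bounds how long send() may block
        return props;
    }
}
```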

Usage Scenarios

Real-Time Audit

Situation: All services should publish messages to audit topic
Solution: topicName=audit.events, bootstrap.servers=cluster-prod:9092
Expected Behavior: Audit messages are collected in a single topic, audit team consumes from a single point

Multi-Environment Management

Situation: Same Connection should be used in different environments
Solution: Environment=Development, Enable Secure=false
Expected Behavior: Different broker URLs are managed with environment selection

Secure Production Publishing

Situation: Prod broker requires TLS
Solution: Enable Secure=true, ProtocolTypes=TLSv1.3, KeyStoreId=ks-prod
Expected Behavior: Certificate verification is ensured, messages are sent encrypted

High Traffic Queue

Situation: Sudden traffic increase
Solution: Pool Size=50, linger.ms=5, batch.size=32768
Expected Behavior: Producer batches grow, throughput increases

Retry Optimization

Situation: Broker occasionally does not respond
Solution: retry.backoff.ms=5000, retries=10
Expected Behavior: No message loss with automatic retries

SLA Monitoring (optional)

Situation: Message delays should be measured
Solution: delivery.timeout.ms=60000, enable.idempotence=true
Expected Behavior: Producer timeouts are logged, SLA reports are fed
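
Taken together, the last three scenarios describe a single producer profile; here is a hypothetical sketch using exactly the values quoted above (the broker address is a placeholder).

```java
import java.util.Properties;

public class KafkaTuningSketch {
    static Properties tuningProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "cluster-prod:9092"); // placeholder
        // High Traffic Queue: larger batches, small linger window
        props.put("linger.ms", "5");
        props.put("batch.size", "32768");
        // Retry Optimization: survive occasional broker hiccups
        props.put("retries", "10");
        props.put("retry.backoff.ms", "5000");
        // SLA Monitoring: bounded delivery time, deduplicated retried sends
        props.put("delivery.timeout.ms", "60000");
        props.put("enable.idempotence", "true");
        return props;
    }
}
```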

Connection Configuration

Creating a New Kafka Connection

Configuration Steps

1. Navigate to Creation Page

  • Go to Connection → Kafka section from the left menu.
  • Click the [+ Create] button in the top right.
2. Enter Basic Information

Enable Status (Active Status):
  • Set the active/passive status with the toggle. New connections are active by default.
Name - Mandatory:
  • Example: Production_KafkaAudit
  • Enter a unique name; it must not start with a space.
  • The system checks automatically. Green checkmark: name is available. Red cross: name already exists.
Description:
  • Example: “Audit topic producer connection”
  • Max. 1000 characters.
  • Describe the purpose of the Connection.
3. Environment Selection

  • Select environment from dropdown menu: Development, Test, or Production.
  • Different connection parameters can be defined for each environment.
4. Kafka Specific Parameters - Properties & Topic

  • Enter broker URLs, serializer settings, and timeout values in the propertiesMap table.
  • Don’t forget to select a valueType for each record; use INTEGER for numeric values.
  • Write the topic to connect to in the Topic Name field.
You can achieve high availability by adding multiple broker URLs.
5. Kafka Specific Parameters - Secure Messaging

  • Enable TLS by turning on the Enable Secure toggle.
  • Select supported SSLContext protocols in the Protocol Types field.
  • Select KeyStore and TrustStore or create a new keystore.
Always use SSL/TLS in Production environment and select secure protocols.
6. Timeout and Connection Pool Settings

  • Add parameters such as max.block.ms, request.timeout.ms, retry.backoff.ms to propertiesMap.
  • Determine settings such as linger.ms, batch.size, connections.max.idle.ms according to traffic volume.
7. Security and Authentication Settings

  • If using SASL/SCRAM, add sasl.mechanism, sasl.jaas.config keys.
  • Ensure the correct TrustStore is selected for broker certificates.
  • Store sensitive credential values as Secret type MapValue instead of plaintext.
Always store sensitive information as Secret type MapValue.
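
A hedged sketch of what SASL/SCRAM entries typically look like in standard Kafka client terms; the environment variable names are hypothetical, and in Apinizer the password belongs in a Secret-type MapValue rather than inline JAAS config.

```java
import java.util.Properties;

public class KafkaSaslSketch {
    static Properties saslProps() {
        // Hypothetical env vars; in Apinizer use Secret MapValues, never plaintext.
        String user = System.getenv().getOrDefault("KAFKA_SASL_USER", "svc-user");
        String pass = System.getenv().getOrDefault("KAFKA_SASL_PASSWORD", "change-me");
        Properties props = new Properties();
        props.put("security.protocol", "SASL_SSL"); // SASL authentication over TLS
        props.put("sasl.mechanism", "SCRAM-SHA-512");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                + "username=\"" + user + "\" password=\"" + pass + "\";");
        return props;
    }
}
```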
8. Test Connection

  • Click the [Test Connection] button.
  • Test whether connection parameters are correct.
  • Success: Green confirmation message
  • Failed: Error details are shown
9. Saving

  • Click the [Save and Deploy] button in the top right.
Checklist:
  • Unique name
  • Mandatory fields filled
  • Test connection successful (recommended)
Result:
  • Connection is added to the list
  • Becomes available for use in Integration Flow and Connector steps
  • Becomes active according to environment
Connection created successfully! You can now use it in Integration Flow and Connector steps.

Deleting Connection

Deletion Process

Select Delete from the menu at the end of the row, or click the [Delete] button on the connection detail page.

Deletion Tips

Check Before Deleting: The connection may be used in Integration Flow or Connector steps; if necessary, assign an alternative connection first, and back up with Export before deleting.

Alternative: Deactivation
  • Use the Disable option instead of deleting.
  • Connection becomes passive but is not deleted.
  • Can be reactivated when needed.

Exporting/Importing Connection

In this step, users can export existing connections for backup, transfer to different environments, or sharing purposes, or import a previously exported connection again. This process is used to maintain data integrity in version management, transitions between test and production environments, or inter-team sharing processes.

Method 1

Select ⋮ → Export from the action menu. The ZIP file is downloaded automatically.

Method 2

Click the [Export] button on the connection detail page. The ZIP file is downloaded.

File Format

Format: Date-connection-ConnectionName-export.zip
Example: 13 Nov 2025-connection-Production_Kafka-export.zip

ZIP Contents

  • Connection JSON file
  • Metadata information
  • Dependency information (e.g., certificates, key store)

Usage Areas

  • Backup
  • Transfer between environments (Test → Prod)
  • Versioning
  • Team or project-based sharing

Import Steps

  • Click the [Import Kafka] button on the main list.
  • Select the downloaded ZIP file.
  • The system checks: Is the format valid? Is there a name conflict? Are the dependencies present?
  • Then click the [Import] button.

Import Scenarios

Scenario 1: Name Conflict → Overwrite the old connection or create it with a new name.
Scenario 2: Missing Dependencies → Create the missing certificates or key stores first, or exclude them during import.

Usage Areas of Connection

Creating and Activating Connection

Steps:
  1. Create the Connection.
  2. Validate the connection with Test Connection.
  3. Save and activate with Save and Deploy.
  4. Ensure the Connection is in Enabled status.

Usage in Integration / Connector Steps

The Connection is selected in steps that require communication with external systems such as message queues, topics, email, FTP/SFTP, LDAP, and similar. Examples: steps such as “Send Message”, “Consume Message”, “Upload File”, and “Read Directory”. The connection is chosen from the Connection field in the configuration of these steps.

Scheduled Job Usage

In scheduled tasks (e.g., sending messages at certain intervals, file processing, etc.), access to external systems is provided by selecting the connection. When the connection changes, the job's execution behavior is updated accordingly.

Usage for Testing Purposes

The correctness of the connection can be checked independently of the Integration Flow with the Connection Test feature. This test is critical during debugging.

Best Practices

Topic and Partition Planning

Bad: Writing all messages to a single partition.
Good: Spreading traffic by increasing partition count.
Best: Determining partition plan per topic according to consumer count and throughput needs

PropertiesMap Versioning

Bad: Adding values randomly.
Good: Manually tracking changes.
Best: Keeping propertiesMap changes under version control with export files

Certificate Lifecycle Management

Bad: Not tracking keystore/truststore expiration dates.
Good: Keeping a manual calendar.
Best: Automatically planning certificate renewals with Secret Manager events and monitoring alarms

Monitoring and Alerting

Bad: Not monitoring connection health.
Good: Performing manual tests.
Best: Tracking Connection Monitoring metrics with APM/Prometheus and generating automatic alarms

Environment Management

Bad: Using the same connection parameters in all environments.
Good: Creating separate connections for each environment.
Best: Managing all environments in a single connection using the Environment option, changing only the environment selection when moving between environments

Connection Test

Bad: Saving and deploying the connection without testing.
Good: Validating with Test Connection before saving.
Best: Testing after every parameter change, performing full integration test in test environment before going to production

SASL Configuration

Store SASL username/password or token information as Secret type MapValue; do not leave inline credentials in JAAS config

Broker Access Segmentation

Grant access to Kafka brokers only from whitelisted IP ranges, close unnecessary ports

Log Masking

Mask log lines containing bootstrap or credentials; do not leave plaintext credentials in debug logs

Credential Management

Store sensitive information such as usernames and passwords using environment variables or secret manager. Do not hardcode credentials in code or configuration files. Update passwords periodically

SSL/TLS Usage

Always enable SSL/TLS in Production environment. Use self-signed certificates only in development environment. Track certificate expiration dates and renew them on time

Access Control

Allow only authorized users to change Connection configuration. Store connection change logs. Apply change approval process for critical connections

Not Using Multiple Broker URLs

Why to avoid: Connection breaks in broker failure.
Alternative: Define multiple bootstrap.servers addresses

Selecting Wrong Serializer Types

Why to avoid: Messages cannot be deserialized, error occurs.
Alternative: Determine key/value serializers according to message format
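
For illustration, both alternatives above sketched from the consumer side; the broker addresses, group id, and topic are assumed placeholders.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaConsumeSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Multiple bootstrap servers: the client survives a single broker failure.
        props.put("bootstrap.servers", "broker1:9092,broker2:9092,broker3:9092");
        props.put("group.id", "audit-consumers"); // placeholder consumer group
        // Deserializers must match how the producer serialized the messages.
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("audit.events.v1"));
            // Single poll for brevity; real consumers poll in a loop.
            consumer.poll(Duration.ofSeconds(1))
                    .forEach(r -> System.out.println(r.key() + " -> " + r.value()));
        }
    }
}
```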

Keeping SASL Parameters as Plaintext

Why to avoid: Risk of credential leakage.
Alternative: Use Secret MapValue

Using Production Connection in Test Environment

Why to avoid: Test data may be written to production system, real users may be affected, security risk occurs.
Alternative: Create separate connections for each environment, use the environment parameter, and distinguish connection names with an environment prefix (Test_, Prod_)

Very Low Timeout Values

Why to avoid: Connection constantly times out in network delays, Integration steps fail.
Alternative: Adjust timeout values according to real usage scenarios, measure network latency and determine timeouts accordingly

Not Using Connection Pool

Why to avoid: New connection is opened for each request, performance decreases, resource consumption increases, target system load increases.
Alternative: Enable connection pool, adjust pool size according to traffic volume, set up pool monitoring

Batch Size

Recommendation: Adjust batch.size value according to message size (e.g., 32 KB).
Effect: Throughput increases with fewer network round trips

Compression

Recommendation: Select snappy or lz4 as compression.type.
Effect: Bandwidth usage toward the broker decreases

Using Async Send

Recommendation: Prefer asynchronous production by optimizing acks and linger.ms settings.
Effect: Client wait time shortens
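
A minimal sketch of asynchronous sending with a completion callback; the acks and linger.ms values are illustrative, not prescribed defaults.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AsyncSendSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092"); // placeholder
        props.put("acks", "1");      // leader-only ack: lower latency than acks=all
        props.put("linger.ms", "5"); // small wait so batches can fill
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // send() returns immediately; the callback runs when the broker responds.
            producer.send(new ProducerRecord<>("audit.events.v1", "k", "v"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace(); // route to real error handling
                        }
                    });
        }
    }
}
```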

Connection Pool Optimization

Recommendation: Adjust pool size according to peak traffic (recommended: concurrent request count × 1.5), set idle connection timeouts, perform pool health check.
Effect: Connection opening cost decreases by 80%, response times decrease, resource usage is optimized
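
The sizing rule above as a tiny worked example; the measured peak of 40 concurrent requests is hypothetical.

```java
public class PoolSizeSketch {
    public static void main(String[] args) {
        int peakConcurrentRequests = 40;                              // measured peak, hypothetical
        int poolSize = (int) Math.ceil(peakConcurrentRequests * 1.5); // recommended factor
        poolSize = Math.min(poolSize, 200);                           // Pool Size max from this page
        System.out.println("Pool Size = " + poolSize);                // prints 60
    }
}
```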

Timeout Values Optimization

Recommendation: Measure real network latency, adjust timeout values accordingly, avoid very low or very high timeouts.
Effect: Unnecessary waits are prevented, fast fail-over is provided, user experience improves

Connection Monitoring

Recommendation: Monitor connection pool usage, track timeout rates, perform connection health check, set up alerting.
Effect: Problems are detected proactively, performance bottlenecks are identified early, downtime decreases

Troubleshooting

Possible causes: The topic name may be incorrect, the topic may not exist on the broker, or ACL permissions may be missing.

1. Topic Name: Verify the topicName field.
2. Topic Status: Check the topic status with the broker administrator.
3. ACL Permissions: Add the relevant user to the ACL policies.
Possible causes: Wrong SASL information, an invalid certificate, or a missing TrustStore.

1. SASL Update: Update the SASL MapValues.
2. Certificate Check: Check certificate expiration.
3. TrustStore Check: Ensure the correct TrustStore is selected.
Possible causes: Network delay, the target system responding slowly, or a timeout value that is too low.

1. Network Check: Check network connectivity.
2. System Health: Check the target system's health.
3. Timeout Settings: Increase timeout values.
4. Log Review: Review connection logs.
Possible causes: Wrong username/password, expired credentials, or a permission problem.

1. Credentials: Verify the credentials.
2. User Status: Check that the user is active in the target system.
3. Permission Check: Check that the necessary permissions are granted.
4. Certificate Check: Check SSL/TLS certificates.
Possible causes: The pool size may be too low, a connection leak may exist, or traffic may be too high.

1. Pool Size: Increase the pool size.
2. Connection Check: Check that connections are properly closed.
3. Idle Timeout: Set idle connection timeouts.
4. Metric Monitoring: Monitor connection usage metrics.
Possible causes: A different connection may be selected in the Integration/Connector step, the step may be misconfigured, or the Flow/Job may not have been redeployed.

1. Enable Toggle: Check that the Connection's enable toggle is active.
2. Connection Selection: Verify that the correct connection is selected in the Integration Flow.
3. Connection Deploy: Redeploy the Connection.
4. Flow/Job Deploy: Redeploy the Integration Flow or Job.
5. Log Check: Check the Gateway logs.

Frequently Asked Questions (FAQ)

Q: Can the same Connection be used for multiple topics?
A: Each Connection focuses on a single topic, but it can be quickly duplicated for different topics; this way permissions and monitoring are kept separate.

Q: Can TLS be disabled?
A: This is possible for the Development environment, but disabling TLS in Production is not recommended; keep it enabled in line with your security policies.

Q: How should passwords and other sensitive values be stored in propertiesMap?
A: When adding a MapValue, select SECRET as the valueType; the UI masks the values and stores them encrypted during export.

Q: How is the broker list used when connecting?
A: Brokers are tried starting from the first reachable server in the list; DNS round-robin or a full bootstrap list is recommended for high availability.

Q: Can I save a connection even if Test Connection fails?
A: You can, but it is not recommended; the Integration Flow will keep failing after deployment. Fix the error in testing first.

Q: Can the same connection be used in multiple Integration Flow or Connector steps?
A: Yes, the same connection can be used in multiple Integration Flow or Connector steps. This provides centralized management and guarantees configuration consistency. However, changes made to the connection will affect all usage locations, so care should be taken.

Q: Is using a connection pool mandatory?
A: Using a connection pool is not mandatory but is strongly recommended in high-traffic systems. Reusing existing connections instead of opening a new connection for each request significantly increases performance.

Q: Should a separate connection be created for each environment?
A: Yes, it is recommended to create separate connections for each environment. Alternatively, you can manage all environments in a single connection using the environment parameter. This approach provides easier management and less error risk.

Q: Why is my connection not being used even though it is defined?
A: Several reasons may exist:
  1. The Connection's enable toggle may be passive
  2. A different connection may be selected in the Integration step
  3. The Connection may not be deployed
  4. The Integration Flow may not have been redeployed yet