Configuring the Kafka Collector
You can configure the Kafka Collector to collect data from various data sources.
- To configure the Kafka Collector, navigate to Administration > Configuration > Collectors and Connectors.
- From the Collectors section, click Add.
- From the Collector Selection page, select kafka-collector. The kafka-collector page is displayed. You must provide the source Kafka topic that the data comes from, and the mapping that is used to process incoming records.

The following parameters configure the Kafka Collector.

General

- Name of the collector: Provide a name for the collector. Default: NA.
- Type: Choose one of the following (default: Metrics):
  - Metrics
  - Events
- Data Centre: Select the location of the collector data. Default: NA.

Source Kafka Configuration

- Topic: Provide the source Kafka topic that the data comes from. Default: metrics if the collector type is Metrics, or events if the collector type is Events.
- Bootstrap servers: Provide the source Kafka broker details. Default: edge-kafka-bootstrap:9093.
- Transport Layer Security (TLS): Select the Transport Layer Security (TLS) toggle button, and click Upload TLS certificate to upload the mandatory security certificate from your local machine.
- Authentication: Select the Authentication toggle button to enable authentication, then select the authentication type from the following and provide the respective details:
  - Plain: Provide a user name and password.
  - SCRAM-SHA-512: Provide a user name and password.
  - TLS: Upload the TLS certificate and TLS key from your local machine.
  - GSSAPI: Provides an alternate security protocol mechanism (Kerberos) for both client and server authentication. The mandatory fields are:
    - Principal: Enter the Kerberos principal, a unique identity to which Kerberos can assign tickets.
    - Realm: Enter the realm name.
    - KDC: Enter the Key Distribution Center address.
    - Kerberos Service Name: Enter the identifier of a service instance.
    - Keytab: Upload the keytab file from your local machine.

    The Kerberos service DNS must be reachable from all the VMware Telco Cloud Service Assurance nodes.
  - OAuth: Depending on the selected OAuth type, the following fields are mandatory:
    - Client Secret: Provide the Client ID and Client Secret details.
    - Refresh Token: Provide the Client ID and Refresh Token details.
    - Access Token: Provide the Access Token details.
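The TLS and authentication settings above correspond to standard Kafka client properties. The following is a minimal sketch, not the collector's actual internals: it expresses the UI choices as librdkafka-style property names, and all file paths and credentials are hypothetical placeholders.

```python
# Illustrative sketch only: the collector UI gathers these values through
# toggles and upload fields; here they are shown as librdkafka-style client
# properties. Paths and credentials are hypothetical placeholders.

def source_kafka_config(auth_type, **creds):
    """Build a client-property dict for the source Kafka connection."""
    conf = {
        "bootstrap.servers": "edge-kafka-bootstrap:9093",  # default from the table
        "security.protocol": "SSL",                        # TLS toggle enabled
        "ssl.ca.location": creds.get("ca_cert", "/tmp/ca.crt"),
    }
    if auth_type in ("PLAIN", "SCRAM-SHA-512"):
        # Plain and SCRAM-SHA-512 both take a user name and password
        conf.update({
            "security.protocol": "SASL_SSL",
            "sasl.mechanism": auth_type,
            "sasl.username": creds["username"],
            "sasl.password": creds["password"],
        })
    elif auth_type == "TLS":
        # Mutual TLS: client certificate and key uploaded in the UI
        conf.update({
            "ssl.certificate.location": creds["tls_cert"],
            "ssl.key.location": creds["tls_key"],
        })
    return conf

conf = source_kafka_config("SCRAM-SHA-512", username="collector", password="secret")
```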
Avro Schema Support

- Avro Schema Support: Select the Avro Schema Support toggle button to enable Avro schema support. Only the Confluent schema registry is supported. Default: NA.
- Schema Registry URL: Provide the registry URL of the schema in the Schema Registry URL field. Only the Confluent schema registry is supported.
- Schema Registry Certificate: Select the Schema Registry Certificate toggle button to upload the TLS certificate for the schema registry. This selection is mandatory only when the Schema Registry URL is HTTPS.
- Client mTLS: Select the Client mTLS toggle button to upload the TLS certificate and TLS key for the client.
- Authentication: Select the Authentication toggle button to enable authentication, then select the authentication type and provide the respective details:
  - Plain: Provide a user name and password.
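To make the Avro settings concrete, the sketch below expresses them as Confluent-style registry client properties. This is an assumption-laden illustration, not the collector's implementation; the URL, paths, and credentials are hypothetical.

```python
# Hypothetical illustration of the Avro / schema-registry settings gathered
# by the toggles above, expressed as Confluent-style property names.

def schema_registry_config(url, registry_ca=None, client_cert=None,
                           client_key=None, username=None, password=None):
    conf = {"schema.registry.url": url}  # only the Confluent registry is supported
    if url.startswith("https://"):
        # The registry TLS certificate is mandatory when the URL is HTTPS
        conf["schema.registry.ssl.ca.location"] = registry_ca or "/tmp/registry-ca.crt"
    if client_cert and client_key:
        # Client mTLS toggle: client certificate and key
        conf["schema.registry.ssl.certificate.location"] = client_cert
        conf["schema.registry.ssl.key.location"] = client_key
    if username and password:
        # Plain authentication toggle
        conf["basic.auth.user.info"] = f"{username}:{password}"
    return conf

conf = schema_registry_config("https://registry.example.com:8081",
                              username="svc", password="secret")
```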
Destination Kafka Configuration (available only if the collector type is Events)

- Topic: Provide the Kafka topic to which the processed event data must be written. Default: vsa_events_raw.
- Bootstrap servers: Provide the destination Kafka broker details. Default: edge-kafka-bootstrap:9093.
- Transport Layer Security (TLS): Select the Transport Layer Security (TLS) toggle button, and click Upload TLS certificate to upload the mandatory security certificate from your local machine.
- Authentication: Select the Authentication toggle button to enable authentication, then select the authentication type from the following and provide the respective details:
  - Plain: Provide a user name and password.
  - SCRAM-SHA-512: Provide a user name and password.
  - TLS: Upload the TLS certificate and TLS key from your local machine.
  - GSSAPI: Provides an alternate security protocol mechanism (Kerberos) for both client and server authentication. The mandatory fields are:
    - Principal: Enter the Kerberos principal, a unique identity to which Kerberos can assign tickets.
    - Realm: Enter the realm name.
    - KDC: Enter the Key Distribution Center address.
    - Kerberos Service Name: Enter the identifier of a service instance.
    - Keytab: Upload the keytab file from your local machine.

    The Kerberos service DNS must be reachable from all the VMware Telco Cloud Service Assurance nodes.
  - OAuth: Depending on the selected OAuth type, the following fields are mandatory:
    - Client Secret: Provide the Client ID and Client Secret details.
    - Refresh Token: Provide the Client ID and Refresh Token details.
    - Access Token: Provide the Access Token details.
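The GSSAPI fields above (Principal, Realm, KDC, Kerberos Service Name, Keytab) map to standard Kerberos client settings. The sketch below shows one plausible mapping using librdkafka-style property names; every value in it is a hypothetical placeholder.

```python
# Sketch of how the GSSAPI (Kerberos) fields could translate to
# librdkafka-style client properties. All values are hypothetical.

def gssapi_config(principal, realm, kdc, service_name, keytab):
    # The realm and KDC normally live in /etc/krb5.conf on each node; note
    # that the Kerberos service DNS must be resolvable from every VMware
    # Telco Cloud Service Assurance node.
    krb5 = {"realm": realm, "kdc": kdc}
    client = {
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "GSSAPI",
        "sasl.kerberos.service.name": service_name,  # service instance identifier
        "sasl.kerberos.principal": f"{principal}@{realm}",
        "sasl.kerberos.keytab": keytab,              # uploaded keytab file
    }
    return krb5, client

krb5, client = gssapi_config("collector", "EXAMPLE.COM", "kdc.example.com",
                             "kafka", "/tmp/collector.keytab")
```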
Advanced Configuration

- Application Id: Provide an identifier for the stream processing application; it must be unique within the Kafka cluster.
- Auto Offset Reset: Possible values are (default: Latest):
  - Earliest: Automatically reset the offset to the earliest offset.
  - Latest: Automatically reset the offset to the latest offset.
  - None: Throw an exception to the consumer if no previous offset is found for the consumer's group.
  - Anything else: Throw an exception to the consumer.
- Group ID: Provide a unique string that identifies the Connect cluster group this worker belongs to.

Mapper

- Kafka Mapper: The name of the Kafka Mapper to use with this collector. The drop-down lists all Kafka Mappers configured for the type selected earlier. See the Kafka Mapper section for more information. Default: the first entry of the drop-down list.
- Mapping Definition Preview: Preview the selected mapping schema definition as configured in the Kafka Mapper. Default: NA.

- Click Create Collector.
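The Advanced Configuration values correspond to standard Kafka property names: application.id is a Kafka Streams setting, while group.id and auto.offset.reset are consumer settings. A minimal sketch with hypothetical example values:

```python
# Sketch of the Advanced Configuration values as standard Kafka property
# names. The identifier values are hypothetical examples.

advanced = {
    "application.id": "kafka-collector-metrics-1",  # must be unique in the cluster
    "group.id": "kafka-collector-group",            # identifies the worker's group
    # earliest -> start from the oldest retained record when no offset exists
    # latest   -> start from new records only (the default)
    # none     -> raise an error if no committed offset is found
    "auto.offset.reset": "latest",
}

valid_resets = {"earliest", "latest", "none"}
assert advanced["auto.offset.reset"] in valid_resets
```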