Good As Gold: Kinetica Kafka Connector Earns Top Certification from Confluent
We’re excited to announce that our Kinetica Kafka Connector has achieved Gold Certified status from Confluent! The connector lets you use Apache Kafka to stream data into and out of Kinetica. Streaming data analytics is a top priority for real-time business, so we’ve invested in meeting Confluent’s requirements for Gold Certification around code development, best practices, security, and documentation. While the Kinetica Connector previously met the Confluent Standard Certification, we knew we had to raise the bar.
Why Does This Matter?
Analyzing real-time streaming data at massive scale is critical for digital businesses and a key component of active analytics. Your business needs to analyze and react to data simultaneously. Active analytical applications run continuously, updating as data arrives and directing other systems based on real-time results. Streaming data helps make that a reality.
Apache Kafka provides a unified, high-throughput, low-latency platform for handling real-time data feeds from internal and external systems, connected devices, and more. Our certified connector makes it easy to stream data directly between Kafka and Kinetica. Organizations using Kinetica can ingest real-time data streams from Kafka and analyze and act immediately on incoming data using our Active Analytics Platform. For example, you might ingest streaming data from sensors, mobile apps, IoT devices, and social media into Kinetica, combine it with data at rest, and analyze it in real time to improve the customer experience, deliver targeted marketing offers, and increase operational efficiency. This is ideal for use cases like real-time replenishment in retail, risk analysis in financial services, and threat assessment in cybersecurity. Leveraging the Kafka connector, Kinetica users can now analyze complex, multidimensional streaming and batch data interactively.
With the Gold Certification, we now fully support all of the latest Kafka Connect features in addition to the requirements of the Standard Certification:
- Schema Migration for the Sink Connector
- The Gold certified version of the Kinetica Kafka Connector can operate with a mix of data schema versions, tracking the latest schema changes through the Confluent Schema Registry. Configuring the connector to validate schema compatibility allows it to parse evolving data and save it into a single Kinetica table.
- Single Message Transform
- The Kinetica Kafka Connector’s support for Avro schema evolution and its integration with the Confluent Schema Registry give the end user the flexibility to change the original Kafka message format, adding or removing message fields without interrupting data ingest, and to conform to multiple message schemas at the same time without data loss.
- Control Center Integration
- The Kinetica Kafka Connector can be configured through the Confluent Control Center. Code packaging, documentation, and licensing follow the Confluent standard. The connector can also be managed via a REST interface, which enables integration with the Control Center and gives the end user better visibility into configuration options and the overall outcome of data ingestion.
- Kafka Connect API
- We’ve added detailed Kinetica Kafka Connector documentation and test suites so that end users can test data ingest in stand-alone mode before staging the connector in distributed, multi-task mode in a production environment and scaling it for better data throughput.
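To give a feel for how the schema-related features above come together, here is a rough sketch of a sink connector configuration wired to the Confluent Schema Registry. The converter settings are standard Confluent Platform configuration; the connector class name and the Kinetica-side property keys are illustrative assumptions, so check the connector documentation for the exact names.

```properties
# kinetica-sink.properties -- illustrative sketch, not a verbatim config
name=kinetica-sink
# Connector class name assumed for illustration
connector.class=com.kinetica.kafka.KineticaSinkConnector
tasks.max=1
topics=sensor-readings

# Standard Confluent settings: deserialize Avro values and validate them
# against the Schema Registry, enabling schema evolution on ingest
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081

# Hypothetical Kinetica connection settings (names are placeholders)
kinetica.url=http://localhost:9191
kinetica.table_prefix=kafka_
```

With compatibility checking enabled in the Schema Registry, records written under newer schema versions can continue to land in the same Kinetica table as fields are added or removed.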
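The stand-alone test workflow described above can be sketched with the standard Kafka Connect tooling. The worker and connector property file names are placeholders, and the status check uses the Kafka Connect REST interface (port 8083 by default):

```shell
# Run the connector in stand-alone mode for local testing
# (property file names are placeholders)
bin/connect-standalone.sh connect-standalone.properties kinetica-sink.properties

# Inspect connector and task state via the Kafka Connect REST API
curl http://localhost:8083/connectors/kinetica-sink/status
```

Once ingest looks correct locally, the same connector configuration can be submitted to a distributed Connect cluster and scaled out by raising `tasks.max`.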
Interested in learning more about how we work with Confluent? Check out our Confluent Integration page.
Nataliya is a Sr. Software Engineer on the ecosystem team at Kinetica.