
Data streaming, unlike integration paradigms based on batch processing or APIs, which focus mainly on the state of the business entities being handled, focuses on the collection, management and exchange of events between applications and systems.

An event describes a change that has occurred, at a specific moment in time, in the state of an entity of interest. Since the state of an entity can always be reconstructed from its change events, streaming systems allow application state to be transferred asynchronously, and in a highly scalable manner, between the systems involved.
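As a concrete illustration of this principle, the sketch below (plain Python, with hypothetical event and field names) rebuilds the current state of a customer entity by replaying its change events in order:

```python
# Minimal sketch: reconstructing entity state from an ordered stream of
# change events (event structure and field names are hypothetical).
def apply_event(state: dict, event: dict) -> dict:
    # Each event carries only the fields that changed at that moment.
    new_state = dict(state)
    new_state.update(event["changes"])
    return new_state

def replay(events: list) -> dict:
    # Folding the events in timestamp order yields the latest state.
    state = {}
    for event in sorted(events, key=lambda e: e["ts"]):
        state = apply_event(state, event)
    return state

events = [
    {"ts": 1, "changes": {"customer_id": 42, "status": "new"}},
    {"ts": 2, "changes": {"status": "active", "tier": "silver"}},
    {"ts": 3, "changes": {"tier": "gold"}},
]

print(replay(events))
# {'customer_id': 42, 'status': 'active', 'tier': 'gold'}
```

Because the full history is retained in the stream, any consumer can derive the same state independently, at its own pace.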

Kafka is one of the world's most widely used platforms for managing streaming data and real-time data integration. Confluent, founded by the creators of Kafka, expands these benefits with enterprise-level functionality, reducing TCO and simplifying management and usage.


We consider it essential to evaluate the real-time component from the outset when building new use cases, since subsets and snapshots of the data can always be derived from a stream for any type of consumption, whether real-time or batch. The reverse is not true: a flow managed in batch mode cannot produce a continuous stream as an output. Batch flows are nonetheless often preferred, because building real-time flows is perceived as more complex than managing batches of data; this overlooks the fact that evolving a batch flow later (for example, following a change in requirements) typically costs more than the additional initial effort of processing data as streams.

The greater integration speed provided by stream-based architectures is an important competitive advantage, in terms of both user satisfaction and the effectiveness of actions taken in response to events. Our conception of streaming platforms does not only envisage feeding them from operational data sources for consumption by operational and analytical systems, which is the standard use case; it is also open to collecting information from analytical systems (e.g. the output of a machine learning model) and propagating it to operational systems.

Consider, for example, how important it is in an omnichannel scenario to align the physical shop and e-commerce in real time, to provide personalised recommendations while the customer is still in the shop or on the site rather than the next day, or to detect fraud as banking transactions happen rather than afterwards.


Confluent Platform

The Confluent platform is built on Apache Kafka, the most popular open-source distributed streaming platform, whose key functionalities are:

  • Production and consumption of event streams
  • Reliable and scalable storage of events

Around Apache Kafka, Confluent has built a platform capable of supporting all the needs of organisations working with event streams. The solution comes in two forms:

  • Confluent Platform, for on-premise or public cloud deployment
  • Confluent Cloud, with fully managed services
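The produce/consume model behind these functionalities can be pictured with a toy in-memory log (a conceptual sketch only, not the actual Kafka API): a topic is an append-only sequence of events, and each consumer tracks its own offset, so multiple consumers read the same stream independently of the producers and of each other.

```python
class Topic:
    """Toy append-only log illustrating Kafka's topic/offset model
    (conceptual sketch only, not the real Kafka API)."""
    def __init__(self):
        self.log = []

    def produce(self, event):
        self.log.append(event)    # events are stored durably, in order
        return len(self.log) - 1  # the offset assigned to the event

class Consumer:
    """Each consumer keeps its own offset, so it is decoupled from
    producers and from other consumers."""
    def __init__(self, topic):
        self.topic = topic
        self.offset = 0

    def poll(self):
        batch = self.topic.log[self.offset:]
        self.offset = len(self.topic.log)
        return batch

orders = Topic()
orders.produce({"order_id": 1, "status": "created"})
orders.produce({"order_id": 1, "status": "shipped"})

billing, shipping = Consumer(orders), Consumer(orders)
print(billing.poll())   # billing sees both events so far
orders.produce({"order_id": 2, "status": "created"})
print(billing.poll())   # billing resumes from its own offset: only the new event
print(shipping.poll())  # shipping, polling later, still sees the full stream
```

In real Kafka, topics are additionally partitioned and replicated across brokers, but the decoupling shown here is the same.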

Each release of the Confluent Platform ships with the latest version of Apache Kafka and all the integrated services that interface with it. Some platform features are freely usable by the community; others are under a commercial licence.

The main advantages of adopting the Confluent platform include:

Productivity increase
enables a wider range of developers to use Kafka and accelerates the creation of streaming applications

Improved operational efficiency
reduces operational complexity while ensuring high performance and scalability as the volume of events handled grows

Easy integration in complex environments
allows Kafka to be used within the key constraints and requirements of a corporate production environment

Freedom of choice
the ability to use Kafka at enterprise level in any environment, whether on-premise, cloud (managed or not) or hybrid; in addition to the Confluent support included in the platform, a large community of experts is at your disposal

Wide range of features
advanced and unique tools for managing data, metadata and applications

Confluent Platform

The edition of the Confluent platform for self-managed provisioning on public cloud or on-premise.

The key functionalities of the platform are:

  • Collection and distribution of events
  • Selection and transformation of events
  • Event validation and versioning
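To make "validation and versioning" concrete, here is a simplified sketch (plain Python with made-up schemas, not the platform's actual Schema Registry API) of checking an event against a declared schema version before accepting it onto a topic:

```python
# Simplified sketch of event validation against versioned schemas
# (illustrative only; the real platform uses Schema Registry with
# formats such as Avro, Protobuf or JSON Schema).
SCHEMAS = {
    1: {"required": {"order_id", "status"}, "optional": set()},
    2: {"required": {"order_id", "status"}, "optional": {"channel"}},
}

def validate(event: dict, version: int) -> bool:
    schema = SCHEMAS[version]
    fields = set(event)
    # All required fields must be present, and no unknown fields allowed.
    allowed = schema["required"] | schema["optional"]
    return schema["required"] <= fields and fields <= allowed

print(validate({"order_id": 7, "status": "created"}, 1))                   # True
print(validate({"order_id": 7, "status": "created", "channel": "web"}, 1)) # False: unknown field in v1
print(validate({"order_id": 7, "status": "created", "channel": "web"}, 2)) # True: v2 adds an optional field
```

Rejecting malformed events at the boundary like this keeps every downstream consumer of the topic safe from unexpected payloads.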

Confluent Cloud

The fully managed SaaS edition of the Confluent platform specifically designed to exploit the full potential of the cloud. In addition to covering the functionality offered by the Platform version, it offers a number of additional benefits including:

  • Choice of three major cloud providers (AWS, Azure and GCP)
  • More than 70 fully managed connectors to facilitate integration management while reducing TCO
  • Elastic, on-demand scaling of resources, eliminating the up-front sizing effort typical of self-managed solutions
  • Storage decoupled from computation and automatically scalable without limits, by leveraging cloud object stores
  • Resilience with 99.99% SLA and multi-zone replication
  • Automatic and transparent updates

The platform offers several possibilities for infrastructure management: the Confluent Cloud solution removes this responsibility altogether as a SaaS offering, while the Confluent Platform can be deployed and managed autonomously, on-premise or in the public cloud.

Several tools are also available to support the development of real-time applications.

The simplification of platform management is a key differentiator compared to open-source versions or other vendors, and is made possible by a series of tools, including:

  • Confluent Control Center, a web interface from which the main management and monitoring operations on the Kafka cluster can be carried out
  • the Confluent Health+ service, which analyses Kafka cluster metrics and produces intelligent alerts on potential problems, along with advanced monitoring functions
  • the Confluent CLI, with which all administration tasks can be performed
  • Tiered Storage, which automatically offloads older data to object stores based on retention, reducing storage costs
  • Self-Balancing Clusters, which rebalance partitions automatically to optimise throughput, facilitate scaling and reduce management effort
  • Cluster Linking, for managing Kafka clusters distributed across multiple datacentres, regions or cloud providers
  • Multi-Region Clusters, which let consumers read events from follower brokers as well as leaders, drastically reducing cross-datacentre traffic between clients and brokers
  • Replicator, based on Kafka Connect, for replicating data and metadata between two Kafka clusters

One of the most critical aspects for the success of the architecture is effective and efficient data governance, and Confluent offers several tools to support it, including:

  • the Schema Registry, for validating and versioning the schemas of the events circulating on the platform
  • the Stream Catalog, a data catalogue for searching data and topics in Kafka by their schemas or associated tags
  • the Stream Lineage, a data lineage solution for streaming applications that identifies correlations between streams and data
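The kind of compatibility rule a schema registry enforces when a schema evolves can be illustrated with a simplified check (plain Python sketch with a hypothetical schema representation, not the Schema Registry API): a new version is backward compatible if readers using it can still decode events written with the previous version, i.e. every field it adds must carry a default.

```python
# Simplified backward-compatibility check between two schema versions
# (illustrative sketch; real registries compare Avro/Protobuf/JSON schemas
# and support several compatibility modes).
def backward_compatible(old: dict, new: dict) -> bool:
    # A reader on `new` can decode data written with `old` only if every
    # field in `new` either existed in `old` or has a default value.
    for field, spec in new.items():
        if field not in old and "default" not in spec:
            return False
    return True

v1 = {"order_id": {"type": "long"}, "status": {"type": "string"}}
v2 = {**v1, "channel": {"type": "string", "default": "web"}}  # OK: new field has a default
v3 = {**v1, "customer_id": {"type": "long"}}                  # breaks reads of old data

print(backward_compatible(v1, v2))  # True
print(backward_compatible(v1, v3))  # False
```

Enforcing such a rule centrally lets producers evolve their event formats without silently breaking the consumers already subscribed to a topic.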

As far as data security is concerned, the platform provides a range of dedicated features.


Quantyca has been a Confluent Premium Partner since 2015, one of the first in Italy. As a certified partner, we have worked together with Confluent to implement solutions for clients in sectors such as retail, utilities, industrial, insurance and financial services.

We are resellers of the Confluent platform and offer the following consulting services:

  • assessment of existing solutions
  • design and implementation of solutions based on event driven architectures
  • start-up of new projects
  • design of disaster recovery and business continuity solutions
  • migration projects from Apache Kafka to the Confluent platform
  • migration projects from Confluent Enterprise to Confluent Cloud
  • remote or in-house training
+7 Partnership (Channel Manager: Pietro La Torre)
+20 Projects successfully delivered
+20 Certifications (Confluent Certified Administrator for Apache Kafka, Confluent Certified Developer for Apache Kafka)

Success stories

Use cases

Onsite Event
Quantyca at Kafka Summit London 2024
Date and Time: 19/03/2024

Quantyca participated in Kafka Summit London 2024, a cutting-edge event exploring the latest in Apache Kafka, data streaming, and real-time technologies. This summit is pivotal for staying ahead...

Onsite Event
Quantyca at Data in Motion 2023
Date and Time: 26/10/2023

As Gold Sponsor at Data in Motion, we talked about mainframe offloading and stream processing to support BNL's digital channels. Data in Motion is the annual event dedicated to the...

Onsite Event
Quantyca at Kafka Summit London 2022
Date and Time: 25/04/2022

There are several solutions used today to make data managed by legacy systems available in real time. Compared to the others, however, Change Data Capture (CDC) is one of the...

Onsite Event
Data in Motion 2022
Date and Time: 06/10/2022

Digital Integration Hub for near-real-time monitoring of logistics: The Arcese case study. We attended Data in Motion, the Confluent Conference for the first time in Italy: it was a unique...

Online Webinar
Digital Integration Hub 2021
Date and Time: 07/07/2021

Main topics In this webinar, DIH will be presented as an architectural pattern that can decouple legacy systems from consumers and, at the same time, make data available at various...

Video Talk
Real-time Inventory with Kafka and Kafka Stream at Rinascente 2021
Date and Time: 25/10/2021

Consumer purchasing habits have drastically changed in recent years. Going to a physical store is no longer the only option for retail shopping. Retailers have evolved to meet this new...

Onsite Event
Quantyca at Kafka Summit Europe 2021
Date and Time: 11/05/2021

Legacy systems are the kings of our IT architectures. They govern the evolution of the technology ecosystem that hosts them because of the control they have gained over time over...



Build with Confluent: Quality gate through centralized computational policy enforcement


Data Contracts Management: Schema Registry and Beyond


Data Contracts Management_Kafka Summit 2024


Digital Integration Hub – Kafka Summit London 2022


Digital Integration Hub for near-real-time logistics monitoring: the Arcese case – Data in Motion Milan 2022


Digital Integration Hub – Webinar Slide Deck

Need personalised advice? Contact us to find the best solution!


Join the Quantyca team, let's be a team!

We are always looking for talented people to join the team, discover all our open positions.