Follow the Stream: Your Dedicated Community Content | Financial Services Issue 3.0, February 2025

The Year of DSP: Data Streaming in 2025 & Shift Left

Welcome to the third issue of Follow the Stream! To kick off 2025, this edition is dedicated to the concept of “Shift Left”—a data processing and governance paradigm that moves processing closer to the source and delivers fresh, trustworthy data to every downstream consumer. Covering data streaming predictions and trends for the upcoming year, we explore how Shift Left architecture is revolutionizing data quality and providing reusable data products for both operational and analytical use cases in financial services.
See Industry Updates

Data Streaming Fundamentals

Data Streaming Crash Course 

Let's explore the basics of Shift Left & Data Streaming Platforms.

Shift Left is a rethink of how to circulate, share, and manage data. Along with a data streaming platform, it enables organizations to build data once, build it right, and reuse it anywhere within moments of its creation.
Data governance plays an important role in ensuring data quality, security, and compliance across the data lifecycle, while data products built around real-time streams provide reusable, high-quality data assets for efficiency and collaboration.
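To make the idea concrete, here is a minimal Kafka Streams sketch of validating records as they are produced, so every downstream consumer, operational or analytical, reads the same trustworthy stream. The topic names, broker address, and validation rule are illustrative assumptions, not from a specific Confluent example:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ShiftLeftValidation {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "shift-left-validation");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Hypothetical raw topic of JSON payment strings.
        KStream<String, String> raw = builder.stream("payments.raw");

        // Validate once, close to the source, instead of letting every
        // consumer re-implement its own cleanup downstream.
        raw.filter((key, value) -> value != null && !value.isBlank())
           .to("payments.validated");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```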

Industry in Motion: General Updates

3 Data Engineering Trends Riding Apache Kafka®, Apache Flink®, and Apache Iceberg

The Apache Kafka, Flink, and Iceberg communities are continuously evolving, offering innovative ways for engineers to manage and process data at scale. From re-envisioning microservices with Flink streaming applications to enabling real-time AI model applications, these tools are reshaping data integration. With strong community contributions, especially to Iceberg, data governance and real-time analytics are set to accelerate, revolutionizing how businesses manage their data infrastructure.
Read Article
Available in German, French, and Spanish!

The Data Streaming Landscape 2025

“Data streaming is a new software category. It has grown from niche adoption to becoming a fundamental part of modern data architecture, leveraging open source technologies like Apache Kafka® and Apache Flink®. With real-time data processing transforming industries, the ecosystem of tools, platforms, and cloud services has evolved significantly.” In this annual overview, Kai Waehner, Confluent’s Field CTO, explores the data streaming landscape of 2025, analyzing key players, trends, and market dynamics shaping the space.
Read Blog

Ushering in a New Era of Data Streaming: Confluent Recognized as a Challenger in the 2024 Gartner® Magic Quadrant™ for Data Integration Tools

Recognized as a Challenger in Gartner’s 2024 Magic Quadrant for Data Integration Tools, Confluent predicts that the future of data integration will focus on universal data products and event-driven flows, with real-time streaming bridging gaps between operational and analytical data. This shift will drive innovation in areas like fraud detection, personalized experiences, and supply chain optimization, making data streaming the standard for faster, more efficient decision-making.
Read Blog

New in Confluent Cloud: Extending Private Networking Across the Data Streaming Platform

Confluent’s Q4 2024 launch wrapped up the year with a host of powerful features paving the way for even more innovation in 2025. Focusing on delivering private networking and enhanced security across the data streaming platform, key features include mTLS authentication, AWS PrivateLink for Schema Registry, Apache Flink® UDFs for custom stream processing, and WarpStream Orbit for seamless Apache Kafka® migrations.
Read Blog
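As an illustration of what a custom Flink UDF can do in this context, here is a minimal scalar function sketch that masks account numbers before they leave the processing layer. The class name and masking rule are hypothetical, not from the launch announcement:

```java
import org.apache.flink.table.functions.ScalarFunction;

// Hypothetical UDF: keep only the last four characters of an account
// number so sensitive data never reaches downstream topics in the clear.
public class MaskAccountNumber extends ScalarFunction {
    public String eval(String account) {
        if (account == null || account.length() <= 4) {
            return account;
        }
        return "****" + account.substring(account.length() - 4);
    }
}
```

Once registered (for example, with tableEnv.createTemporarySystemFunction("MASK_ACCOUNT", MaskAccountNumber.class)), it can be called from Flink SQL like any built-in function; packaging and uploading UDFs to Confluent Cloud follows the workflow described in the launch blog.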

Executive's Brief: Data Strategy & Leadership

Data Streaming in Real Life: Banking - Real-Time Data Processing in the Financial Services Industry

Some of the most common use cases for data streaming in financial services globally include fraud detection, mainframe offloading, and cross-country data mesh creation to unify streaming data for real-time action. Kai Waehner, Global Field CTO, walks through real-life examples from 10X Banking, Raiffeisen Bank International, Capital One, and Royal Bank of Canada, showcasing how the industry is leveraging data streaming to drive innovation, reduce costs, and scale revenue growth.
Watch on YouTube

Predictive Analytics: How GenAI and Data Streaming Work Together to Forecast the Future

Predictive analytics, powered by data streaming and GenAI, is transforming risk management, fraud detection, and investment strategies. By analyzing historical and real-time data with machine learning models, predictive analytics helps financial institutions forecast market trends, assess credit risk, and optimize customer portfolios with greater precision. Generative AI amplifies these capabilities by generating diverse scenarios, filling in missing data, and dynamically adjusting predictions as new information emerges.
Read Blog

Shift Left: Unifying Operations and Analytics with Data Products

The need for high-quality business data is greater than ever, so preventing and mitigating bad data—across the entire business—has become a critical capability. Shifting data processing and governance “left” eliminates duplicate pipelines, reduces the risk and impact of bad data at the source, and leverages high-quality data products for both operational and analytical use cases.
Read Ebook
Available in German and French!

Data Streaming for Cybersecurity & Fraud

Data Streaming in Real Life: Fraud Detection

How can data streaming help detect fraud in real time at any scale? Kai Waehner, Global Field CTO at Confluent, shows how a fraud detection solution based on data streaming and Apache Kafka® can help banks reduce risk and actually save money. Based on real-life examples, Waehner discusses cost-reducing automation scenarios, how risk can be reduced when fraud is detected, and opportunities to improve customer satisfaction and experience.
Watch on YouTube
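One common pattern behind such solutions is a streaming velocity check: counting events per key in short windows and alerting on outliers. Below is a minimal Kafka Streams sketch; the topic names, one-minute window, and ten-transaction threshold are illustrative assumptions, not taken from the talk:

```java
import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.TimeWindows;

public class VelocityCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "fraud-velocity-check");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Assumed layout: key = card ID, value = serialized transaction.
        KStream<String, String> transactions = builder.stream("transactions");

        // Count transactions per card in tumbling 1-minute windows and emit
        // an alert when a card exceeds the illustrative threshold of 10.
        transactions.groupByKey()
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
                .count()
                .toStream()
                .filter((windowedCardId, count) -> count > 10)
                .map((windowedCardId, count) -> KeyValue.pair(
                        windowedCardId.key(), "ALERT: " + count + " txns in one minute"))
                .to("fraud.alerts");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```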

Three Ways Generative AI and Data Streaming are Enhancing Cybersecurity in EMEA

Generative AI and data streaming are enhancing cybersecurity by providing real-time threat detection and response, behavioral analysis, and threat intelligence sharing. By leveraging real-time data processing and AI-powered analytics, organizations can minimize risks and human error as well as enable better collaboration across borders.
Read Article

Architect's Blueprint: Data Systems Design

The Shift Left Architecture – From Batch and Lakehouse to Real-Time Data Products with Data Streaming

“The Shift Left Architecture enables a data mesh with real-time data products to unify transactional and analytical workloads with Apache Kafka, Flink, and Iceberg. Consistent information is handled with stream processing or ingested into Snowflake, Databricks, Google BigQuery, or any other analytics/AI platform to increase flexibility, reduce cost, and enable a data-driven company culture with faster time-to-market for building innovative software applications.”
Read Blog
Available in German, French, Spanish, Japanese, and Korean!
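For a feel of how such a real-time data product can be defined, here is a minimal Flink SQL sketch (driven from Java) that cleans a raw Kafka topic once and publishes the result as a reusable stream. The table names, fields, topics, and broker address are illustrative assumptions:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class PaymentsDataProduct {
    public static void main(String[] args) {
        TableEnvironment env = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Expose a hypothetical raw Kafka topic as a table.
        env.executeSql(
            "CREATE TABLE payments_raw (" +
            "  payment_id STRING, amount DECIMAL(18,2), currency STRING, ts TIMESTAMP(3)" +
            ") WITH (" +
            "  'connector' = 'kafka'," +
            "  'topic' = 'payments.raw'," +
            "  'properties.bootstrap.servers' = 'localhost:9092'," +
            "  'scan.startup.mode' = 'earliest-offset'," +
            "  'format' = 'json')");

        // Curated data product: validated and normalized once, reusable by
        // operational consumers and the lakehouse alike.
        env.executeSql(
            "CREATE TABLE payments_product (" +
            "  payment_id STRING, amount DECIMAL(18,2), currency STRING, ts TIMESTAMP(3)" +
            ") WITH (" +
            "  'connector' = 'kafka'," +
            "  'topic' = 'payments.product'," +
            "  'properties.bootstrap.servers' = 'localhost:9092'," +
            "  'format' = 'json')");

        env.executeSql(
            "INSERT INTO payments_product " +
            "SELECT payment_id, amount, UPPER(currency), ts " +
            "FROM payments_raw WHERE amount > 0");
    }
}
```

The same curated topic can then be ingested into Snowflake, Databricks, BigQuery, or an Iceberg table without re-implementing the cleansing logic for each destination.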

Data Streaming in Action at KOR Financial | IT Modernization, Observability, Data Governance

KOR, “a leading trade data repository and reporting service provider,” leverages data streaming technologies to improve transparency, efficiency, and scalability of its regulatory reporting environment. Their implementation approach focuses on ensuring accuracy and identifying discrepancies, which drives diverse business value, from enhanced compliance to customer trust. 
Read the full story and many more in Kai Waehner’s new book, The Ultimate Data Streaming Guide. The download is available for free.
Download Ebook
Available in German, French, and Spanish!

AI Vishing Protection: Counteract Modern Vishing Maneuvers with AI Penetration Test

Recent advancements in voice cloning AI have shown both promise and peril. GoodLabs' vishing penetration test strategy helps financial institutions assess and strengthen their voice biometrics systems against growing AI-generated voice threats. Their AI vishing detector uses a multi-layered approach that includes acoustic analysis, linguistic pattern recognition, and behavioral anomaly detection to identify fake audio in real time.
Read Use Case
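As a purely illustrative sketch of the multi-layered idea, not GoodLabs’ actual detector, combining per-layer risk scores into a single verdict might look like this; the class, weights, and threshold are hypothetical:

```java
public class VishingScorer {
    // Each score is assumed to be a probability in [0, 1] produced by an
    // upstream model: acoustic analysis, linguistic pattern recognition,
    // and behavioral anomaly detection.
    public static double combinedRisk(double acoustic, double linguistic, double behavioral) {
        // Hypothetical weights; a production system would tune these.
        return 0.5 * acoustic + 0.3 * linguistic + 0.2 * behavioral;
    }

    public static void main(String[] args) {
        double risk = combinedRisk(0.92, 0.40, 0.65);
        // Escalate the call for step-up verification above a tuned threshold.
        System.out.println(risk > 0.6 ? "FLAG: likely synthetic voice" : "PASS");
    }
}
```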

Real-Time Insurance Claims Processing with Data Streaming

Real-time data streaming is revolutionizing insurance claims processing by enabling faster, more efficient workflows. By continuously streaming claims data as it’s submitted, insurers can instantly enrich and validate it, speeding up decision-making. As claims move through each step of the process, built-in fraud detection and accurate loss assessments ensure claims are resolved quickly and accurately, enhancing both customer satisfaction and operational efficiency.
Read Blog
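A common streaming building block behind such workflows is routing each claim the moment it arrives. Here is a minimal Kafka Streams sketch using branching; the topic names and the naive fraud heuristic are illustrative assumptions:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Branched;
import org.apache.kafka.streams.kstream.KStream;

public class ClaimsRouting {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "claims-routing");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> claims = builder.stream("claims.submitted");

        // Route each claim as it arrives: a naive heuristic sends suspicious
        // claims to a review topic, everything else to a fast-track topic.
        claims.split()
              .branch((id, claim) -> claim.contains("\"amount_over_policy\":true"),
                      Branched.withConsumer(s -> s.to("claims.review")))
              .defaultBranch(Branched.withConsumer(s -> s.to("claims.fasttrack")));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```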

Developer's Desk: Building Applications

How Apache Iceberg and Flink Can Ease Developer Pain

Although developing stateful applications can be chaotic due to complex upstream and downstream interactions, tools like Apache Iceberg and Apache Flink® help simplify this process. Iceberg optimizes data querying by defining efficient table structures, while Flink enables real-time data processing, improving speed and reliability. Together, they provide a robust framework that enhances both the efficiency and reliability of stateful application development.
Watch Podcast

Shift Processing and Governance Left: A DSP Product Demo

“Shift left” is gaining traction because it addresses the increasing need for high-quality business data. This approach solves common challenges, like redundant datasets, poor data quality, and the high costs associated with data warehouses and data lakes, by cleaning and aggregating data closer to the point of data generation.
Watch Demo

How Producers Work: Kafka Producer and Consumer Internals | Part 1

Apache Kafka is a powerful event streaming platform that efficiently stores and delivers event data with minimal effort from developers, functioning like a black box. However, when issues arise, debugging requires a deeper understanding of Kafka's inner workings and of how producers and consumers actually interact with that black box.
Read Blog
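For readers who want to open the black box, here is a minimal Java producer using the standard Apache Kafka client API, with a callback that surfaces what send() normally hides. The topic name and broker address are placeholders:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Two of the knobs the "black box" hides: durability and batching.
        props.put(ProducerConfig.ACKS_CONFIG, "all");   // wait for all in-sync replicas
        props.put(ProducerConfig.LINGER_MS_CONFIG, 10); // batch up to 10 ms of records

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("payments", "key-1", "{\"amount\": 42}"),
                    (metadata, exception) -> {
                        // The callback reports the assigned partition and
                        // offset, or the reason the send failed.
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.printf("partition=%d offset=%d%n",
                                    metadata.partition(), metadata.offset());
                        }
                    });
        }
    }
}
```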

Key Concepts of a Schema Registry

Schema Registry is a tool that manages and enforces data schemas in the Apache Kafka ecosystem, ensuring data compatibility and quality across applications. Understanding key concepts like schemas as contracts, the role of Schema Registry, schema workflow, and integration with client applications can help developers enhance system reliability and overall data streaming capabilities.
Watch Free Course
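Here is a minimal sketch of the "schema as contract" idea using Confluent's Avro serializer, which checks every record against Schema Registry before it is written. The topic, schema, and URLs are illustrative assumptions:

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        // The Avro serializer registers/validates the schema with Schema
        // Registry before any record is written to the topic.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        // Hypothetical payment schema acting as the data contract.
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Payment\",\"fields\":[" +
                "{\"name\":\"id\",\"type\":\"string\"}," +
                "{\"name\":\"amount\",\"type\":\"double\"}]}");

        GenericRecord payment = new GenericData.Record(schema);
        payment.put("id", "pmt-001");
        payment.put("amount", 42.0);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // A record violating the registered schema's compatibility rules
            // is rejected here, not discovered later by a consumer.
            producer.send(new ProducerRecord<>("payments", payment.get("id").toString(), payment));
        }
    }
}
```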

How to Add a Connector to Confluent Cloud

Kafka Connectors provide a way to get data flowing between external sources and sinks and Confluent Cloud. Choosing between available options, like fully managed connectors for easy setup or custom connectors for specific requirements, ultimately depends on the level of control needed and the goals of the data pipeline.
Watch on YouTube

Learn About Data Streaming With Apache Kafka® and Apache Flink®

Apache Kafka®: a high-throughput, low-latency distributed event streaming platform. Available locally or fully managed via Apache Kafka on Confluent Cloud.

Apache Flink®: high-performance stream processing at any scale. Available via Confluent Cloud for Apache Flink.

Explore Developer Hub

Request Flink Workshop or Tech Talk


Data Streaming for Innovation & Research

Hybrid Cloud Streaming and Modernizing Payments at Lloyds Banking Group

Lloyds Banking Group has been incorporating near real-time data processing with Apache Kafka® since 2015. Now, Kafka is integral to dozens of streaming applications, including critical services like data replication for resilience and push notifications for digital banking users. More recently, the New Payments Architecture (NPA) platform demonstrates how event-driven processing helps achieve more resilient, transparent, and future-proof payments capabilities for customers.
Watch Tech Talk

Join the Community

Sign up for updates below and check out previous issues!

Share content suggestions and new use cases in the Comments Section