
Follow the Stream: Your Dedicated Community Content Scroll | Financial Services Issue 4.0, June 2025

Unlocking New Data Streaming Use Cases!

Welcome to the fourth edition of the Follow the Stream content scroll! This issue highlights real-world applications and the newest advancements in data streaming within financial services. We’ll cover industry trends and 2025 projections, exploring how leaders harness real-time data for governance, fraud prevention and detection, and effective data management.
See Industry Updates

Data Streaming Fundamentals

Data Streaming Crash Course 

Let's explore some foundational concepts and tools in the world of data streaming.

Industry in Motion: General Updates

2025 Data Streaming Report: Moving the Needle on AI Adoption, Speed to Market, & ROI

In the age of AI, real-time, contextual, and trustworthy data is crucial. The 2025 Data Streaming Report reveals how 4,175 IT leaders view data streaming platforms (DSPs) as pivotal for simplifying access to real-time data, accomplishing data-related goals, and enhancing AI adoption and innovation. DSPs deliver significant return on investment (ROI): 44% of IT leaders report 5x returns, and 90% are increasing their investments for faster innovation and greater data value.
Key findings:
  • Data streaming ROI hits new highs.
  • DSPs emerge as a business imperative.
  • DSPs enable AI success.
  • Shifting left maximizes data value.
Download Report
Available in German, French, Spanish, and Italian!

Introducing Apache Kafka® 4.0

Kafka 4.0 is here, and it’s a big leap forward. This release removes Apache ZooKeeper™, making Kafka simpler to run and scale with KRaft mode as the default. It also introduces a faster, more reliable consumer group protocol and early access to queue-style messaging. With cleaner APIs, updated Java requirements, and new features across Kafka Streams and Kafka Connect, the new version is built for the future of data streaming. 

The Data Streaming World Tour 2025 Is Hitting the Road!

Join us for an opportunity to network with peers and ecosystem partners, hear directly from customers, learn from Confluent's team of Apache Kafka® experts, and roll up your sleeves in interactive demonstrations and hands-on labs.
Upcoming Dates:
  • June 10: Rome - RSVP
  • June 17: Cairo - RSVP
  • June 17: Atlanta - RSVP
  • June 18: Barcelona - RSVP
Learn More

Top Trends for Data Streaming With Apache Kafka® and Apache Flink® in 2025

Apache Kafka® and Apache Flink® are leading open-source frameworks for data streaming that serve as the foundation for cloud services, enabling organizations to unlock the potential of real-time data. In recent years, trends have shifted from batch-based data processing to real-time analytics, scalable cloud-native architectures, and improved data governance powered by these technologies. Looking ahead to 2025, the data streaming ecosystem is set to undergo even greater changes, with trends such as the democratization of Kafka, the bring-your-own-cloud (BYOC) deployment model, and data streaming organizations.
Read Blog

Executive's Brief: Data Strategy & Leadership

The Practical Guide to Building Data Products for IT Leaders

The challenge of "analysis paralysis" occurs when organizations are presented with a wide range of data product options. Sorting through complex systems to determine the most valuable data solutions can be time-consuming and costly. But a simple, strategic framework can help quickly identify the most impactful data products, focusing on value rather than just cost. This approach allows for efficient decision-making and prioritization, whether starting a new project or optimizing an existing system. 
Read eBook
Available in German, French, and Spanish!

Bitvavo Banks on Confluent to Boost Its Real-Time Data Streaming and Realize Robust Data Governance

Bitvavo, a leading European crypto exchange, uses Confluent’s managed DSP to securely process real-time trading data, enforce strong governance, and meet regulatory requirements. The transition allowed Bitvavo to deliver market data to customers in milliseconds, scale automatically to handle demand spikes, and improve operational efficiency—enabling reliable, fast growth without infrastructure complexity.
Read Case Study
Available in German, French, and Spanish!

Data Accessibility: The Key to a Data-Driven Business Strategy

Easy access to data is essential for smarter decisions, efficient operations, and innovation. While many organizations struggle with siloed, hard-to-use data, BMW Group overcame these challenges with a strategic approach to data accessibility. Its solution involved creating a central hub for real-time data streaming that connects edge Internet of Things (IoT) data from production plants to cloud applications and services. 
Read Blog

Building a Data-Driven Culture: How to Empower Teams With Insights

Leadership plays a crucial role in fostering a data-driven culture, beyond just providing the right tools. It requires an ongoing commitment to building systems and processes that turn data into actionable insights. Leaders must promote continuous learning, invest in infrastructure, support cultural changes, and reward data-driven decisions. By bridging the gap between technical and business teams, encouraging transparency, and celebrating data-driven successes, leaders create an environment where innovation thrives.
Read Blog

Data Streaming for Cybersecurity & Fraud

5 Steps to Better Fraud Detection and Prevention

Some of the world’s leading financial institutions, powered by Confluent, have discovered that a robust real-time data platform that blends transactional and contextual data at any scale underpins every successful anti-fraud strategy. Inspired by their insights, here are five steps for building a better data architecture for fraud detection:
  1. Connect all the required data.
  2. Process, enrich, and aggregate data in the required time frame.
  3. Analyze both real-time and historical data.
  4. Combine stateless and stateful processing.
  5. Share data insights.
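Step 4's combination of stateless and stateful processing can be sketched in plain Python (the field names and thresholds are hypothetical; a production system would run these rules in Kafka Streams or Flink):

```python
from collections import defaultdict, deque

HIGH_AMOUNT = 10_000        # stateless rule: flag any single large transaction
WINDOW_SECONDS = 60         # stateful rule: look at per-card activity in a window
MAX_TXNS_PER_WINDOW = 5

windows = defaultdict(deque)  # card_id -> recent transaction timestamps

def score(txn):
    """Return a list of fraud signals for one transaction.

    `txn` is a dict with hypothetical fields: card_id, amount, ts (seconds).
    """
    signals = []

    # Stateless check: needs only the current event, no history.
    if txn["amount"] >= HIGH_AMOUNT:
        signals.append("high_amount")

    # Stateful check: needs history, kept per card in a sliding time window.
    window = windows[txn["card_id"]]
    window.append(txn["ts"])
    while window and txn["ts"] - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) > MAX_TXNS_PER_WINDOW:
        signals.append("velocity")

    return signals

txns = [{"card_id": "c1", "amount": 50, "ts": t} for t in range(6)]
txns.append({"card_id": "c1", "amount": 12_000, "ts": 6})
results = [score(t) for t in txns]
print(results[-1])  # last txn trips both the amount and velocity rules
```

The stateless rule could run anywhere, but the velocity rule only works if per-card state survives between events, which is exactly what a stream processor's state store provides.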
Download Datasheet
Available in German, French, Spanish, and Italian!

Fraud Prevention in Under 60 Seconds With Apache Kafka®: How a Bank in Thailand Is Leading the Charge

Krungsri Bank, one of Thailand’s largest banks, uses Kafka-based data streaming to analyze payment transactions in real time and block fraud in under 60 seconds. This event-driven architecture enables immediate detection and response to suspicious activity, ensuring customer safety and trust while also supporting operational agility, compliance, and faster innovation through hybrid cloud integration and advanced analytics.  
Read Blog

Architect's Blueprint: Data Systems Design


Top 10 Use Cases for Getting Started With Data Streaming

Businesses are transforming from legacy, batch-based architectures to modern, real-time DSPs, leading to improved efficiency and decision-making across industries. Featuring 10 innovative use cases for real-time data streaming, this white paper highlights its impact on IT modernization, AI, environmental, social, and governance (ESG), data products, and multicloud synchronization.
Download eBook

Data Streaming in Action at Erste Group Bank | Data Products, IT Modernization, Data Sharing, Data Governance

To address increasing costs related to maintaining mainframe infrastructure, Erste Group Bank developed a strategy to offload workloads to a cloud-native, microservice-based architecture powered by data streaming. The implementation involved extracting data from the mainframe, streaming the data in real time to a cloud-native environment, and processing it with a suite of microservices to ensure seamless integration with existing applications and services. 
Download eBook

Real-Time Anomaly Detection for Card Fraud Prevention

An AI-driven anomaly detection system by Informula and Confluent analyzes transactional data in real time to prevent fraud instantly. Using ML and Confluent’s DSP, it flags suspicious activity and triggers immediate alerts. Continuous learning helps it adapt to new fraud tactics, minimizing losses and protecting accounts.
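The statistical idea behind real-time anomaly detection can be shown in a minimal sketch (this is an illustration of the technique, not Informula's actual model): a rolling z-score flags transaction amounts that sit far from the recent mean.

```python
import math
from collections import deque

class RollingAnomalyDetector:
    """Flag values far from the rolling mean, measured in standard deviations."""

    def __init__(self, window=20, threshold=3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def is_anomaly(self, amount):
        if len(self.values) >= 5:  # need a minimal history before scoring
            mean = sum(self.values) / len(self.values)
            var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
            std = math.sqrt(var)
            if std > 0 and abs(amount - mean) / std > self.threshold:
                self.values.append(amount)
                return True
        self.values.append(amount)
        return False

detector = RollingAnomalyDetector()
stream = [100, 102, 98, 101, 99, 100, 5000]   # last amount is an outlier
flags = [detector.is_anomaly(a) for a in stream]
print(flags)
```

A production system would keep such state per account inside the stream processor and retrain thresholds continuously, which is what "continuous learning" refers to above.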
Read Use Case

3 Strategies for Achieving Data Efficiency in Modern Organizations

In today's digital age, data efficiency is essential for modern organizations to handle the exponential growth of data. Three strategies for achieving this are implementing no-copy and zero ETL solutions to minimize data duplication, shifting left data governance to address data quality issues closer to the source, and reducing data waste by processing data as close to the source as possible.
Read Blog

Data Streaming in Real Life: Data Mesh in Banking

Kai Waehner, Confluent’s Global Field CTO, explains how Raiffeisen Bank International uses a DSP to build a compliant, scalable data mesh across 13 countries. By integrating real-time and historical data, the bank accelerates innovation, reduces time to market for new applications, and supports initiatives like cloud migration and regulatory reporting.
Watch Video

Developer's Desk: Building Applications

A Technical Guide: How to Optimize Data Ingestion into Lakehouses & Warehouses

In the rapidly evolving landscape of data management and processing, the concept of shifting left has emerged as a key strategy for enhancing data operations. Confluent’s guide outlines the principles, technical components, and governance practices essential to implementing a shift-left approach with a DSP. Designed for data and analytics professionals, it offers practical steps for enhancing data reliability and efficiency at scale as well as real-world examples demonstrating streamlined ingestion, enhanced governance, and simplified conversion of Kafka topics into Apache Iceberg™ tables.
Download White Paper

Leveraging Apache Flink® for Data Streaming in Financial Services

Confluent’s managed Flink capabilities are revolutionizing financial services by enabling real-time data streaming for fraud detection, enhanced customer experiences, and large language model (LLM) workflows. This webinar features live demos, interactive sessions with industry experts, and practical examples of how real-time data streaming can drive innovation, efficiency, and competitive advantage in financial services.
Watch Webinar

Processing Without Pause: Continuous Stream Processing and Apache Flink®

The second episode of Confluent’s new podcast “Life Is But a Stream” dives even deeper into the fundamentals of data streaming to explore stream processing—what it is, the best tools and frameworks, and its real-world applications. Anna McDonald, Confluent’s Distinguished Technical Voice of the Customer, and Abhishek Walia, Staff Customer Success Technical Architect at Confluent, break down what stream processing is, how it differs from batch processing, and why tools such as Flink are game changers. 
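The batch-versus-stream distinction the episode covers fits in a few lines of Python: a batch job waits for the full dataset and computes once, while a stream processor updates state incrementally as each event arrives (a toy illustration, not Flink itself):

```python
# Batch: wait for all events, then compute once over the whole dataset.
def batch_average(events):
    return sum(events) / len(events)

# Streaming: update running state per event; a fresh result is available
# after every event, not only when the job finishes.
def stream_averages(events):
    total, count, results = 0.0, 0, []
    for value in events:
        total += value
        count += 1
        results.append(total / count)  # continuously updated result
    return results

events = [4, 8, 6, 2]
print(batch_average(events))     # one answer at the end: 5.0
print(stream_averages(events))   # an answer after every event
```

Frameworks such as Flink generalize this pattern with fault-tolerant state, event-time windows, and exactly-once guarantees.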
Watch Podcast

Building Streaming Data Pipelines, Part 1: Data Exploration With Tableflow

Tableflow enables seamless querying of Kafka topics by syncing them to Iceberg tables, making real-time data exploration easier using standard SQL tools such as Trino and PopSQL. In this example, sensor data from the U.K. Environment Agency is ingested into Kafka, exposed via Tableflow, and analyzed by joining related datasets to build a denormalized view. This process lays the groundwork for building streaming data pipelines with a clear understanding of the underlying data.
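The denormalizing join described above can be sketched with standard SQL; here sqlite3 stands in for a Trino-style engine over the Tableflow-exposed Iceberg tables, and the table and column names are hypothetical:

```python
import sqlite3

# In-memory stand-in for two tables exposed by Tableflow:
# sensor readings and the reference data for their stations.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE readings (station_id TEXT, level REAL, ts TEXT);
    CREATE TABLE stations (station_id TEXT, river TEXT, town TEXT);
    INSERT INTO readings VALUES ('S1', 0.42, '2025-06-01T00:00:00');
    INSERT INTO stations VALUES ('S1', 'Thames', 'Reading');
""")

# Denormalized view: join reading events to station reference data so
# downstream consumers get one flat, queryable record per reading.
rows = conn.execute("""
    SELECT r.ts, s.river, s.town, r.level
    FROM readings r
    JOIN stations s ON s.station_id = r.station_id
""").fetchall()
print(rows)
```

The same SELECT would run unchanged against the Iceberg tables from any SQL tool, which is what makes Tableflow-backed exploration approachable before building the full pipeline.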
Read Blog

Ready to break up with Apache ZooKeeper™? KRaft, you had me at hello.

KRaft introduces a modern architecture for Kafka by replacing ZooKeeper with a built-in consensus protocol, streamlining metadata management and simplifying operations. Confluent’s new webinar features two product demos—initial setup and migration to KRaft—along with a readiness checklist for deploying ZooKeeper-less Kafka in self-managed environments. With production-ready KRaft, organizations gain improved scalability, reduced operational complexity, and enhanced system reliability.
Watch Online Talk
Available in Japanese!

Learn About Data Streaming With Apache Kafka® and Apache Flink®

High-throughput, low-latency distributed event streaming platform. Available locally or fully managed via Apache Kafka on Confluent Cloud.

High-performance stream processing at any scale. Available via Confluent Cloud for Apache Flink.

Explore Developer Hub

Request Flink Workshop or Tech Talk


Data Streaming for Innovation & Research

Modernize Financial Services With Data Streaming

Modern financial institutions often struggle to unlock the full value of real-time data due to fragmented, complex data systems. The key to overcoming this is adopting a DSP that enables secure, compliant, and scalable data movement across legacy and modern environments. This approach empowers organizations to generate faster insights, create reusable data products, and accelerate innovation across the business.
Download eBook

How Moniepoint Transformed Financial Services for Over 2 Million Businesses in Africa

Moniepoint transformed financial services for more than 2 million businesses in Africa by adopting Confluent Cloud to overcome scaling and data management challenges. By replacing its legacy infrastructure with a real-time data streaming architecture, Moniepoint eliminated replication lags, streamlined reporting, and decoupled services to support rapid growth, enhancing customer experience across its financial products.
Read Blog

How Data Streaming With Apache Kafka and Flink Drives the Top 10 Innovations in Finserv

Real-time data streaming with Kafka and Flink is transforming financial services by enabling instant transaction processing, fraud detection, and personalized customer experiences. From embedded finance and AI-powered banking to green fintech and central bank digital currency (CBDC), data streaming powers the top innovations driving efficiency, compliance, and scalability in the industry.
Read Blog

Join the Community

Sign up for updates below and check out previous issues!

Share content suggestions and new use cases in the comments section.