
Follow the Stream: Your Dedicated Community Content Scroll
Financial Services Issue 2.0 | December 2024

The Eventing Issue: Events in Data-Streaming Innovation

Welcome to the second issue of Follow the Stream! This edition is dedicated to Events in Data Streaming. In software, any significant action can be recorded as an event. Event streaming, also known as event stream processing (ESP), processes a continuous flow of data as soon as an event or change happens. This scroll explores how event stream processing works, its major benefits, and how to get started building event-driven architectures.
See Industry Updates

Event Streaming Fundamentals

Events Crash Course

Let's explore the basics behind events in Apache Kafka®.

Starting with terminology, review concepts like event design, event streaming, and event-driven design to understand the role that events play in each of these approaches to using Apache Kafka®.
Then, to develop event-first thinking, consider the benefits of event-based systems and how event-driven programming changes everything; a minimal example of producing an event follows below.
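To make this concrete, here is a minimal sketch of producing a single event with the confluent-kafka Python client. It assumes a broker at localhost:9092, and the topic and field names are purely illustrative, not from the course itself.

```python
import json
from confluent_kafka import Producer

# A minimal sketch of an event: a record of something that happened,
# capturing who it concerns, what occurred, and when.
event = {
    "event_type": "payment.authorized",   # illustrative event name
    "account_id": "acct-123",
    "amount": 42.50,
    "occurred_at": "2024-12-09T10:15:00Z",
}

producer = Producer({"bootstrap.servers": "localhost:9092"})

# Keying by account keeps all events for one account in order on one partition.
producer.produce("payments", key=event["account_id"], value=json.dumps(event))
producer.flush()
```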

Industry in Motion: General Updates

Past, Present and Future of Stream Processing

"Stream processing has existed for decades. However, it really kicks off in the 2020s thanks to the adoption of open source frameworks like Apache Kafka and Apache Flink®. Fully managed cloud services make it easy to configure and deploy stream processing in a cloud-native way; even without the need to write any code.” Confluent’s Field CTO and data streaming expert Kai Waehner explores the past, present and future of stream processing including "how machine learning and GenAI relate to stream processing, and the integration between data streaming and data lakes with Apache Iceberg.”
Read Blog

Tis’ the Season of Streaming 2024!

Tis’ the Season of Streaming, which means it's time to celebrate and innovate! Starting on December 9th, Confluent’s five-day learning event is an expert introduction to building with real-time and event-driven data flows. With 15 speakers and sessions on Kafka, Flink, data mesh, and GenAI, the event offers opportunities to enhance diverse skills, win attendance-based prizes, and support tech education with a $10 donation per registration.
Register Here

Reusable Data Streams Are Rising On C-Level IT Agendas

A reusable data stream is a high-quality, real-time data flow that is filtered, enriched, and optimized for sharing across multiple projects and systems. Executives are increasingly prioritizing reusable data streams for their benefits in agility and cost efficiency, as well as their contributions to cleaner code and greener data practices.
Read Article

Empowering the First Generation of Real-Time AI Startups

As real-time AI continues to transform industries, Confluent is uniquely positioned to help startups harness the potential of data streaming to drive intelligent, automated decisions at scale. The AI Accelerator Program from Confluent for Startups is designed to support the next wave of innovation in artificial intelligence. This highly selective, 10-week virtual program seeks to collaborate with 10 to 15 early-stage AI startups that are building real-time applications utilizing the power of data streaming.

Confluent’s WarpStream Deal Reflects Market Shift in Data Streaming

Confluent's acquisition of WarpStream underscores a significant shift in the data streaming market, as companies increasingly favor "bring your own cloud" (BYOC) solutions to manage cloud infrastructure costs. This approach reflects the industry's broader push toward diversified and scalable data streaming strategies, to meet the evolving demands of high-volume, cost-sensitive environments.
Read Article

Executive's Brief: Data Strategy & Leadership

Conquer Data Mess With Universal Data Products

In today’s fast-paced business environment, real-time data insights are crucial, but traditional batch processing often results in complex, tangled integrations. To resolve this, companies are shifting to a universal data products mindset, which means treating data as a reusable, discoverable asset. A data streaming platform enables businesses to build reliable data products that support real-time experiences efficiently and cost effectively.
Get Ebook
Available in German, French, Spanish, Italian, and Japanese!

Moving Up the Curve: 5 Tips For Enterprise-Wide Data Streaming

The Confluent maturity curve outlines the five stages of Apache Kafka® adoption. It tracks how deeply data streaming is integrated within business operations, which can help identify opportunities and challenges. Here are some tips for advancing through the stages of the maturity curve:
  • Show what Level 4 and Level 5 data streaming platforms look like
  • Make the business case (specifically for Levels 4+)
  • Demonstrate how to implement enterprise-wide data streaming with a Target Operating Model
  • List all stakeholders involved
  • Confirm when to implement—specifically at Level 3
Read Blog

Real-Time or Real Value? Assessing the Benefits of Event Streaming

Event streaming, particularly in the service of event-driven architectures, is not just about reducing latency, but also about enabling new ways of structuring software and teams. It allows companies to optimize data for universal consumption rather than specific query patterns, which supports innovation, improves responsiveness to business needs, and transforms how teams work together.
Read Blog

Award-Winning Banking: How EVO Banco Streamlines Innovation with Event Streaming

EVO Banco, a Spanish digital bank, implemented a real-time data strategy to improve infrastructure scalability and enhance customer experience. They reduced weekly fraud losses by 99%, blocking 500 fraudulent transactions daily, and increased product development speed—earning recognition as the best bank in Spain based on their Net Promoter Score (NPS). Their ongoing focus on leveraging real-time data continues to position EVO Banco as a leader in the banking sector.
Read Case Study
Available in Spanish!

The Data Streaming Organization | Driving Value & Competitive Advantage From Data Streaming

Maintaining high data quality is critical, especially with the rise of AI/ML, where poor data can lead to system failures, inaccurate decisions, and costly outages. By implementing best practices for stream governance—identifying schema issues, establishing data contracts, and adopting decentralized data ownership—organizations ensure the integrity of their data streams and significantly reduce risks.
Read Ebook
Available in German, French, Spanish, and Japanese!

Data Streaming for Cybersecurity & Fraud


How Real-Time Data Saved Citizens Bank $1M Annually With Data Streaming

Citizens Bank, with over $215 billion in assets, modernized its legacy systems using Confluent’s data streaming platform to meet the needs of digital-first customers. This shift to real-time data processing reduced IT costs by 30%, accelerated loan processing by 40%, and lowered operational expenses by 10%, positioning the bank for sustained innovation and increased efficiency.

Faster, Smarter, Real-Time Fraud Protection

Fraud is a significant risk across industries, with billions lost despite heavy investments in advanced tools. The main issue is a lack of timely, contextual data. Companies like Capital One, Instacart, and Bank Rakyat Indonesia address this by processing vast amounts of data in real time, detecting abnormal patterns, and combating fraud as it happens.
Get Solution Brief
Ebook available in German and French!

Billion Dollar Frauds: Unleashing the Power of Confluent’s Streaming AI to Combat Voice Vishing

A recent scam in which a Hong Kong branch manager lost $35 million to a deepfake voice emphasizes the urgent need for better protection against vishing. GoodLabs Studio is combating this by using Confluent and Flink's Streaming AI to analyze real-time audio from Genesys and match voices with established voiceprints. The strategy improves model accuracy through continuous training and enables rapid detection of vishing attempts in an effort to mitigate financial losses.
Watch Session

Architect's Blueprint: Data Systems Design

Supercharging Customer Onboarding with Event-Driven Microservices & Confluent Cloud

A leading financial institution transformed its customer onboarding process using Confluent Cloud, in an effort to address operational inefficiencies. By leveraging event streaming, they achieved real-time processing, integration across departments, and faster, more secure onboarding, ultimately enhancing their customer experience.
Read Blog

How To Migrate From Kafka To Confluent Cloud With Limited Downtime

Migrating from open-source Apache Kafka to Confluent can feel challenging, but success stories from companies like BigCommerce and Cerved highlight it as an effective and streamlined alternative to other deployment options. The migration process must be tailored to each environment but follows three simple phases: planning, setup, and migration/validation.
Watch Overview

Using Data Governance In Financial Services To Deliver Personalized Banking

Capital One leverages data governance and real-time streaming with Confluent Cloud to deliver personalized banking experiences and strengthen security. By centralizing their data management in the cloud, the bank scales efficiently, which enables instant customer insights and rapid fraud detection for 100 million customers.
Read Blog

Unprecedented Trading Efficiency With a FIX Connector

In the trading world, speed is of the essence: even minor delays can result in missed opportunities or financial losses. To tackle these challenges, Synthesis developed a Kafka Pattern FIX (Financial Information eXchange) Connector, which streamlines the integration process with various financial entities. This connector addresses the growing demand for the real-time, high-frequency intraday data that is essential for regulatory compliance and risk management.
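As a rough illustration of the message format involved (a sketch, not the Synthesis connector itself), the example below parses a FIX tag=value message in Python; the sample order fields are illustrative.

```python
# Minimal sketch: splitting a raw FIX tag=value message into a dictionary.
# A production connector also handles sessions, sequence numbers, and checksums.
SOH = "\x01"  # FIX fields are delimited by the SOH control character

def parse_fix(raw: str) -> dict:
    """Return a tag -> value mapping for a single FIX message."""
    return dict(field.split("=", 1) for field in raw.strip(SOH).split(SOH))

# Illustrative new-order-single message (35=D): symbol EURUSD, buy, qty 1,000,000.
sample = SOH.join(["8=FIX.4.4", "35=D", "55=EURUSD", "54=1", "38=1000000"]) + SOH
order = parse_fix(sample)
print(order["55"], order["38"])  # symbol and order quantity
```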
Read Use Case

Developer's Desk: Building Applications

Everything You’ve Ever Wanted To Ask About Event-Driven Architectures

Anna McDonald (the Duchess) and Matthias J. Sax (the Doctor), from Confluent, answer user-submitted questions on all things events and eventing, including Apache Kafka, its ecosystem, and beyond! The discussion highlights why events are a mindset, why the Duchess thinks event immutability is relaxing, and why event streams sometimes need windows.
Watch on YouTube

Shift Left: Preventing and Fixing Bad Data in Event Streams

At a high level, bad data refers to any data that doesn't conform to expected formats or standards. It can creep into data sets in a variety of ways and cause serious issues for all downstream data users. In Apache Kafka®, event streams are built on an immutable log, meaning that once data is written, it cannot be edited or deleted. While this immutability is a core feature, it also introduces unique challenges and requires extra caution when producing to, and managing data in, Kafka.
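One common way to shift validation left is to attach a schema to the producer so malformed records never reach the log. Here is a minimal sketch using the confluent-kafka Python client with Avro support; it assumes a local broker and Schema Registry, and the topic name and schema fields are illustrative.

```python
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Illustrative data contract: records that do not match this schema are
# rejected at the producer, before they reach the immutable log.
schema_str = """
{
  "type": "record",
  "name": "Payment",
  "fields": [
    {"name": "account_id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "currency", "type": "string"}
  ]
}
"""

registry = SchemaRegistryClient({"url": "http://localhost:8081"})
serializer = AvroSerializer(registry, schema_str)
producer = Producer({"bootstrap.servers": "localhost:9092"})

payment = {"account_id": "acct-123", "amount": 42.5, "currency": "EUR"}

# Serialization fails fast on a malformed record, so bad data never gets written.
producer.produce(
    "payments",
    value=serializer(payment, SerializationContext("payments", MessageField.VALUE)),
)
producer.flush()
```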

Building an Event-Driven Architecture? Here Are Five Ways to Get It Done Right

Despite the widespread adoption of Apache Kafka, its integration with event-driven systems continues to present challenges for developers and architects. Some key factors to consider are the importance of schema management, when to use stream processing over bespoke consumers, and how to ensure systems scale elastically for the future.
Watch Online Talk

Let Flink Cook: Mastering Real-Time RAG with Flink

Large language models are advancing rapidly, but to move from prototype to production they need real-time, domain-specific data and strong security. Retrieval-augmented generation (RAG) addresses this by combining live data streams with AI models to produce more accurate and relevant responses. Apache Kafka® and Apache Flink® can be used to implement RAG for various applications, enhancing the performance and reliability of generative AI systems.
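As a rough sketch of the pattern (not a full Kafka-plus-Flink pipeline), the example below keeps a small in-memory store fresh from a hypothetical account-updates topic and injects it into a prompt; the dictionary stands in for a vector store and the prompt string for an LLM call. It assumes the confluent-kafka Python client and a local broker.

```python
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "rag-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["account-updates"])  # hypothetical topic of fresh domain data

knowledge = {}  # account_id -> latest facts; stands in for a vector store

# Keep the "retrieval" side fresh by folding each new event into the store.
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    update = json.loads(msg.value())
    knowledge[update["account_id"]] = update

def build_prompt(question: str, account_id: str) -> str:
    """Augment the user's question with the latest streamed context."""
    context = json.dumps(knowledge.get(account_id, {}))
    return f"Context: {context}\n\nQuestion: {question}"

print(build_prompt("What is this customer's current balance?", "acct-123"))
consumer.close()
```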

Designing Event-Driven Microservices

The journey of transitioning from a monolithic to a microservice-based architecture starts with an exploration of asynchronous events, publish-subscribe frameworks, and patterns such as Strangler Fig and Branch by Abstraction. Using a real-world banking case study, Confluent’s Staff Software Practice Lead Wade Waldron highlights essentials for building systems that are scalable, resilient, and adaptable.
Watch Case Study

Learn About Data Streaming With Apache Kafka® and Apache Flink®

A high-throughput, low-latency distributed event streaming platform. Available locally or fully managed via Apache Kafka on Confluent Cloud.

High-performance stream processing at any scale. Available via Confluent Cloud for Apache Flink.

Explore Developer Hub

Request Flink Workshop or Tech Talk


Data Streaming for Innovation & Research

Curve Opens Up a World of Payment Options with Real-Time, Event-Driven Data Transactions

Curve simplifies payments by unifying multiple cards into one app, enabling users to manage transactions seamlessly. To support this, Curve transitioned from a monolithic system to an event-driven, microservices architecture with Confluent, allowing real-time insights, scalability, and enhanced customer security.
Read Case Study

Modernize Payments Architecture for ISO 20022 Compliance

The payments industry is evolving rapidly, particularly with the introduction of ISO 20022, which mandates a standardized approach to payments data by November 2025. This transition requires significant updates to payment systems and processes, emphasizing the need for real-time transaction capabilities. Data streaming platforms have played a crucial role in facilitating this shift, allowing for the seamless integration and processing of various payment message formats.
Read Blog

Financial Services Reimagined with Apache Kafka®

Perhaps no other industry has been upended by data as profoundly as financial services. Today, customers expect instant, virtual insight into all of their accounts, from banking to investing to payment platforms. To address the challenges of data integration with Kafka, Confluent has invested 5M+ engineering hours in reinventing Apache Kafka® to deliver the cloud-native, enterprise-grade experience that finserv companies need today.
Get Ebook

Join the Community

Sign up for updates below and check out previous issues!

Share content suggestions and new use cases in the Comments Section