
Follow the Stream | Healthcare Issue 2.0 | June 2025
Your Dedicated Community Content Scroll

Unlocking New Data Streaming Use Cases! 

Welcome to the second edition of Follow the Stream! In this issue, we’ll focus on real-world use cases and the latest implementations of data streaming in healthcare. Featuring 2025 industry and data streaming trends, we’ll explore how industry leaders are leveraging real-time data for accelerated pharmaceutical research and development (R&D), advanced clinical analytics, and patient-centered care.
See Industry Updates

Data Streaming Fundamentals

Real-Time Data Crash Course


Let's explore some foundational concepts and tools in the world of data streaming.

Industry in Motion: General Updates

2025 Data Streaming Report: Moving the Needle on AI Adoption, Speed to Market, & ROI 

In the age of artificial intelligence (AI), real-time, contextual, and trustworthy data is crucial. The 2025 Data Streaming Report reveals how 4,175 IT leaders view data streaming platforms (DSPs) as pivotal for simplifying access to real-time data, accomplishing data-related goals, and enhancing AI adoption and innovation. With DSPs offering significant return on investment (ROI) and 44% of IT leaders reporting 5x returns, 90% are increasing their investments for faster innovation and data value gains. 
Key findings:
  • Data streaming ROI hits new highs.
  • DSPs emerge as a business imperative.
  • DSPs enable AI success.
  • Shifting left maximizes data value.
Download Report
Available in German, French, Spanish, and Italian!

Introducing Apache Kafka® 4.0

Kafka 4.0 is here, and it’s a big leap forward. This release removes Apache ZooKeeper™, making Kafka simpler to run and scale with KRaft mode as the default. It also introduces a faster, more reliable consumer group protocol and early access to queue-style messaging. With cleaner APIs, updated Java requirements, and new features across Kafka Streams and Kafka Connect, the new version is built for the future of data streaming.
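For a sense of what ZooKeeper-free operation looks like, here is a minimal single-node KRaft configuration sketch. The property keys come from Kafka's KRaft documentation; the values are illustrative for a local dev setup, not a production deployment:

```properties
# Roles this node plays; in KRaft a node can be broker, controller, or both
process.roles=broker,controller
node.id=1
# Controller quorum (id@host:port); a single voter is fine for local testing
controller.quorum.voters=1@localhost:9093
listeners=PLAINTEXT://localhost:9092,CONTROLLER://localhost:9093
controller.listener.names=CONTROLLER
log.dirs=/tmp/kraft-combined-logs
```

The key difference from a ZooKeeper-era setup is that cluster metadata is managed by the controller quorum itself, so there is no separate ZooKeeper ensemble to provision and operate.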

The Data Streaming World Tour 2025 Is Hitting the Road!

Join us for an opportunity to network with peers and ecosystem partners, hear directly from customers, learn from Confluent's team of Apache Kafka® experts, and roll up your sleeves in interactive demonstrations and hands-on labs.
Upcoming dates:
  • June 10: Rome - RSVP
  • June 17: Cairo - RSVP
  • June 17: Atlanta - RSVP
  • June 18: Barcelona - RSVP
Learn More

Top Trends for Data Streaming With Apache Kafka® and Apache Flink® in 2025

Apache Kafka® and Apache Flink® are leading open source frameworks for data streaming that serve as the foundation for cloud services, enabling organizations to unlock the potential of real-time data. In recent years, trends have shifted from batch-based data processing to real-time analytics, scalable cloud-native architectures, and improved data governance powered by these technologies. Looking ahead to 2025, the data streaming ecosystem is set to undergo even greater changes, featuring trends such as the democratization of Kafka, the Bring Your Own Cloud (BYOC) deployment model, and dedicated data streaming organizations.
Read Blog

Executive's Brief: Data Strategy & Leadership

The Practical Guide to Building Data Products for IT Leaders

The challenge of "analysis paralysis" occurs when organizations are presented with a wide range of data product options. Sorting through complex systems to determine the most valuable data solutions can be time-consuming and costly. But a simple, strategic framework can help quickly identify the most impactful data products, focusing on value rather than just cost. This approach allows for efficient decision-making and prioritization, whether starting a new project or optimizing an existing system. 
Download eBook
Available in German, French, and Spanish!

Why Healthcare Companies Love Data Streaming

Healthcare companies value data streaming because it unifies fragmented data into a real-time resource, enabling clinicians to access up-to-date insights across electronic health records, IoT devices, and operational systems. This integration leads to faster decisions, improved care quality, and greater satisfaction for patients and providers. 
Download eBook

Maximizing Data Value and Innovation Across the Organization

Maintaining high data quality is critical, especially with the rise of AI and machine learning (ML), where poor data can lead to system failures, inaccurate decisions, and costly outages. By implementing best practices for stream governance—identifying schema issues, establishing data contracts, and adopting decentralized data ownership—organizations can ensure the integrity of their data streams and significantly reduce risks.
Download eBook
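To make the data-contract idea concrete, here is a toy Python sketch of a contract check applied before an event enters a stream. The field names, types, and range rule are hypothetical; a real deployment would enforce contracts with a schema registry rather than hand-rolled checks:

```python
# Toy illustration of a "data contract" enforced before an event is
# published to a stream. Fields and rules below are invented for this sketch.
CONTRACT = {
    "patient_id": str,
    "heart_rate": int,
}

def validate_event(event: dict) -> list[str]:
    """Return a list of contract violations; an empty list means valid."""
    errors = []
    for field, expected_type in CONTRACT.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            errors.append(f"wrong type for {field}")
    # A simple semantic rule on top of the structural check
    if isinstance(event.get("heart_rate"), int) and not 20 <= event["heart_rate"] <= 250:
        errors.append("heart_rate out of plausible range")
    return errors

good = {"patient_id": "p-17", "heart_rate": 72}
bad = {"patient_id": "p-17", "heart_rate": "72"}
print(validate_event(good))  # []
print(validate_event(bad))   # ['wrong type for heart_rate']
```

Rejecting (or routing to a dead-letter topic) events that break the contract at the producer side is what keeps downstream consumers from inheriting bad data.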

The Impact of Data Streaming in the Public Sector and Healthcare

Data streaming is revolutionizing healthcare by enabling real-time access to critical information, improving operational efficiency, and enhancing patient experiences. For example, CareFirst BlueCross BlueShield uses event streaming to speed up member enrollment, keep provider directories up to date, and enable real-time analytics for mental health services, leading to faster and higher quality care.
Read Blog

Moving Up the Curve: 5 Tips for Enabling Enterprise-Wide Data Streaming

The Confluent maturity curve outlines the five stages of Kafka adoption. It tracks how deeply data streaming is integrated into business operations, which can help identify opportunities and challenges. Here are some tips for advancing through the stages of the maturity curve:
  • Show what Level 4 and Level 5 data streaming platforms look like.
  • Make the business case (specifically for Levels 4+).
  • Demonstrate how to implement enterprise-wide data streaming with a target operating model.
  • List all stakeholders involved.
  • Confirm when to implement—specifically at Level 3.
Read Blog

Architect's Blueprint: Data Systems Design


Top 10 Use Cases for Getting Started With Data Streaming 

Businesses are transforming from legacy, batch-based architectures to modern, real-time data streaming platforms, leading to improved efficiency and decision-making across industries. Featuring 10 innovative use cases for real-time data streaming, this white paper highlights its impact on IT modernization, AI, environmental, social, and governance (ESG), data products, and multicloud synchronization.
Download eBook

Accelerating Pharmaceutical R&D With Data Streaming and AI/ML-Driven Insights

Data streaming and AI/ML technologies in pharmaceutical R&D enable event-driven workflows and rapid processing of biological images, lab results, and clinical trial data, significantly accelerating drug discovery and enhancing research efficiency. By ensuring consistent, real-time access to high-quality data across multiple systems, these technologies improve decision-making, reduce time to market, foster innovation, and support regulatory compliance while also lowering operational costs. 
Download eBook

Real-Time ECG Measurements Analysis

Streaming raw electrocardiogram (ECG) measurements from heart monitors and wearable devices into Confluent Cloud enables real-time analysis with Apache Flink, allowing healthcare providers to detect cardiac anomalies and critical health events as they happen. This architecture not only delivers immediate clinical insights for patient care but also offers operational visibility, helping identify devices that require servicing or are offline—all at scale and with low latency.
Read Use Case
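As a rough illustration of the windowed analysis described above, the toy Python below flags readings whose rolling-average heart rate leaves a safe band. In the actual architecture this logic would run in Flink over a Kafka topic; the window size and thresholds here are invented:

```python
from collections import deque

def detect_anomalies(bpm_stream, window=5, low=50, high=120):
    """Flag readings whose rolling-window average leaves the [low, high] band.

    Returns (index, rolling_average) pairs for each flagged reading.
    Thresholds and window size are illustrative, not clinical values.
    """
    recent = deque(maxlen=window)  # sliding window of the last `window` readings
    alerts = []
    for t, bpm in enumerate(bpm_stream):
        recent.append(bpm)
        avg = sum(recent) / len(recent)
        if avg < low or avg > high:
            alerts.append((t, round(avg, 1)))
    return alerts

readings = [72, 75, 74, 130, 140, 150, 155, 80]
print(detect_anomalies(readings))  # [(6, 129.8), (7, 131.0)]
```

Averaging over a window rather than alerting on single readings is a common way to suppress one-off sensor glitches while still catching sustained anomalies quickly.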

Data Streaming in Healthcare and Pharma: Use Cases and Insights From Cardinal Health

Cardinal Health leverages Apache Kafka and Confluent Cloud to modernize its legacy systems and enable real-time data streaming across its pharmaceutical and medical divisions. This transformation supports key use cases like supply chain optimization, device management, contract automation, and real-time analytics, enhancing both operational efficiency and customer experience.
Read Blog

Developer's Desk: Building Applications

A Technical Guide: How to Optimize Data Ingestion into Lakehouses and Warehouses

In the rapidly evolving landscape of data management and processing, the concept of shifting left has emerged as a key strategy for enhancing data operations. Confluent’s guide outlines the principles, technical components, and governance practices essential to implementing a shift-left approach with a DSP. Designed for data and analytics professionals, it offers practical steps for enhancing data reliability and efficiency at scale, as well as real-world examples demonstrating streamlined ingestion, enhanced governance, and simplified conversion of Kafka topics into Apache Iceberg™ tables.
Download White Paper

Processing Without Pause: Continuous Stream Processing and Apache Flink®

The second episode of Confluent’s new podcast “Life Is But a Stream” dives even deeper into the fundamentals of data streaming to explore stream processing—what it is, the best tools and frameworks, and its real-world applications. Anna McDonald, Confluent’s Distinguished Technical Voice of the Customer, and Abhishek Walia, Confluent’s Staff Customer Success Technical Architect, break down what stream processing is, how it differs from batch processing, and why tools such as Flink are game changers.
Watch Podcast
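The batch-versus-stream distinction the episode covers can be sketched in a few lines of plain Python: a batch job waits for the full dataset and computes once, while a stream processor keeps running state and emits an updated result per event:

```python
# Toy contrast between batch and stream processing over the same events.
events = [3, 1, 4, 1, 5]

# Batch: wait for all data to land, then compute in one pass
batch_total = sum(events)

# Streaming: maintain running state and emit an updated result per event
running, stream_totals = 0, []
for e in events:
    running += e
    stream_totals.append(running)

print(batch_total)     # 14
print(stream_totals)   # [3, 4, 8, 9, 14]
```

Both arrive at the same final answer; the difference is that the streaming version had a usable (partial) answer after every event, which is the property that makes real-time use cases possible.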

Building Streaming Data Pipelines, Part 1: Data Exploration With Tableflow

Tableflow enables seamless querying of Kafka topics by syncing them to Iceberg tables, making real-time data exploration easier using standard SQL tools such as Trino and PopSQL. In this example, sensor data from the U.K. Environment Agency is ingested into Kafka, exposed via Tableflow, and analyzed by joining related datasets to build a denormalized view. This process lays the groundwork for building streaming data pipelines with a clear understanding of the underlying data.
Read Blog
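To illustrate the kind of denormalizing join the post walks through, here is a toy sketch with SQLite standing in for the Iceberg-plus-Trino stack; the table and column names are invented, not the post's actual schema:

```python
import sqlite3

# SQLite plays the role of the SQL query engine over Tableflow-exposed tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE readings (station_id TEXT, level_m REAL);
CREATE TABLE stations (station_id TEXT, river TEXT);
INSERT INTO readings VALUES ('s1', 1.2), ('s2', 0.8);
INSERT INTO stations VALUES ('s1', 'Thames'), ('s2', 'Severn');
""")

# Denormalized view: each reading enriched with its station metadata
rows = conn.execute("""
    SELECT r.station_id, s.river, r.level_m
    FROM readings r JOIN stations s USING (station_id)
    ORDER BY r.station_id
""").fetchall()
print(rows)  # [('s1', 'Thames', 1.2), ('s2', 'Severn', 0.8)]
```

The point of the exercise is the same as in the post: once topics are queryable as tables, joining a fact stream to its reference data is ordinary SQL.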

NEW Webinar: Getting Started With Apache Kafka® and Real-Time Data Streaming

This session is an in-depth look at how Kafka supports modern DSPs and real-time business transformation. It covers Kafka’s architecture and a range of real-world use cases—from stock trading and fraud detection to transportation—highlighting how organizations are leveraging data streaming to improve agility and foster innovation.
Watch Webinar

What Is Tableflow? Materializing Apache Kafka® Topics as Apache Iceberg™ and Delta Lake Tables With Zero ETL

Tableflow, a new feature in Confluent Cloud, lets you skip ETL entirely and make Kafka topic data available as tables in the data lake.
Tim Berglund, Confluent's Vice President of Developer Relations, walks through how Tableflow works and why it’s a big deal. There are no data-copying or transformation steps: just native open table formats, such as Iceberg and Delta Lake, right on top of Kafka data.
Watch Overview

Learn About Data Streaming With Apache Kafka® and Apache Flink®

Apache Kafka®: A high-throughput, low-latency distributed event streaming platform. Available locally or fully managed via Apache Kafka on Confluent Cloud.

Apache Flink®: High-performance stream processing at any scale. Available via Confluent Cloud for Apache Flink.

Explore Developer Hub

Request Flink Workshop or Tech Talk


Data Streaming for Innovation & Research

Recursion Accelerates Drug Discovery With New Data Pipelines Based on Confluent Cloud

Recursion Pharmaceuticals leveraged Confluent Cloud to build a robust, scalable data streaming system that dramatically accelerated its drug discovery process. This platform enabled the organization to process and analyze vast amounts of biological image data in real time, reducing experiment turnaround from hours to minutes and supporting the rapid advancement of new treatments to clinical trials.
Read Case Study

City of Hope Redefines Predictive Sepsis Detection Using Kafka

City of Hope, one of the leading cancer research and treatment centers in the U.S., developed a real-time, AI-powered sepsis prediction model tailored for bone marrow transplant patients. By leveraging Apache Kafka and Confluent Cloud, it streams live patient data—such as vitals and lab results—directly from electronic health records to its predictive models. This enables accurate, timely clinical interventions and reduces sepsis and intensive care unit escalations.
Read Blog

Care.com Accelerates Time to Market and Improves Customer Experience With Data Streaming 

Care.com migrated to Confluent Cloud to unify its fragmented IT systems into a scalable, agile microservices-based Bravo Platform. This transition enabled rapid feature deployment, improved operational efficiency through reduced DevOps overhead, and enhanced security and scalability. As a result, Care.com accelerated time to market and improved its ability to meet changing customer needs across its global marketplace.
Read Case Study

Join the Community

Sign up for updates below and check out previous issues!

Share content suggestions and new use cases in the comments section.