Follow the Stream: Your Dedicated Community Content Scroll | Healthcare Issue 1.0, February 2025

Welcome to Your Data Streaming Learning Journey!

Welcome to Follow the Stream Healthcare! In this first issue, we invite you to explore the fundamentals of data streaming: what it is and how it empowers businesses. After looking through the crash course, feel free to scroll through the whole issue or jump to role-specific sections using the navigation menu above.
See Industry Updates

Data Streaming Fundamentals

Real-Time Data Crash Course

Explore the basics of Apache Kafka®, Apache Flink®, and the data streaming platform (DSP).

Data streaming continuously collects, processes, and delivers real-time data, enabling systems to react instantly to changes. Powered by Apache Kafka®, it leverages a distributed, fault-tolerant architecture to handle massive data flows with low latency and high scalability.
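The collect-process-deliver loop described above can be sketched with a plain in-memory queue. This is purely illustrative, not Kafka: the queue stands in for a topic, and the patient-temperature readings are invented for the example. A real data streaming platform adds durability, partitioning, and replication on top of this basic pattern.

```python
import queue
import threading

# Illustrative only: an in-memory queue stands in for a Kafka topic.
events = queue.Queue()
processed = []

def producer():
    # Continuously "collects" events (a fixed batch here, for brevity).
    for reading in [98.6, 99.1, 101.3, 98.9]:
        events.put(reading)
    events.put(None)  # sentinel: no more events

def consumer():
    # Reacts to each event as it arrives, instead of waiting for a batch job.
    while True:
        reading = events.get()
        if reading is None:
            break
        if reading > 100.0:  # e.g., flag a fever in a patient telemetry stream
            processed.append(("ALERT", reading))
        else:
            processed.append(("ok", reading))

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```

The point of the sketch is the shape of the system: the consumer acts on each event the moment it arrives, which is what lets streaming architectures react instantly rather than on a batch schedule.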
Check out the links throughout this issue to get an understanding of how real-time data processing powers business productivity and innovation.

Industry in Motion: General Updates

Available in German, French, and Spanish!

3 Data Engineering Trends Riding Apache Kafka®, Apache Flink®, and Apache Iceberg™

The Apache Kafka, Flink, and Iceberg communities are continuously evolving, offering innovative ways for engineers to manage and process data at scale. From re-envisioning microservices with Flink streaming applications to enabling real-time AI model applications, these tools are reshaping data integration. With strong community contributions, especially to Iceberg, data governance and real-time analytics are set to accelerate, revolutionizing how businesses manage their data infrastructure.
Read Article

Past, Present and Future of Stream Processing

“Stream processing has existed for decades. However, it really kicks off in the 2020s thanks to the adoption of open source frameworks like Apache Kafka and Apache Flink®. Fully managed cloud services make it easy to configure and deploy stream processing in a cloud-native way; even without the need to write any code.” Confluent’s Field CTO and data streaming expert Kai Waehner explores the past, present, and future of stream processing, including “how machine learning and GenAI relate to stream processing, and the integration between data streaming and data lakes with Apache Iceberg.”
Read Blog

Ushering in a New Era of Data Streaming: Confluent Recognized as a Challenger in 2024 Gartner® Magic Quadrant™ for Data Integration Tools

Recognized as a Challenger in Gartner’s 2024 Magic Quadrant for Data Integration Tools, Confluent predicts that the future of data integration will focus on universal data products and event-driven flows, with real-time streaming bridging gaps between operational and analytical data. This shift will drive innovation in areas like fraud detection, personalized experiences, and supply chain optimization, making data streaming the standard for faster, more efficient decision-making.
Read Blog

The Data Streaming Landscape 2025

“Data streaming is a new software category. It has grown from niche adoption to becoming a fundamental part of modern data architecture, leveraging open source technologies like Apache Kafka® and Apache Flink®. With real-time data processing transforming industries, the ecosystem of tools, platforms, and cloud services has evolved significantly.” Kai Waehner, Confluent’s Field CTO, “explores the data streaming landscape of 2025, analyzing key players, trends, and market dynamics shaping this space.”
Read Blog

Executive's Brief: Data Strategy & Leadership

Conquer Data Mess With Universal Data Products

In today’s fast-paced business environment, real-time data insights are crucial, but traditional batch processing often results in complex, tangled integrations. To resolve this, companies are shifting to a universal data products mindset: treating data as a reusable, discoverable asset. A data streaming platform enables businesses to build reliable data products that support real-time experiences efficiently and cost-effectively.
Get Ebook
Available in German, French, Spanish, Italian, and Japanese!

Shift Left: Unifying Operations and Analytics with Data Products

The need for high-quality business data is greater than ever, so preventing and mitigating bad data—across the entire business—has become a critical capability. Shifting data processing and governance “left” eliminates duplicate pipelines, reduces the risk and impact of bad data at the source, and leverages high-quality data products for both operational and analytical use cases.
Read Ebook

Data Streaming Report 2024

As businesses learn to do more with less, proving the ROI of tech investments has become even more critical to ensuring value despite limited resources. 84% of IT leaders cite 2x to 10x returns on data streaming investments—and 41% cite an ROI of 5x or more—as it continues to drive significant benefits across a wide range of business functions and industries.
47% of IT leaders in the healthcare industry see a 5x or more ROI on data streaming investments.
Download Report

The Power of Predictive Analytics in Healthcare: Using Generative AI and Confluent

Predictive analytics, powered by GenAI and data streaming, is transforming patient care, resource allocation, and operational efficiency. By analyzing historical and real-time data through machine learning models, predictive analytics helps healthcare providers anticipate patient needs, detect early signs of disease, and optimize treatment plans with greater accuracy. GenAI enhances these predictions by creating diverse scenarios, filling in data gaps, and continuously adapting to new patient data for real-time insights.
Read Blog

Data Streaming in Real Life: Healthcare

Data streaming and event-driven architectures are reshaping the healthcare industry by enabling real-time integration, analysis, and correlation of diverse data sources. These innovations improve processes, enhance patient care, ensure regulatory compliance, and enable new AI-driven use cases like sensor diagnosis and emergency vehicle routing.
Watch Online Talk

Architect's Blueprint: Data Systems Design

The Streaming Data Quality Guidebook

Maintaining high data quality is critical, especially with the rise of AI/ML, where poor data can lead to system failures, inaccurate decisions, and costly outages. By implementing best practices for stream governance—identifying schema issues, establishing data contracts, and adopting decentralized data ownership—organizations ensure the integrity of their data streams and significantly reduce risks.
Download Guidebook
Available in German, French, Spanish, Japanese, and Korean!

From Batch and Lakehouse to Real-Time Data Products with Data Streaming

“The Shift Left Architecture enables a data mesh with real-time data products to unify transactional and analytical workloads with Apache Kafka, Flink, and Iceberg. Consistent information is handled with streaming processing or ingested into Snowflake, Databricks, Google BigQuery, or any other analytics/AI platform to increase flexibility, reduce cost, and enable a data-driven company culture with faster time-to-market building innovative software applications.”
Read Blog
Available in German, French, Spanish, and Italian!

Clinical Examination Training with AI Patient Avatars

GoodLabs, in partnership with Confluent, has developed a virtual training platform that allows medical students and professionals to practice clinical examination skills through interactions with AI-powered patient avatars. This solution addresses the decline in clinical skills by providing access to a range of virtual patients with validated medical histories, benefiting both medical learners and practicing professionals in improving their patient care capabilities.
Read Use Case

How Siemens Healthineers Leverages Data Streaming with Apache Kafka and Flink in Manufacturing and Healthcare

Siemens Healthineers uses data streaming with Apache Kafka® and Flink® to enhance manufacturing operations, particularly in predictive maintenance and machine integration. By streaming real-time IoT data from machines and robots, they reduce downtime, optimize maintenance schedules, and improve production quality. These data-driven insights help ensure efficient manufacturing processes, delivering critical medical equipment on time while boosting overall operational efficiency.
Read Blog

Developer's Desk: Building Applications

Kafka 101: Getting Started with Kafka Streams 

In this course, Sophie Blee-Goldman, Apache Kafka® Committer and Software Engineer, provides an introduction to Kafka Streams. Understanding Kafka Streams begins with Apache Kafka, a distributed, scalable, elastic, and fault-tolerant event-streaming platform.
Kafka Streams is declarative, enabling the expression of what should be done rather than specifying how to do it. For example, filtering all records marked with the color "red" from a topic could be achieved using plain Kafka, but implementing the same functionality with Kafka Streams would require only a third of the code.
Watch Free Course
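The imperative-vs-declarative contrast from the course can be sketched outside of Kafka itself. Kafka Streams is a Java DSL; the snippet below is only a rough Python analogy, with plain dicts standing in for Kafka records and the `filter_red` helper invented for illustration:

```python
# Rough analogy in Python; real Kafka Streams is a Java library.
# Records here are plain dicts standing in for Kafka messages.

def filter_red(records):
    """Declarative style: state *what* to keep (like stream.filter(...)
    in the Kafka Streams DSL), not *how* to poll and loop over a topic."""
    return [rec for rec in records if rec.get("color") == "red"]

def filter_red_imperative(records):
    """Imperative equivalent: what a hand-written consumer loop would do."""
    out = []
    for rec in records:
        if rec.get("color") == "red":
            out.append(rec)
    return out

sample = [{"key": "a", "color": "red"}, {"key": "b", "color": "blue"}]
assert filter_red(sample) == filter_red_imperative(sample)
```

Both functions produce the same result; the declarative version simply expresses the intent directly, which is the code-size and readability win the course attributes to Kafka Streams.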

How Apache Iceberg and Flink Can Ease Developer Pain

Although developing stateful applications can be chaotic due to complex upstream and downstream interactions, tools like Apache Iceberg and Apache Flink® help simplify this process. Iceberg optimizes data querying by defining efficient table structures, while Flink enables real-time data processing, improving speed and reliability. Together, they provide a robust framework that enhances both the efficiency and reliability of stateful application development.
Watch Podcast

Shift Processing and Governance Left: A DSP Product Demo

“Shift left” is gaining traction as a trend by addressing the increasing need for high-quality business data. This approach solves common challenges like redundant datasets, poor data quality, and high costs associated with data warehouses and data lakes by cleaning and aggregating data closer to the point of data generation.
Watch Demo

Building an Event-Driven Architecture? Here Are Five Ways to Get It Done Right

Despite the widespread adoption of Apache Kafka, its integration with event-driven systems continues to present challenges for developers and architects. Some key factors to consider are the importance of schema management, when to use stream processing over bespoke consumers, and how to ensure systems scale elastically for the future.
Watch Online Talk

Free Course: Intro to Flink SQL 

Flink SQL is a standards-compliant SQL engine for processing both batch and streaming data with the scalability, performance, and consistency of Apache Flink®. This is a very expressive API, based on powerful abstractions, that can be used to quickly develop many common use cases.
This video explains the relationship of Flink SQL to the Table and DataStream APIs. Through an extended example, it illustrates the stream/table duality at the heart of Flink SQL.
Watch Free Course
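To give a flavor of what the course covers, here is a small, hypothetical Flink SQL fragment; the table name, columns, and connector are invented for illustration, and the abbreviated WITH clause omits required connector properties:

```sql
-- Hypothetical example: a continuous windowed aggregation over a stream.
CREATE TABLE orders (
  order_id STRING,
  amount   DECIMAL(10, 2),
  ts       TIMESTAMP(3),
  WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
) WITH ('connector' = 'kafka', 'topic' = 'orders', ...);

-- Revenue per one-minute tumbling window, computed as events arrive:
SELECT window_start, window_end, SUM(amount) AS revenue
FROM TABLE(
  TUMBLE(TABLE orders, DESCRIPTOR(ts), INTERVAL '1' MINUTES))
GROUP BY window_start, window_end;
```

The same query text can run over a bounded historical table or an unbounded stream, which is the batch/streaming unification the course highlights.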

Learn About Data Streaming With Apache Kafka® and Apache Flink®

Apache Kafka®: a high-throughput, low-latency distributed event streaming platform. Available locally or fully managed via Apache Kafka on Confluent Cloud.

Apache Flink®: high-performance stream processing at any scale. Available via Confluent Cloud for Apache Flink.

Explore Developer Hub

Request Flink Workshop or Tech Talk


Data Streaming for Innovation & Research

Transforming Health Payer Interoperability with Real-Time Data Streaming

Interoperability in healthcare is crucial for seamless data exchange between patients, providers, and payers. However, many healthcare payers (i.e., healthcare insurers) face challenges with siloed data and outdated communication methods. Real-time data streaming offers a solution by enabling continuous, efficient data flow, helping healthcare payers modernize their systems and improve overall patient outcomes.
Read Blog

Revolutionizing Telemedicine with Data Streaming

As telemedicine becomes more embedded in global healthcare, the complexity of managing secure, real-time data, integrating various systems, and ensuring scalability becomes more apparent. Data streaming addresses these challenges by enabling secure, real-time data processing, integration, and management, allowing telemedicine services to scale efficiently and meet healthcare demands.
Read Blog

Join the Community

Sign up for updates below and check out previous issues!

Share content suggestions and new use cases in the Comments Section