Follow the Stream: Your Dedicated Community Content | Telecommunications Issue 1.0 | February 2025

Welcome to Your Data Streaming Learning Journey!

Welcome to Follow the Stream Telco! In this first issue, we invite you to explore the fundamentals of data streaming: what it is and how it empowers businesses. After looking through the crash course, feel free to scroll through the whole issue or jump to role-specific sections using the navigation menu above.
See Industry Updates

Data Streaming Fundamentals

Real-Time Data Crash Course

Explore the basics of Apache Kafka®, Apache Flink®, and the data streaming platform (DSP).

Data streaming continuously collects, processes, and delivers real-time data, enabling systems to react instantly to changes. Powered by Apache Kafka®, it leverages distributed, fault-tolerant architecture to handle massive data flows with low latency and high scalability.
Read on to learn how real-time data processing powers business productivity and innovation.
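To make the collect-process-deliver loop concrete, here is a minimal, purely illustrative Python sketch (no Kafka involved, and the telemetry records are made up): records arrive one at a time, are processed immediately, and results are delivered downstream the moment they are produced.

```python
def process_stream(events, transform):
    """Continuously collect, process, and deliver records as they arrive."""
    for event in events:           # collect: records arrive one at a time
        result = transform(event)  # process: react to each change immediately
        if result is not None:
            yield result           # deliver: downstream consumers see it at once

# Hypothetical tower telemetry; a real pipeline would read from Kafka topics.
readings = [{"tower": "A", "latency_ms": 12},
            {"tower": "B", "latency_ms": 95},
            {"tower": "A", "latency_ms": 110}]

# Flag high-latency readings the moment they occur.
alerts = list(process_stream(readings, lambda r: r if r["latency_ms"] > 100 else None))
print(alerts)  # [{'tower': 'A', 'latency_ms': 110}]
```

A real deployment replaces the in-memory list with Kafka topics, but the shape of the loop is the same: no batch window, every record handled as it arrives.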

Industry in Motion: General Updates

Available in German, French, and Spanish!

3 Data Engineering Trends Riding Apache Kafka®, Apache Flink®, and Apache Iceberg™

The Apache Kafka, Flink, and Iceberg communities are continuously evolving, offering innovative ways for engineers to manage and process data at scale. From re-envisioning microservices with Flink streaming applications to enabling real-time AI model applications, these tools are reshaping data integration. With strong community contributions, especially to Iceberg, data governance and real-time analytics are set to accelerate, revolutionizing how businesses manage their data infrastructure.
Read Article

Past, Present, and Future of Stream Processing

“Stream processing has existed for decades. However, it really kicks off in the 2020s thanks to the adoption of open source frameworks like Apache Kafka and Apache Flink®. Fully managed cloud services make it easy to configure and deploy stream processing in a cloud-native way; even without the need to write any code.” Confluent’s Field CTO and data streaming expert Kai Waehner explores the past, present, and future of stream processing, including “how machine learning and GenAI relate to stream processing, and the integration between data streaming and data lakes with Apache Iceberg.”
Read Blog

Ushering in a New Era of Data Streaming: Confluent Recognized as a Challenger in the 2024 Gartner® Magic Quadrant™ for Data Integration Tools

Recognized as a Challenger in Gartner’s 2024 Magic Quadrant for Data Integration Tools, Confluent predicts that the future of data integration will focus on universal data products and event-driven flows, with real-time streaming bridging gaps between operational and analytical data. This shift will drive innovation in areas like fraud detection, personalized experiences, and supply chain optimization, making data streaming the standard for faster, more efficient decision-making.
Read Blog

The Data Streaming Landscape 2025

“Data streaming is a new software category. It has grown from niche adoption to becoming a fundamental part of modern data architecture, leveraging open source technologies like Apache Kafka® and Apache Flink®. With real-time data processing transforming industries, the ecosystem of tools, platforms, and cloud services has evolved significantly.” Kai Waehner, Confluent’s Field CTO, “explores the data streaming landscape of 2025, analyzing key players, trends, and market dynamics shaping this space.”
Read Blog

Executive's Brief: Data Strategy & Leadership

Conquer Data Mess With Universal Data Products

In today’s fast-paced business environment, real-time data insights are crucial, but traditional batch processing often results in complex, tangled integrations. To resolve this, companies are shifting to a universal data products mindset, which means treating data as a reusable, discoverable asset. A data streaming platform enables businesses to build reliable data products that support real-time experiences efficiently and cost-effectively.
Get Ebook
Available in German, French, Spanish, Italian, and Japanese!

Shift Left: Unifying Operations and Analytics with Data Products

The need for high-quality business data is greater than ever, so preventing and mitigating bad data—across the entire business—has become a critical capability. Shifting data processing and governance “left” eliminates duplicate pipelines, reduces the risk and impact of bad data at the source, and leverages high-quality data products for both operational and analytical use cases.
Read Ebook

Data Streaming Report 2024

As businesses learn to do more with less, proving the ROI of tech investments has become even more critical to ensuring they achieve value despite limited resources. 84% of IT leaders report a 2x to 10x return on data streaming investments, and 41% report an ROI of 5x or more, as data streaming continues to drive significant benefits across a wide range of business functions and industries.
47% of IT leaders in the healthcare industry see a 5x or more ROI on data streaming investments.
Download Report
Available in German, French, and Portuguese!

How DISH Wireless Built a 5G Network with Cloud-Native Data Streaming

DISH Wireless has built a cloud-native 5G Open RAN network using Confluent Cloud, which allows developers to access and customize network data, facilitating Industry 4.0 use cases such as predictive analytics, smart factory monitoring, and intelligent mobility applications. By leveraging data streaming, DISH Wireless has created a scalable, resilient network that enhances modern app communications and accelerates enterprise innovation.
Read Blog

Data Streaming in Real Life: Telecommunication

Data streaming and event-driven architectures are reshaping the telecommunications industry by powering trends like proactive customer service management, real-time network monitoring, and the integration of 5G infrastructure. These technologies enable real-time data integration across telecom projects, support the development of innovative OTT and mobility services, and drive enterprise architecture advancements in the industry.
Watch Online Talk

Architect's Blueprint: Data Systems Design

The Streaming Data Quality Guidebook

Maintaining high data quality is critical, especially with the rise of AI/ML, where poor data can lead to system failures, inaccurate decisions, and costly outages. By implementing best practices for stream governance—identifying schema issues, establishing data contracts, and adopting decentralized data ownership—organizations ensure the integrity of their data streams and significantly reduce risks.
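As an illustration of the data-contract idea, here is a small Python sketch of enforcing a schema at the stream boundary and routing violations to a dead-letter list. The field names, types, and records are assumptions for the example, not a Confluent API.

```python
# Illustrative data contract: every record must carry these fields with
# these types. A real deployment would use Schema Registry and data contracts.
CONTRACT = {"device_id": str, "latency_ms": int}

def meets_contract(record):
    """True if the record has every contracted field with the expected type."""
    return all(isinstance(record.get(field), expected)
               for field, expected in CONTRACT.items())

def split_stream(records):
    """Route contract violations to a dead-letter list instead of letting
    bad data flow downstream to operational and analytical consumers."""
    valid, dead_letter = [], []
    for record in records:
        (valid if meets_contract(record) else dead_letter).append(record)
    return valid, dead_letter

records = [{"device_id": "t-1", "latency_ms": 42},
           {"device_id": "t-2", "latency_ms": "slow"}]  # wrong type: rejected
valid, dead_letter = split_stream(records)
print(len(valid), len(dead_letter))  # 1 1
```

Catching the bad record here, at the source, is exactly the risk reduction the guidebook describes: downstream systems only ever see records that honor the contract.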
Download Guidebook
Available in German, French, Spanish, Japanese, and Korean!

From Batch and Lakehouse to Real-Time Data Products with Data Streaming

“The Shift Left Architecture enables a data mesh with real-time data products to unify transactional and analytical workloads with Apache Kafka, Flink, and Iceberg. Consistent information is handled with stream processing or ingested into Snowflake, Databricks, Google BigQuery, or any other analytics/AI platform to increase flexibility, reduce cost, and enable a data-driven company culture with faster time-to-market building innovative software applications.”
Read Blog
Available in German, French, Spanish, and Italian!

How BT Group Built a Smart Event Mesh with Confluent

BT Group built a Smart Event Mesh leveraging Confluent's platform to enable scalable, real-time event streaming across its hybrid cloud infrastructure. Starting with the "Common Event Broker" for centralized event streaming, BT expanded to a distributed architecture that simplifies data sharing and governance while enhancing automation, discoverability, and self-service for teams. This strategic transformation supports operational efficiency, improved customer experiences, and innovative use cases like network monitoring and machine learning insights.
Read Blog

Making Predictive Customer Support a Reality for Telcos

Network downtime can cost telcos billions each year, but predictive customer support powered by real-time data can mitigate these losses. By integrating and processing data from devices, towers, and performance metrics, streaming platforms create predictive models to resolve anomalies quickly, reducing downtime and improving service-level agreements. This proactive approach enhances operational efficiency, prevents outages, and boosts customer satisfaction.
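As a toy illustration of the idea (a deliberately simple stand-in for the predictive models telcos actually deploy), a rolling-average check over a latency stream can flag anomalies the moment they arrive. The window size, threshold, and readings below are assumptions for the example.

```python
from collections import deque

def rolling_anomalies(readings, window=3, threshold=2.0):
    """Flag any reading that exceeds `threshold` times the rolling mean of
    the previous `window` readings. Purely illustrative, not a production
    anomaly-detection algorithm."""
    recent = deque(maxlen=window)
    anomalies = []
    for t, value in enumerate(readings):
        if len(recent) == window and value > threshold * (sum(recent) / window):
            anomalies.append((t, value))  # record when and what spiked
        recent.append(value)
    return anomalies

# Hypothetical tower latencies, in milliseconds, arriving as a stream.
latencies = [10, 11, 12, 50, 11, 10]
print(rolling_anomalies(latencies))  # [(3, 50)]
```

In a streaming platform the same logic runs continuously over live device and tower metrics, so the spike is caught, and acted on, before customers notice.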
Read Blog

Developer's Desk: Building Applications

Kafka 101: Getting Started with Kafka Streams 

In this course, Sophie Blee-Goldman, Apache Kafka® Committer and Software Engineer, provides an introduction to Kafka Streams. Understanding Kafka Streams begins with Apache Kafka, a distributed, scalable, elastic, and fault-tolerant event streaming platform.
Kafka Streams is declarative, enabling the expression of what should be done rather than specifying how to do it. For example, filtering all records marked with the color "red" from a topic could be achieved using plain Kafka, but implementing the same functionality with Kafka Streams would require only a third of the code.
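In the spirit of that example, here is a tiny Python sketch of what a declarative filter looks like. This is not the actual Kafka Streams API (which is Java and reads from real topics); the `Stream` class and records are invented for illustration.

```python
# A declarative "filter" in the spirit of the Kafka Streams DSL: you state
# WHAT to keep, not how to poll, deserialize, and forward each record.
class Stream:
    def __init__(self, records):
        self.records = records

    def filter(self, predicate):
        """Return a new stream containing only records matching the predicate."""
        return Stream([r for r in self.records if predicate(r)])

    def to_list(self):
        return list(self.records)

# Hypothetical records; real Kafka Streams would read these from a topic.
records = [{"id": 1, "color": "red"},
           {"id": 2, "color": "blue"},
           {"id": 3, "color": "red"}]

red_only = Stream(records).filter(lambda r: r["color"] == "red").to_list()
print(red_only)
```

The plain-Kafka equivalent would manage a consumer loop, deserialization, and a producer by hand; the one-line `filter` is what the course means by expressing what rather than how.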
Watch Free Course

How Apache Iceberg and Flink Can Ease Developer Pain

Although developing stateful applications can be chaotic due to complex upstream and downstream interactions, tools like Apache Iceberg and Apache Flink® help simplify this process. Iceberg optimizes data querying by defining efficient table structures, while Flink enables real-time data processing, improving speed and reliability. Together, they provide a robust framework that enhances both the efficiency and reliability of stateful application development.
Watch Podcast

Shift Processing and Governance Left: A DSP Product Demo

“Shift left” is gaining traction as it addresses the increasing need for high-quality business data. This approach solves common challenges like redundant datasets, poor data quality, and high costs associated with data warehouses and data lakes by cleaning and aggregating data closer to the point of data generation.
Watch Demo

Building an Event-Driven Architecture? Here Are Five Ways to Get It Done Right

Despite the widespread adoption of Apache Kafka, its integration with event-driven systems continues to present challenges for developers and architects. Some key factors to consider are the importance of schema management, when to use stream processing over bespoke consumers, and how to ensure systems scale elastically for the future.
Watch Online Talk

Free Course: Intro to Flink SQL 

Flink SQL is a standards-compliant SQL engine for processing both batch and streaming data with the scalability, performance, and consistency of Apache Flink®. This is a very expressive API, based on powerful abstractions, that can be used to quickly develop many common use cases.
This video explains the relationship of Flink SQL to the Table and DataStream APIs. Through an extended example, it illustrates the stream/table duality at the heart of Flink SQL.
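The stream/table duality the course describes can be sketched in a few lines of Python: a table is what you get by replaying a stream of updates, and the stream is the table's changelog. The changelog contents below are illustrative, not Flink syntax.

```python
def materialize(changelog):
    """Stream/table duality: replaying a changelog of (key, value) updates
    yields the table's current state. The table is a view of the stream."""
    table = {}
    for key, value in changelog:
        if value is None:
            table.pop(key, None)  # a None value acts as a deletion (tombstone)
        else:
            table[key] = value    # later updates overwrite earlier ones
    return table

# Hypothetical changelog of per-user page counts.
changelog = [("alice", 1), ("bob", 1), ("alice", 2), ("bob", None)]
print(materialize(changelog))  # {'alice': 2}
```

A Flink SQL query over a stream maintains exactly this kind of continuously updated table, which is why the same query text can serve both batch and streaming inputs.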
Watch Free Course

Learn About Data Streaming With Apache Kafka® and Apache Flink®

Apache Kafka®: a high-throughput, low-latency distributed event streaming platform. Available locally or fully managed via Apache Kafka on Confluent Cloud.

Apache Flink®: high-performance stream processing at any scale. Available via Confluent Cloud for Apache Flink.

Explore Developer Hub

Request Flink Workshop or Tech Talk


Data Streaming for Innovation & Research

How Intrado Transformed Conferencing with an Event-Driven Architecture 

Intrado transformed its global communication conferencing solution by shifting from a monolithic legacy architecture to an event-driven design powered by Confluent. By adopting a universal data pipeline as a single source of truth, Intrado achieved real-time data processing, enabling faster innovation, personalized customer experiences, and reduced infrastructure costs. This modernized architecture not only met the real-time demands of their business but also positioned Intrado to scale and support future initiatives seamlessly.
Watch Online Talk

Join the Community

Sign up for updates below and check out previous issues!

Share content suggestions and new use cases in the Comments Section