
Follow the Stream Telecom, Issue 2.0 (May 2025): Your Dedicated Community Content Scroll

Unlocking New Data Streaming Use Cases!

Welcome to the second issue of Follow the Stream Telecom! In this edition, we continue our journey into how organizations harness the power of real-time data to foster innovation. Featuring real-world use cases and the latest applications of data streaming in telecom and media, we'll explore how industry leaders are leveraging real-time data for over-the-top (OTT) services, next-generation connectivity, and operational efficiency.
See Industry Updates

Data Streaming Fundamentals

Real-Time Data Crash Course

Explore the basics of Apache Kafka, Apache Flink, and data streaming platforms (DSPs).

Data streaming continuously collects, processes, and delivers real-time data, enabling systems to react instantly to changes. Powered by Kafka, it leverages distributed, fault-tolerant architecture to handle massive data flows with low latency and high scalability.
Check out the links below to get an understanding of how real-time data processing powers business productivity and innovation.
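To make the "react instantly to changes" idea concrete, here is a minimal sketch in plain Python (no Kafka required): a generator stands in for a Kafka topic, and a consumer computes a rolling average the moment each event arrives rather than waiting for a batch. All names and values are illustrative.

```python
from collections import deque

def event_stream():
    """Stand-in for a Kafka topic: events arrive one at a time."""
    for reading in [3, 7, 2, 9, 4]:
        yield {"sensor": "s1", "value": reading}

def rolling_average(events, window=3):
    """React to each event as it arrives, keeping only a small window of state."""
    buf = deque(maxlen=window)
    for event in events:
        buf.append(event["value"])
        yield sum(buf) / len(buf)

# Each average is emitted as soon as its event arrives, not at end of batch.
averages = list(rolling_average(event_stream()))
```

A real pipeline would replace the generator with a Kafka consumer and run continuously; the per-event processing loop is the part that carries over.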

Industry in Motion: General Updates

2025 Data Streaming Report: Moving the Needle on AI Adoption, Speed to Market, & ROI 

In the age of AI, real-time, contextual, and trustworthy data is crucial. The 2025 Data Streaming Report reveals how 4,175 IT leaders view data streaming platforms (DSPs) as pivotal for simplifying access to real-time data, accomplishing data-related goals, and enhancing AI adoption and innovation. DSPs deliver significant return on investment (ROI): 44% of IT leaders report 5x returns, and 90% are increasing their investments to accelerate innovation and data value gains.
Key findings:
  • Data streaming ROI hits new highs 
  • DSPs emerge as business imperative
  • DSPs enable AI success 
  • Shifting left maximizes data value
Download Report
Available in German, French, Spanish, and Italian!

How Data Streaming and AI Help Telcos to Innovate: Top 5 Trends From MWC 2025

"As the telecom and tech industries rapidly evolve, real-time data streaming is emerging as the backbone of digital transformation. For Mobile World Congress (MWC) 2025, McKinsey outlined five key trends defining the future: IT excellence, sustainability, 6G, generative AI, and AI-driven software development. Kai Waehner, Confluent’s Global Field CTO, explores how data streaming powers each of these trends, enabling real-time observability, AI-driven automation, energy efficiency, ultra-low latency networks, and faster software innovation."
Read Blog

Top Trends for Data Streaming With Apache Kafka® and Apache Flink® in 2025

"Apache Kafka and Apache Flink are leading open-source frameworks for data streaming that serve as the foundation for cloud services, enabling organizations to unlock the potential of real-time data. Over recent years, trends have shifted from batch-based data processing to real-time analytics, scalable cloud-native architectures, and improved data governance powered by these technologies. Looking ahead to 2025, the data streaming ecosystem is set to undergo even greater changes, featuring trends like Democratization of Kafka, BYOC Deployment Model, and Data Streaming Organizations."  
Read Blog

The Data Streaming World Tour 2025 Is Hitting the Road!

Join us for an opportunity to network with peers and ecosystem partners, hear directly from customers, learn from Confluent's team of Kafka experts, and roll up your sleeves in interactive demonstrations and hands-on labs.
Upcoming Dates:
  • June 4: Calgary - RSVP
  • June 5: Tokyo - RSVP
  • June 5: Detroit - RSVP
  • June 10: Rome - RSVP
  • June 17: Cairo - RSVP
  • June 17: Atlanta - RSVP
  • June 18: Barcelona - RSVP
Learn More

Executive's Brief: Data Strategy & Leadership

Why Telecom and Media Companies Love Data Streaming

Telecom and media companies rely on data streaming to turn constantly evolving data into a real-time asset, enabling faster decisions and seamless operations across operations support systems (OSS), business support systems (BSS), and IoT networks. By replacing siloed, batch-based systems with unified, event-driven architecture, data streaming enhances automation, predictive analytics, and AI-driven optimization. This continuous data flow improves efficiency, agility, and the ability to deliver innovative, personalized services.
Download eBook
Available in German, French, and Spanish!

The Practical Guide to Building Data Products for IT Leaders

The challenge of "analysis paralysis" occurs when organizations are presented with a wide range of data product options. Sorting through complex systems to determine the most valuable data solutions can be time-consuming and costly. But a simple, strategic framework can help quickly identify the most impactful data products, focusing on value rather than just cost. This approach allows for efficient decision-making and prioritization, whether starting a new project or optimizing an existing system. 
Read eBook
Available in German, French, and Spanish!

Globe Group Slashes Infra Costs and Fuels Personalized Marketing With Data Streaming

The Globe Group transitioned from batch processing to an event-driven architecture with Confluent to enable real-time data streaming, improve campaign targeting, and enhance data governance. By adopting Confluent Cloud on AWS, the company reduced infrastructure and operational costs while empowering its data engineering team to focus on innovation and building data products. With future plans to implement a data mesh architecture, Globe aims to scale its real-time capabilities and democratize data access across the organization.
Read Case Study

Moving Up the Curve: 5 Tips for Enabling Enterprise-Wide Data Streaming

The Confluent maturity curve outlines the five stages of Kafka adoption. It tracks how deeply data streaming is integrated into business operations, which can help identify opportunities and challenges. Here are some tips for advancing through the stages of the maturity curve:
  • Show what Level 4 and Level 5 data streaming platforms look like.
  • Make the business case (specifically for Levels 4+).
  • Demonstrate how to implement enterprise-wide data streaming with a target operating model.
  • List all stakeholders involved.
  • Confirm when to implement—specifically at Level 3.
Read Blog

Maximizing Data Value and Innovation Across the Organization

Maintaining high data quality is critical, especially with the rise of AI and machine learning, where poor data can lead to system failures, inaccurate decisions, and costly outages. By implementing best practices for stream governance—identifying schema issues, establishing data contracts, and adopting decentralized data ownership—organizations can ensure the integrity of their data streams and significantly reduce risks.
Read White Paper

New Webinar! Unlocking Data Value: A Proven Framework to Implement Data Products 

For more than a decade, data products have sought to simplify complex data landscapes. Confluent’s webinar breaks down critical concepts of analytical and operational data products to provide practical guidance for implementing a unified data product strategy, enhancing interactions with data across any use case.
Watch Webinar

Architect's Blueprint: Data Systems Design


Top 3 Data Streaming Use Cases for Telecom and Media

Data streaming is transforming the telecom and media industries by enabling real-time, data-driven experiences that improve performance, engagement, and revenue. In telecom, network monitoring and alerting allows companies to proactively detect issues, reduce downtime, and ensure high availability—resulting in better service quality and customer satisfaction. In media, personalized OTT content recommendations use real-time analytics to drive user engagement and loyalty, while targeted advertising leverages behavioral data to deliver relevant ads, increasing campaign effectiveness and ad revenue.
Read eBook
Available in German, French, and Spanish!
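The network monitoring and alerting pattern above can be sketched in a few lines: evaluate each metric event against a threshold as it streams in, so degradation triggers an alert immediately instead of surfacing in a batch report. This is a plain-Python illustration; the cell IDs, field names, and threshold are made up, and a production version would consume from a Kafka topic.

```python
def detect_outages(metrics, threshold=0.95):
    """Emit an alert the moment availability drops below the threshold."""
    alerts = []
    for m in metrics:  # in production, this loop would poll a Kafka consumer
        if m["availability"] < threshold:
            alerts.append(f"ALERT cell={m['cell_id']} availability={m['availability']:.2f}")
    return alerts

stream = [
    {"cell_id": "A1", "availability": 0.99},
    {"cell_id": "B2", "availability": 0.91},  # degraded: triggers an alert
    {"cell_id": "C3", "availability": 0.97},
]
alerts = detect_outages(stream)
```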

Real-Time Data Sharing in the Telecom Industry for MVNO Growth and Beyond With Data Streaming

"The telecommunications industry is transforming rapidly as Telcos expand partnerships with MVNOs, IoT platforms, and enterprise customers. Traditional batch-driven architectures can no longer meet the demands for real-time, secure, and flexible data access. Kai Waehner, Confluent’s Global Field CTO, explores how real-time data streaming technologies like Apache Kafka and Flink, combined with hybrid cloud architectures, enable Telcos to build trusted, scalable data ecosystems. It covers the key components of a modern data sharing platform, critical use cases across the Telco value chain, and how policy-driven governance and tailored data products drive new business opportunities, operational excellence, and regulatory compliance."
Read Blog

Real-Time Field Service Optimization

A leading telco modernized its operations by replacing manual, batch-based workflows with a real-time, event-driven architecture powered by Confluent Cloud. This shift enabled seamless communication with third-party service providers, streamlined field service management, and improved customer experiences through faster task assignment and response times.
Read Blog

Unlocking Real-Time Insights in the Network With Steel Thread Networking IaC

The Steel Thread Networking Accelerator, powered by Confluent Cloud and Flink, enables telcos to detect and respond to network issues in real time using streaming analytics. By replacing delayed operational reports with instant, actionable insights, it improves service quality and reduces time to resolution. With reusable infrastructure as code (IaC), it also simplifies deployment, helping networking enterprises quickly adopt scalable stream computing solutions.
Read Use Case

Developer's Desk: Building Applications

A Technical Guide: How to Optimize Data Ingestion into Lakehouses & Warehouses

In the rapidly evolving landscape of data management and processing, the concept of shifting left has emerged as a key strategy for enhancing data operations. Confluent's guide outlines the principles, technical components, and governance practices essential to implementing a shift-left approach with a DSP. Designed for data and analytics professionals, it offers practical steps for enhancing data reliability and efficiency at scale, as well as real-world examples demonstrating streamlined ingestion, enhanced governance, and simplified conversion of Kafka topics into Apache Iceberg™ tables.
Download White Paper

New Webinar! Getting Started With Apache Kafka® and Real-Time Data Streaming

This session is an in-depth look at how Kafka supports modern data streaming platforms and real-time business transformation. It covers Kafka’s architecture and a range of real-world use cases—from data integration to real-time analytics—highlighting how organizations are leveraging data streaming to improve agility and foster innovation.
Watch Webinar

Processing Without Pause: Continuous Stream Processing and Apache Flink®

The second episode of Confluent's new podcast "Life Is But a Stream" dives deeper into the fundamentals of data streaming to explore stream processing: the best tools and frameworks, and their real-world applications. Anna McDonald, Confluent's Distinguished Technical Voice of the Customer, and Abhishek Walia, Staff Customer Success Technical Architect at Confluent, break down what stream processing is, how it differs from batch processing, and why tools like Flink are game changers.
Watch Podcast
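A core primitive behind the stream-versus-batch distinction the episode covers is the tumbling window: fixed, non-overlapping time buckets that let a processor aggregate an unbounded stream. The toy sketch below groups timestamped events into 10-second windows in plain Python; a real Flink job would add event-time semantics and watermarks, which this deliberately omits.

```python
from collections import defaultdict

def tumbling_counts(events, window_seconds=10):
    """Count events per fixed, non-overlapping time window.

    `events` is an iterable of (timestamp_seconds, payload) pairs.
    Returns {window_start: count}.
    """
    counts = defaultdict(int)
    for ts, _payload in events:
        # Align each event to the start of its window, e.g. ts=12 -> window 10.
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

events = [(1, "a"), (4, "b"), (12, "c"), (15, "d"), (27, "e")]
windowed = tumbling_counts(events)
```

Batch processing would compute one total over the whole dataset; windowing is what lets a streaming job emit partial results continuously as each window closes.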

Building Streaming Data Pipelines, Part 1: Data Exploration With Tableflow

Tableflow enables seamless querying of Kafka topics by syncing them to Iceberg tables, making real-time data exploration easier using standard SQL tools such as Trino and PopSQL. In this example, sensor data from the U.K. Environment Agency is ingested into Kafka, exposed via Tableflow, and analyzed by joining related datasets to build a denormalized view. This process lays the groundwork for building streaming data pipelines with a clear understanding of the underlying data.
Read Blog
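The denormalization step the blog describes (joining sensor readings to reference data to build one wide view) is, in the actual walkthrough, a SQL JOIN over Iceberg tables via Trino. The sketch below mimics that join in plain Python with invented station IDs and fields, purely to show the shape of the result; it is not the Tableflow workflow itself.

```python
# Hypothetical reference data and readings, standing in for the two
# Kafka topics that Tableflow would expose as Iceberg tables.
stations = {
    "st-100": {"river": "Thames", "town": "Reading"},
    "st-200": {"river": "Severn", "town": "Gloucester"},
}

readings = [
    {"station_id": "st-100", "level_m": 1.42},
    {"station_id": "st-200", "level_m": 0.88},
]

# Join each reading to its station metadata: the same denormalized row
# shape a SQL JOIN over the two tables would return.
denormalized = [{**r, **stations[r["station_id"]]} for r in readings]
```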

Ready to Break Up With ZooKeeper? KRaft, You Had Me at Hello

KRaft introduces a modern architecture for Kafka by replacing ZooKeeper with a built-in consensus protocol, streamlining metadata management and simplifying operations. Confluent’s new webinar features two product demos—initial setup and migration to KRaft—along with a readiness checklist for deploying ZooKeeper-less Kafka in self-managed environments. With production-ready KRaft, organizations gain improved scalability, reduced operational complexity, and enhanced system reliability.
Watch Online Talk
Available in Japanese!
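For orientation, a single-node KRaft broker boils down to a handful of `server.properties` settings; the node then acts as its own controller, with no ZooKeeper ensemble to run. The values below are a minimal illustrative sketch (IDs, ports, and paths are placeholders to adapt), not a production configuration.

```properties
# This node serves as both broker and Raft controller.
process.roles=broker,controller
node.id=1
# Controller quorum: node.id@host:controller-port for each voter.
controller.quorum.voters=1@localhost:9093
listeners=PLAINTEXT://localhost:9092,CONTROLLER://localhost:9093
controller.listener.names=CONTROLLER
log.dirs=/tmp/kraft-combined-logs
```

Note that KRaft storage must be formatted with a cluster ID before first start; the webinar's setup demo walks through that step.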

Learn About Data Streaming With Apache Kafka® and Apache Flink®

Apache Kafka: a high-throughput, low-latency distributed event streaming platform. Available locally or fully managed via Apache Kafka on Confluent Cloud.

Apache Flink: high-performance stream processing at any scale. Available via Confluent Cloud for Apache Flink.

Explore Developer Hub

Request Flink Workshop or Tech Talk

Join the Community

Sign up for updates below and check out previous issues!

Share content suggestions and new use cases in the Comments Section