Event-Driven Microservices with Python and Apache Kafka

Confluent

A deep dive into how microservices work, why they are the backbone of real-time applications, and how to build event-driven microservice applications with Python and Kafka.

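As a taste of what the article covers, here is a minimal sketch of an event-driven exchange using the confluent-kafka Python client; the broker address, topic name, and payload are illustrative assumptions rather than code from the article.

```python
# Minimal event-driven exchange: one service emits an event, another reacts.
# Assumes a broker at localhost:9092 and the confluent-kafka package.
import json

from confluent_kafka import Consumer, Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce(
    "order-events",
    value=json.dumps({"order_id": 42, "status": "created"}).encode("utf-8"),
)
producer.flush()  # Block until the event is delivered.

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "fulfillment-service",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["order-events"])

msg = consumer.poll(10.0)
if msg is not None and msg.error() is None:
    event = json.loads(msg.value())
    print(f"Reacting to event: {event}")
consumer.close()
```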

Putting Apache Kafka To Use: A Practical Guide to Building an Event Streaming Platform (Part 1)

Confluent

The first part of a practical guide on streaming data and Apache Kafka, covering how to build an event streaming platform.

Revolutionizing Real-Time Streaming Processing: 4 Trillion Events Daily at LinkedIn

LinkedIn Engineering

Authors: Bingfeng Xia and Xinyu Liu. At LinkedIn, Apache Beam plays a pivotal role in stream processing infrastructures that process over 4 trillion events daily through more than 3,000 pipelines across multiple production data centers.

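For readers new to Beam's Python SDK, a pipeline of this kind boils down to the pattern below; this is a minimal local sketch with illustrative data and window size, not code from LinkedIn's pipelines.

```python
# A windowed count of events per type, runnable locally with the DirectRunner.
# The in-memory events and 60-second windows are illustrative only.
import apache_beam as beam
from apache_beam.transforms.window import FixedWindows, TimestampedValue

events = [("page_view", 10), ("click", 12), ("page_view", 75)]  # (type, event time)

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "CreateEvents" >> beam.Create(events)
        # Attach event-time timestamps; a production pipeline would read an
        # unbounded source such as Kafka instead of an in-memory list.
        | "AddTimestamps" >> beam.Map(lambda e: TimestampedValue((e[0], 1), e[1]))
        | "Window" >> beam.WindowInto(FixedWindows(60))
        | "CountPerType" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```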

Putting Apache Kafka To Use: A Practical Guide to Building an Event Streaming Platform (Part 2)

Confluent

This is the second part of our guide on streaming data and Apache Kafka, with specific advice on how to build an event streaming platform in your organization.

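One early, concrete step in such a build-out is provisioning topics. Below is a minimal sketch using the confluent-kafka AdminClient; the topic name, partition count, and replication factor are assumptions for illustration.

```python
# Provision a topic for an event streaming platform.
# Assumes a broker at localhost:9092 and the confluent-kafka package.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})

# create_topics() is asynchronous and returns a dict of topic -> future.
futures = admin.create_topics(
    [NewTopic("orders", num_partitions=6, replication_factor=3)]
)

for topic, future in futures.items():
    try:
        future.result()  # Raises on failure (e.g. topic already exists).
        print(f"Created topic {topic}")
    except Exception as exc:
        print(f"Failed to create topic {topic}: {exc}")
```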

Building A Real Time Event Data Warehouse For Sentry

Data Engineering Podcast

Summary: The team at Sentry has built a platform for anyone in the world to send software errors and events. Among the questions explored: what did the previous system look like?

Streaming Data Pipelines: What Are They and How to Build One

Precisely

This article explores what streaming data pipelines are, how they work, and how to build this data pipeline architecture. Streaming data pipelines offer an architecture capable of handling large volumes of data, accommodating millions of events in near real time.

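To make the shape of such a pipeline concrete, here is a minimal consume-transform-produce sketch with the confluent-kafka Python client; the broker address, topic names, and the enrichment step are illustrative assumptions.

```python
# One stage of a streaming data pipeline: read raw events, enrich them,
# and write them to a downstream topic.
import json

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "pipeline-enricher",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["raw-events"])
producer = Producer({"bootstrap.servers": "localhost:9092"})

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        event["processed"] = True  # Stand-in for a real transformation.
        producer.produce("enriched-events", json.dumps(event).encode("utf-8"))
        producer.poll(0)  # Serve delivery callbacks without blocking.
except KeyboardInterrupt:
    pass
finally:
    producer.flush()
    consumer.close()
```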

Reducing The Barrier To Entry For Building Stream Processing Applications With Decodable

Data Engineering Podcast

Summary: Building streaming applications has gotten substantially easier over the past several years. How can you get the best results for your use case?
