With Kafka, brokers are simple data pipes with no understanding of what data traverses them and no guarantees of data quality. Not Bufstream. With its Broker-side Schema Awareness, Bufstream eliminates bad data at the source. Consumers can trust that the data they receive will always match the appropriate schema and adhere to any semantic constraints.
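To make the idea concrete, here is a minimal conceptual sketch of broker-side validation. This is not Bufstream's actual API; the record shape, field names, and constraint (a stand-in for a real constraint language such as protovalidate rules) are illustrative assumptions. The point is that records failing structural or semantic checks are rejected before they are ever written to a topic.

```python
# Conceptual sketch only — not Bufstream's real implementation or API.
# A schema-aware broker rejects invalid records at produce time, so
# consumers never see data that violates the schema or its constraints.

def validate(record: dict) -> list[str]:
    """Return a list of violations; an empty list means the record is accepted."""
    errors = []
    # Structural check: required fields must be present.
    for field in ("user_id", "email"):
        if field not in record:
            errors.append(f"missing required field: {field}")
    # Semantic constraint: a crude stand-in for a declarative rule
    # (e.g. "email must be a well-formed address").
    email = record.get("email", "")
    if email and "@" not in email:
        errors.append("email: must be a valid address")
    return errors

accepted = validate({"user_id": "42", "email": "a@example.com"})  # []
rejected = validate({"user_id": "42", "email": "not-an-email"})   # one violation
```

In a real deployment the constraints live alongside the schema itself, so every producer and consumer shares a single source of truth for what "valid" means.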
Join our interactive workshop for a deep dive into the technical underpinnings of Bufstream and the capabilities that become available when you understand the shape of your streaming data.
Our technical staff will share how we architected our schema-aware streaming data platform, walk through example use cases, and offer best practices for managing your deployment. We will stop to answer your questions throughout the session.
We’ll cover:
- The Bufstream architecture
- How Bufstream enforces quality controls for all topics
- Automatic enveloping and semantic validation
- Tips & tricks on using Bufstream
- Q&A throughout
Presenters:

Joe Rinehart
