Protobuf-first data infrastructure

Ironclad data quality in Kafka

Attach schemas to Kafka topics, ensure backward-compatible schema evolution, and eliminate runtime registration errors, all while maintaining compatibility with the Confluent ecosystem.

Protobuf First

Integrate deeply with your existing schemas

Manage message-to-topic mappings right in your schemas, rather than in an opaque runtime registry or a generic infrastructure-as-code system.

Because it's powered by a complete Protobuf compiler, the BSR handles imports automatically and works correctly with both the binary and JSON encodings.
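As a sketch of what this looks like, a topic mapping can live directly on the message definition via the BSR's Confluent extension options (the package, subject, and field names below are illustrative):

```protobuf
syntax = "proto3";

package demo.orders.v1;

import "buf/confluent/v1/extensions.proto";

// Associates this message with the "orders-value" subject in a
// BSR-managed Confluent schema registry instance.
message OrderCreated {
  option (buf.confluent.v1.subject) = {
    instance_name: "default"
    name: "orders-value"
  };

  string order_id = 1;
  int64 amount_cents = 2;
}
```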

Learn how to create Kafka topics with Buf
No Migrations

Fully replace the Confluent Schema Registry

Get first-class Protobuf support without an expensive migration.

The BSR implements the same API as the Confluent Schema Registry, so it works with most Kafka producers and consumers, downstream systems like ksqlDB and Kafka Connect, and management tools like AKHQ.
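In practice, pointing an existing client at the BSR is a configuration change rather than a code change. A sketch of a consumer's properties, assuming a Confluent Protobuf deserializer (the registry URL and credentials are placeholders; the exact endpoint depends on your BSR instance):

```properties
bootstrap.servers=broker-1:9092
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=io.confluent.kafka.serializers.protobuf.KafkaProtobufDeserializer
# Point the stock Confluent client at the BSR's Confluent-compatible endpoint.
schema.registry.url=https://buf.example.com/integrations/confluent/my-instance
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=token:<BUF_TOKEN>
```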

Explore our Confluent integration
Governance Workflows

Eliminate runtime registration errors

Validate that schema changes are backward compatible at build time — in IDEs and GitHub pull requests — and when they're published to the BSR.

If breaking changes are unavoidable, the BSR's governance workflow ensures that the right team members sign off.
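The same check can be run locally or in CI with the Buf CLI by comparing uncommitted schema changes against the latest published module (the module name below is a placeholder):

```shell
# Fail the build if local .proto changes would break
# consumers of the published module.
buf breaking --against buf.build/acme/orders
```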

Learn about Buf's compatibility guarantees
Buf Kafka Gateway

Validate and transform data in motion

Insulate your Kafka cluster from rogue producers with a lightweight gateway that speaks the Kafka protocol.

The gateway can validate and transform in-flight data, ensuring that it matches the expected schema and complies with higher-level data contracts.

Contact us to learn more

The Buf data pipeline add-on is available for preview

Let's connect

Connect on Slack