Data mesh is an approach to designing data architectures that embraces organisational and data-centric constructs, such as data management and governance. The idea is that data should be easily accessible and interconnected across the entire business.
In this talk, we will cover:
- The basics of building a streaming data mesh with Kafka & how Confluent enables this.
- The four principles of the data mesh: domain-driven decentralisation, data as a product, self-service data platform, and federated governance.
- The differences between working with event streams versus centralised approaches.
- How to onboard data from existing systems into a mesh, and how to model communication within the mesh.
- How to deal with changes to your domain’s “public” data, with examples of global standards for governance.
- The importance of taking a product-centric view of data sources and the data sets they share.
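To make the “changes to your domain’s public data” point concrete: a common convention is to allow only backward-compatible evolution of shared event schemas, e.g. adding optional fields with defaults rather than removing or renaming required ones, so existing consumers keep working. A minimal sketch of that idea in plain Python, with hypothetical `OrderPlacedV1`/`OrderPlacedV2` event types standing in for schemas managed by a registry:

```python
from dataclasses import dataclass
from typing import Optional
import json

# Hypothetical v1 of a domain's "public" order event.
@dataclass
class OrderPlacedV1:
    order_id: str
    amount: float

# v2 adds an optional field with a default: a backward-compatible change,
# so consumers written against v1 can still read v2 records.
@dataclass
class OrderPlacedV2:
    order_id: str
    amount: float
    currency: Optional[str] = None  # new optional field

def serialize(event) -> bytes:
    """Serialize an event to JSON bytes, as it might appear on a topic."""
    return json.dumps(event.__dict__).encode("utf-8")

def read_as_v1(payload: bytes) -> OrderPlacedV1:
    """A v1 consumer simply ignores fields it does not know about."""
    data = json.loads(payload)
    known = {"order_id", "amount"}
    return OrderPlacedV1(**{k: v for k, v in data.items() if k in known})

# A producer on the new schema; an old consumer still gets a valid v1 view.
new_event = OrderPlacedV2(order_id="o-42", amount=9.99, currency="EUR")
old_view = read_as_v1(serialize(new_event))
```

In a real Kafka deployment this compatibility check would typically be enforced centrally (for instance by a schema registry) rather than hand-rolled per consumer.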
- Posted: Feb 15, 2024
- Published: Feb 15, 2024
- Type: Replay