Much digital ink has been spilled on the hottest topic in data integration today: the data mesh. But is a data mesh really feasible for a modern enterprise? And is it really new, or just a reworking of older paradigms and techniques?

DBTA hosted a webinar with Paul Lacey, Senior Director of Product Marketing, Matillion, who discussed what a decentralized approach to data management might look like in practice.

The data mesh exists to overcome common challenges with a centralized data infrastructure, he explained. It is a new approach based on a modern and distributed architecture for the management of analytical data.

Data mesh can reduce data downtime and accelerate the adoption of ML, AI, and other types of modern analytics.

Data mesh can solve several common problems, including the difficulty of creating a single source of truth, fragile ETL pipelines, and unresponsive centralized analytics teams.

A data mesh is governed by four primary tenets: domain ownership, data as a product, infrastructure abstraction, and distributed governance.
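To make the first two tenets concrete, here is a minimal sketch of what "data as a product" with clear domain ownership might look like as metadata attached to a dataset. All field and class names here are illustrative assumptions, not part of Matillion or any specific platform.

```python
from dataclasses import dataclass

# Illustrative sketch only: a data product carries a named domain owner
# (domain ownership tenet), a versioned schema acting as its contract,
# and documentation so consumers can discover and trust it.
@dataclass(frozen=True)
class DataProduct:
    name: str
    domain_owner: str      # the team accountable for this data
    schema_version: str    # published, version-controlled interface
    docs_url: str          # discoverability for downstream consumers

# Hypothetical example product owned by a sales domain team.
orders = DataProduct(
    name="orders",
    domain_owner="sales-domain-team",
    schema_version="2.1.0",
    docs_url="https://example.com/catalog/orders",
)
print(orders.domain_owner)  # sales-domain-team
```

The point of the sketch is that ownership and versioning live with the data itself, rather than inside a centralized team's pipeline code.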

There are two types of data mesh, Lacey said: virtualized or physical. A physical data mesh requires:

  • Accessible processing
  • Multimodal data pipelines: batch or change data capture (CDC)
  • Output connectors
  • Robust data cataloging
  • Strict enforcement of policies
  • Version control of data product schemas
  • Strong data culture
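One of the requirements above, version control of data product schemas, can be sketched as a simple contract check: consumers validate records against a specific schema version before use. The function and field names below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical sketch: each schema version of a data product defines a
# set of required fields; producers and consumers agree on a version,
# so adding fields in v2 does not silently break v1 consumers.
REQUIRED_FIELDS = {
    1: {"order_id", "amount"},
    2: {"order_id", "amount", "currency"},  # v2 tightens the contract
}

def validate(record: dict, schema_version: int) -> bool:
    """Return True if the record satisfies the given schema version."""
    required = REQUIRED_FIELDS[schema_version]
    return required.issubset(record.keys())

# A record produced under v1 passes the v1 contract but fails v2.
rec = {"order_id": 7, "amount": 19.99}
print(validate(rec, 1))  # True
print(validate(rec, 2))  # False
```

Keeping these version definitions in source control gives domain teams an auditable history of their data products' interfaces, which is what makes strict policy enforcement across the mesh practical.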

Matillion can help create a data mesh, he noted. Accessible interfaces and visual workflows enable more people in the business to work with live data.

“Accessible interfaces are critical to success,” Lacey said.

An archived on-demand replay of this webinar is available here.