Unified Platform for Comprehensive Data Management
Traditional data lake initiatives are stuck in neutral, rarely moving beyond a few use cases. Enterprises are realizing that bringing data together without context delivers little value. Data lakes are not optimized for semantic enforcement or consistency, and creating policies for data that lacks context and consistency is proving hard. Unified data governance and semantic consistency are the major challenges with data lakes. Adding to the difficulty, data has become more diverse and siloed in recent years with the explosion of structured and unstructured data silos and the spread of cloud technologies.
A single platform that stitches data together and enables data management and integration can address these challenges. Data Fabric provides such a unified data platform, one that stretches across locations (on-premises, cloud, or edge), data types and access methods. It is the fabric that wraps an ensemble of tools and technologies around disparate data silos to ensure that data is treated consistently.
- Scales with growing data volumes and access methods
- Provides common ways for users to access data processing tools
- Enables data integration across data nodes
- Fully supports automation needed for DataOps
- Moves computation to the data
- Manages data across all environments for batch and real-time needs
- Supports complete API life cycle
Distributed Location Support
Workflows are orchestrated from data ingestion and preparation through deployment and consumption by executing services for data capture, storage, integration, governance, provisioning and application. The fabric provides a single control platform to execute a sequence of services built from diverse pre-packaged technologies and distributed across multiple execution environments. It also automates the coordination of distributed, multi-service workflows.
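The single-control-point idea can be sketched as a pipeline object that runs each service step in order and passes a shared context along. This is a minimal illustration, not the API of any particular data fabric product; the `Step` and `Pipeline` classes and the service names are hypothetical.

```python
# Minimal sketch of a fabric-style workflow: a sequence of services
# (capture -> integrate -> provision) executed by one coordinator,
# regardless of where each step physically runs.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Step:
    name: str
    run: Callable[[dict], dict]  # takes and returns the workflow context

@dataclass
class Pipeline:
    steps: list[Step] = field(default_factory=list)

    def execute(self, context: dict) -> dict:
        # One control loop coordinates every distributed service step.
        for step in self.steps:
            context = step.run(context)
        return context

pipeline = Pipeline([
    Step("capture",   lambda ctx: {**ctx, "raw": ["rec2", "rec1"]}),
    Step("integrate", lambda ctx: {**ctx, "merged": sorted(ctx["raw"])}),
    Step("provision", lambda ctx: {**ctx, "served": len(ctx["merged"])}),
])
result = pipeline.execute({})
print(result["served"])  # number of records provisioned
```

In a real fabric, each `Step.run` would dispatch to a remote service (on-premises, cloud, or edge); the point is that orchestration logic lives in one place.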
Uniform Governance Model
The fabric ensures data consistency locally and provides a simple consistency model to apply across data locations. It integrates multiple data types across discrete data stores. Data is secured at the lowest level of granularity within the fabric, rather than merely as a function of access methods.
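Securing data at the lowest granularity means a policy attaches to the data element itself, so the same rule applies whether a record is read via SQL, an API, or a file export. A rough sketch, with an illustrative column-level masking policy (the policy table and column names are assumptions, not any product's policy language):

```python
# Sketch of policy enforcement at the data (column) level rather than
# per access method: every consumption path applies the same mask.
POLICIES = {"ssn": "mask", "email": "mask"}  # column-level rules

def apply_policy(record: dict) -> dict:
    """Return a copy of the record with protected columns masked."""
    return {
        col: ("***" if POLICIES.get(col) == "mask" else val)
        for col, val in record.items()
    }

row = {"name": "Ada", "ssn": "123-45-6789", "email": "ada@example.com"}
print(apply_policy(row))  # {'name': 'Ada', 'ssn': '***', 'email': '***'}
```

Because enforcement keys off the column, adding a new access method does not require re-implementing the security rules.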
The fabric collects extensive metadata to discover, access, search and integrate data silos. It distributes metadata across all data nodes to avoid bottlenecks. That metadata also helps minimize cloud lock-in.
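The distributed-metadata idea can be sketched as each data node keeping its own local catalog, with discovery fanning out across nodes instead of querying one central store. The `Node` class, node names, and dataset tags below are illustrative assumptions:

```python
# Sketch of a distributed metadata catalog: metadata lives with each
# data node, and search spans all nodes to avoid a central bottleneck.
class Node:
    def __init__(self, name: str):
        self.name = name
        self.catalog: dict[str, dict] = {}  # dataset -> metadata

    def register(self, dataset: str, **meta):
        self.catalog[dataset] = meta

def search(nodes: list["Node"], tag: str) -> list[str]:
    """Find datasets carrying a tag, across every node's local catalog."""
    return [
        f"{node.name}/{ds}"
        for node in nodes
        for ds, meta in node.catalog.items()
        if tag in meta.get("tags", ())
    ]

onprem, cloud = Node("onprem"), Node("cloud")
onprem.register("sales_2023", tags=("sales", "pii"))
cloud.register("clickstream", tags=("web",))
print(search([onprem, cloud], "sales"))  # ['onprem/sales_2023']
```

Because each dataset is described in a location-neutral catalog entry, the same discovery query works whether the data sits on-premises or in a cloud, which is one way metadata reduces lock-in.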