Migrating to the HPE Ezmeral Data Fabric

This guide provides instructions for migrating business-critical data and applications from an Apache Hadoop cluster to an HPE Ezmeral Data Fabric cluster.

The data-fabric distribution is 100% API-compatible with Apache Hadoop, so migration is a relatively straightforward process. The additional features available in the HPE Ezmeral Data Fabric provide new ways to interact with your data. In particular, the HPE Ezmeral Data Fabric provides a fully read/write storage layer that can be mounted as a filesystem via NFS, giving existing processes, legacy workflows, and desktop applications full access to the entire cluster.
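For example, once the cluster filesystem is mounted over NFS, ordinary file I/O is all a legacy application needs. The following Python sketch is illustrative only: the mount point /mapr/my.cluster.com and the file paths are assumptions, not values defined by this guide. Substitute the mount point and directories used in your environment.

    # Minimal sketch: accessing data-fabric storage through an NFS mount.
    # The mount point /mapr/my.cluster.com is a hypothetical example;
    # replace it with the path where your cluster is actually mounted.
    from pathlib import Path

    MOUNT = Path("/mapr/my.cluster.com")

    # Write a file into the cluster exactly as you would write to a local disk.
    report = MOUNT / "user" / "analytics" / "daily_report.csv"
    report.parent.mkdir(parents=True, exist_ok=True)
    report.write_text("date,visits\n2024-01-01,1024\n")

    # Read it back with plain file I/O; no Hadoop client libraries are involved.
    print(report.read_text())

Because the storage layer behaves like a regular filesystem over NFS, existing scripts and desktop tools can read and write cluster data without being rewritten against Hadoop APIs.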

The migration process consists of planning, deploying the data-fabric cluster, and then migrating components, applications, data, and nodes.

See the Release Notes at https://docs.datafabric.hpe.com/home/ReleaseNotes/c_relnotes_intro.html for up-to-date information about migration issues.