Hortonworks Data Lifecycle Manager

Protect Your Enterprise Data On-Premises & In-Cloud Through Hadoop Replication

Modernize your data management

Download the white paper

Overview

Data Lifecycle Manager (DLM) is a DataPlane application that protects not only your data but also the security policies associated with it through replication. The application empowers system administrators to replicate HDFS and Hive data from an on-premises cluster to cloud storage, including replication of a Hive database from a cluster with underlying HDFS to another cluster backed by cloud storage. DLM protects data-at-rest (TDE) and data-in-motion (TLS) and supports multiple key management services (KMS) and encryption keys.
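As context for the encryption support mentioned above: the data DLM replicates is typically protected at rest with HDFS transparent encryption. A minimal sketch of setting up such an encryption zone with the standard Hadoop CLI, assuming a KMS is already configured (the key and path names are illustrative):

```python
import subprocess

def run(cmd):
    """Echo and run a Hadoop CLI command, failing loudly on error."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Create an encryption key in the cluster's KMS (name is illustrative).
run(["hadoop", "key", "create", "warehouse-key"])

# Create the directory that will become an encryption zone.
run(["hdfs", "dfs", "-mkdir", "-p", "/data/secure"])

# Turn the directory into a TDE encryption zone backed by that key;
# files written under it are then encrypted at rest.
run(["hdfs", "crypto", "-createZone", "-keyName", "warehouse-key",
     "-path", "/data/secure"])
```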

Video: Data Lifecycle Manager

Benefits

Protect critical data assets

DLM provides replication of HDFS and Hive data from an on-premises cluster to cloud storage, along with a web UI that administrators can use to create and manage replication and disaster recovery policies and jobs. It avoids unnecessarily copying renamed files and directories and protects data against accidental or intentional modification to meet governance and disaster recovery (DR) requirements. DLM enables system administrators to (a sketch of the underlying snapshot mechanics follows the list):

  • Incrementally replicate Hive data and metadata
  • Replicate data between HDP clusters using HDFS snapshots
  • Provide support for data-at-rest (TDE) and data-in-motion (TLS) encryption
  • Prevent unauthorized access to data and enable segregation of duties
  • Configure the destination cluster to serve as the new source if the source cluster becomes unavailable
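The snapshot-based replication in the list above can be pictured with the stock HDFS snapshot and DistCp tooling, which is the kind of machinery DLM drives for you. A minimal sketch, assuming an initial full copy and a baseline snapshot s1 already exist on both clusters (paths, snapshot names, and NameNode URIs are illustrative):

```python
import subprocess

SRC = "hdfs://src-nn:8020/data/warehouse"
DST = "hdfs://dst-nn:8020/data/warehouse"

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# One-time setup: allow snapshots on the source directory.
run(["hdfs", "dfsadmin", "-allowSnapshot", "/data/warehouse"])

# Mark the current state with a new snapshot.
run(["hdfs", "dfs", "-createSnapshot", "/data/warehouse", "s2"])

# Copy only the delta between the baseline (s1) and the new snapshot
# (s2); renames are replayed rather than re-copied, which is what
# avoids unnecessary copies of renamed files and directories.
run(["hadoop", "distcp", "-update", "-diff", "s1", "s2", SRC, DST])
```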
Webinar: Global Data Management in a Hybrid Multi-Cloud World
Replicate security policies associated with data

Replicate not only data but also the metadata and security policies associated with it. DLM enhances the productivity of system administrators by (a rough REST sketch follows the list):

  • Exporting Apache Ranger policies for the HDFS directory from source Ranger service and replicating them to destination Ranger service
  • Replicating associated file metadata, table structures or schemas
  • Providing active/standby behavior for a DR site using Ranger policies
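Policy replication of this kind builds on Ranger's export/import REST API. A rough sketch with Python's requests; note that the endpoint paths, parameters, and service names below are assumptions and should be verified against your Ranger version:

```python
import requests

SRC_RANGER = "https://ranger-src.example.com:6182"  # illustrative URLs
DST_RANGER = "https://ranger-dst.example.com:6182"
AUTH = ("admin", "admin")  # illustrative; use real auth in practice

# Export HDFS policies from the source Ranger service
# (assumed endpoint; verify against your Ranger version).
resp = requests.get(
    f"{SRC_RANGER}/service/plugins/policies/exportJson",
    params={"serviceName": "cl1_hadoop"},
    auth=AUTH,
)
resp.raise_for_status()

# Import the exported policies into the destination Ranger service
# (again, an assumed endpoint).
requests.post(
    f"{DST_RANGER}/service/plugins/policies/importPoliciesFromFileAsync",
    params={"serviceName": "cl2_hadoop"},
    files={"file": ("policies.json", resp.content, "application/json")},
    auth=AUTH,
).raise_for_status()
```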
Blog: Painless Disaster Recovery using Hortonworks Data Lifecycle Manager
Implement hybrid data replication

DLM supports replication of HDFS and Hive data between underlying HDFS and AWS S3 cloud storage. DLM provides administrators with (a DistCp-to-S3 sketch follows the list):

  • Bidirectional data replication between the cloud and on-premises environments
  • Flexibility to designate either cluster in a pair as the source or the destination in a replication policy
  • Native cloud storage replication into S3 buckets
  • Seamless integration between the AWS cloud and DLM for data and security policy replication
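At the file level, HDFS-to-S3 replication of this kind comes down to a DistCp job against an s3a:// target. A minimal sketch (the bucket name is illustrative, and in practice credentials come from IAM roles or a credential provider rather than the command line):

```python
import subprocess

SRC = "hdfs://src-nn:8020/data/warehouse"
DST = "s3a://example-dr-bucket/data/warehouse"  # illustrative bucket

# -update copies only files that changed since the last run;
# -m caps the number of parallel map tasks doing the copy.
subprocess.run(
    ["hadoop", "distcp", "-update", "-m", "20", SRC, DST],
    check=True,
)
```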
Blog: Data Replication in Hadoop
Get visibility into cluster status and automate replication tasks for enhanced productivity

Quickly identify issues or verify the health of the clusters, policies, and jobs in DLM. View the total number of clusters enabled for DLM, the number for which all or some services are running, and the number of clusters whose remaining disk capacity is below 10%. DLM gives system administrators the flexibility to (an illustrative policy sketch follows the list):

  • Create policies based on business rules
  • Replicate data based on data sets, day and time, frequency of job execution, and bandwidth restrictions
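Concretely, such a policy pins down a dataset, a source/destination pair, a schedule, and a bandwidth cap. The record below is purely hypothetical (the field names are invented for illustration, not DLM's actual policy format):

```python
# Hypothetical shape of a replication policy; every field name here
# is invented for illustration and is not DLM's real schema.
policy = {
    "name": "warehouse-dr",
    "dataset": {"type": "HDFS", "path": "/data/warehouse"},
    "source": "onprem-cluster",
    "destination": "s3a://example-dr-bucket/data/warehouse",
    "schedule": {"frequency": "hourly", "start": "02:00"},
    "bandwidth_mb_per_map": 100,  # corresponds to DistCp's -bandwidth throttle
}
```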
Blog: A Step-by-Step Guide for HDFS Replication