Apache Hadoop Data Warehouse Architecture for EDW Optimization

Reduce Costs by Moving Data and Processing to Hadoop®


What is an EDW?

An Enterprise Data Warehouse (EDW) is an organization's central data repository, built to support business decisions. The EDW contains data related to the areas the company wants to analyze; for a manufacturer, it might be customer, product or bill-of-material data. An EDW is built by extracting data from a number of operational systems. As the data is fed into the EDW, it is converted, reformatted and summarized to present a single corporate view. Data is added to the warehouse over time in the form of snapshots, and an enterprise data warehouse normally contains data spanning 5 to 10 years. A Hadoop data warehouse architecture enables deeper analytics and advanced reporting from these diverse sets of data.

EDW Optimization

Problems with a typical EDW

The Enterprise Data Warehouse has become a standard component of corporate data architectures. However, the growing complexity and volume of data have posed interesting challenges to the efficiency of existing EDW solutions.

Realizing the transformative potential of Big Data depends on a corporation's ability to manage complexity while leveraging data sources of all types, such as social, web, IoT and more. Integrating new data sources into the existing EDW system empowers corporations with more and deeper analytics and insights. More importantly, EDW optimization using Hadoop provides a highly cost-efficient environment with optimal performance, scalability and flexibility.

Solution Elements

Hortonworks Data Platform

* Powerful open Hadoop data warehouse architecture with capabilities for data governance and integration, data management, data access, security and operations, designed for deep integration with your existing data center technology. Learn More

Syncsort

* EDW offload to Hadoop: high-performance ETL software to access and easily onboard traditional enterprise data to HDP. Learn More
 
 

PROFESSIONAL SERVICES

* Expert guidance and support to rapidly prove the value of your new architecture and maximize the value of your data, tested and validated by Hortonworks. Learn More

EDW Optimization with Apache Hadoop®

Flexible

* Data can be loaded into HDP without having a data model in place

* A data model can be applied based on the questions being asked of the data (schema-on-read); see the sketch after this list

* HDP is designed to answer questions as they occur to the user
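
To make schema-on-read concrete, here is a minimal sketch using PySpark on HDP. The landing path and field names are illustrative assumptions, not part of any specific product; the point is that the raw files are loaded with no upfront data model, and structure is applied only when a question is asked.

```python
from pyspark.sql import SparkSession

# Schema-on-read sketch: raw JSON files are landed in HDFS as-is, and the
# structure is applied only when the data is read and queried.
spark = SparkSession.builder.appName("schema-on-read-demo").getOrCreate()

# Hypothetical landing path; no table definition or data model was created
# before these files were loaded onto the cluster.
raw_events = spark.read.json("hdfs:///landing/clickstream/2017/")

# The schema Spark inferred from the raw JSON is visible only at read time.
raw_events.printSchema()

# A question asked after the fact: hits per referrer, with no upfront
# modeling of the clickstream feed.
raw_events.groupBy("referrer").count().orderBy("count", ascending=False).show(10)
```

The same files could later be read with a different schema to answer a different question, which is what makes the approach flexible.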

Efficient

* 100% of the data is available at a granular level for analysis

* HDP can store and analyze both structured and unstructured data

* Data can be analyzed in different ways to support diverse use cases

Cost Effective

* HDP (Hortonworks Data Platform) is 100% open - there is no licensing fee for the software

* HDP runs on commodity hardware

* New data can be landed in HDP and used in days or even hours

Use Cases for EDW Optimization

USE CASE 1

Fast BI on Hadoop

Proprietary EDW systems were adopted for fast BI and deep analytics, but EDW pricing is unsustainably high, and those systems have not kept pace with modern big data challenges such as unstructured data and large-scale analytics.

Hortonworks makes fast BI on Hadoop a reality by combining a fast in-memory SQL engine for creating data marts with an OLAP cubing engine that lets you query huge datasets in seconds. This gives you the choice of querying pre-aggregated data for maximum performance, or full-fidelity data when the finest grain of detail is needed, with access from any major BI tool that supports ODBC, JDBC or MDX.
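
As a rough illustration of the two query paths described above, the sketch below uses PyHive to talk to the cluster's SQL engine over the same Thrift/JDBC-style interface that BI tools use. The host, credentials and the cube_daily_sales / sales_detail tables are hypothetical placeholders, not the names used by any specific Hortonworks component.

```python
from pyhive import hive

# Connect to the cluster's SQL engine (host, port, user and table names
# below are placeholders for illustration only).
conn = hive.connect(host="hiveserver.example.com", port=10000, username="analyst")
cursor = conn.cursor()

# Fast path: a pre-aggregated daily cube, answering dashboard-style
# questions in seconds.
cursor.execute("""
    SELECT region, SUM(revenue) AS revenue
    FROM cube_daily_sales
    WHERE sale_date >= '2017-01-01'
    GROUP BY region
""")
print(cursor.fetchall())

# Full-fidelity path: the same question against the detail table, used when
# the finest grain (individual transactions) is needed.
cursor.execute("""
    SELECT region, SUM(revenue) AS revenue
    FROM sales_detail
    WHERE sale_date >= '2017-01-01'
    GROUP BY region
""")
print(cursor.fetchall())
```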

Learn More

USE CASE 2

ONBOARD ETL PROCESSES TO HADOOP

A typical EDW spends between 45 and 65 percent of its CPU cycles on ETL processing. These lower-value ETL jobs compete for resources with more business-critical workloads and can cause SLA misses. Hadoop can offload these ETL jobs from the EDW with minimal porting effort and at substantially lower cost, saving money and freeing up capacity on your EDW for higher-value analytical workloads. Hortonworks makes it easy by providing high-performance ETL tools, a powerful SQL engine and integration with all major BI vendors.
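
A simplified sketch of what such an offloaded job might look like as a PySpark application follows; the paths, column names and target table are illustrative assumptions rather than output from Syncsort or any particular tool.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# ETL offload sketch: run the extract-transform-load work on the cluster
# instead of burning EDW CPU cycles on it.
spark = (SparkSession.builder
         .appName("etl-offload-demo")
         .enableHiveSupport()
         .getOrCreate())

# Extract: a raw export from an operational system, landed in HDFS unchanged
# (hypothetical path and layout).
orders = spark.read.csv("hdfs:///landing/orders/", header=True, inferSchema=True)

# Transform: the cleansing and summarization that previously ran inside the EDW.
daily = (orders
         .withColumn("order_date", F.to_date("order_timestamp"))
         .groupBy("order_date", "customer_id")
         .agg(F.sum("amount").alias("daily_spend")))

# Load: publish a conformed table (placeholder name) that downstream EDW or
# BI jobs can consume.
daily.write.mode("overwrite").format("orc").saveAsTable("staging.daily_spend")
```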

Learn More

USE CASE 3

ARCHIVE DATA IN HADOOP

Growing data volumes and cost pressures force many companies to archive older data, storing it where it cannot be analyzed or can be retrieved only at high cost.

A Hadoop data warehouse architecture offers cost per terabyte on par with tape backup solutions. Because of the appealing cost, you can store years of data rather than months. All of your enterprise data remains available for retrieval, query and deep analytics with the same tools you use on existing EDW systems.
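
For illustration, the sketch below shows one way an archival job could pull aged records out of an EDW over JDBC and store them as compressed ORC on the cluster, where they remain queryable. The connection string, credentials, table names and retention cutoff are all hypothetical assumptions.

```python
from pyspark.sql import SparkSession

# Archive sketch: copy records that have aged out of the warehouse's
# retention window onto cheap HDFS storage, partitioned by year.
spark = (SparkSession.builder
         .appName("edw-archive-demo")
         .enableHiveSupport()
         .getOrCreate())

# Pull the aged rows from the warehouse over JDBC (placeholder connection
# details; the appropriate JDBC driver must be on the classpath).
old_rows = (spark.read.format("jdbc")
            .option("url", "jdbc:oracle:thin:@edw-host:1521/EDW")
            .option("dbtable", "(SELECT * FROM sales WHERE sale_year < 2012) archived")
            .option("user", "archive_job")
            .option("password", "REDACTED")
            .load())

# Write as compressed ORC, partitioned by the assumed sale_year column; the
# archived data stays available to the same SQL and BI tools.
(old_rows.write
 .mode("append")
 .partitionBy("sale_year")
 .format("orc")
 .saveAsTable("archive.sales_history"))
```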

Learn More