HDF makes streaming analytics faster and easier, accelerating real-time data collection, curation, analysis and delivery, on-premises and in the cloud, through an integrated solution of Apache NiFi, Kafka and Storm.
Integrated collection from dynamic, disparate and distributed sources of differing formats, schemas, protocols, speeds and sizes such as machines, geo location devices, click streams, files, social feeds, log files and videos.
How real-time data flow management makes data movement easier Watch Video Learn More
Real-Time Response to Streaming Data
Real-time evaluation of perishable insights at the edge to determine what is pertinent, and execution of the consequent decisions to send, drop or locally store data as needed, through a visual user interface with real-time operational visibility and control. Unprecedented operational effectiveness is achieved by eliminating the dependencies and delays inherent in a coding and custom-scripting approach.
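The send/drop/store decision described above can be sketched in a few lines. This is an illustrative example only, not NiFi's API: the relevance scoring, thresholds and field names are all hypothetical, standing in for whatever pertinence rule a real edge flow would configure.

```python
# Hypothetical edge routing rule: decide per event whether to forward it,
# keep it locally, or discard it. Thresholds are illustrative assumptions.
def route_event(event, send_threshold=0.8, store_threshold=0.3):
    """Return 'send', 'store' or 'drop' based on a relevance score."""
    score = event.get("relevance", 0.0)
    if score >= send_threshold:
        return "send"     # pertinent: forward to the data center now
    if score >= store_threshold:
        return "store"    # possibly useful later: keep at the edge
    return "drop"         # not worth the bandwidth or storage

events = [{"relevance": 0.9}, {"relevance": 0.5}, {"relevance": 0.1}]
decisions = [route_event(e) for e in events]
# decisions == ["send", "store", "drop"]
```

In a real HDF deployment this decision is expressed visually in the NiFi/MiNiFi flow rather than hand-coded, which is the point of the paragraph above.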
How streaming data managed through Apache NiFi's real-time visual user interface increases operational effectiveness. Watch Video
Secure from Source to Destination, with a Real-Time Chain of Custody
Secure end-to-end routing from source to destination, with discrete, fine-grained user authorization and a real-time visual chain of custody with metadata (data provenance). The ability to support security and encryption on small-scale JVM data sources at the edge of the Internet of Things, as well as on large-scale enterprise clusters supporting the Internet of Anything, ensures high trust in analytical results and their underlying data.
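The "chain of custody" idea can be illustrated with a minimal hash-chained event log. This is a conceptual sketch, not NiFi's provenance repository or its API: each custody event embeds the hash of its predecessor, so altering any step invalidates everything after it.

```python
import hashlib
import json

# Illustrative hash-chained chain of custody (not NiFi's provenance API).
GENESIS = "0" * 64

def add_event(chain, actor, action):
    """Append a custody event linked to the previous event's hash."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    event = {"actor": actor, "action": action, "prev": prev_hash}
    body = json.dumps(event, sort_keys=True).encode()
    event["hash"] = hashlib.sha256(body).hexdigest()
    chain.append(event)

def verify(chain):
    """Recompute every hash; any tampering breaks the chain."""
    prev = GENESIS
    for event in chain:
        body = {k: v for k, v in event.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if event["prev"] != prev or event["hash"] != digest:
            return False
        prev = event["hash"]
    return True

chain = []
add_event(chain, "sensor-01", "created")
add_event(chain, "edge-gateway", "encrypted")
add_event(chain, "cluster", "stored")
# verify(chain) is True; editing any recorded action makes it False
```

The actor and action names are hypothetical; the point is only that provenance records are tamper-evident when each step is cryptographically linked to the last.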
See how HDF enables high trust of your data Watch Video
Real-time business depends upon capturing perishable insights from data in motion.
HDF supports stream processing to aggregate and analyze event data in order to dynamically recognize data patterns and detect outliers. Real-time, high-volume event processing for immediate action and response to streaming data is supported through an integrated enterprise offering of Apache NiFi, MiNiFi, Kafka and Storm.
Apache NiFi and MiNiFi provide dynamic, configurable data pipelines, through which all sources, systems and destinations communicate. Kafka adapts to differing rates of data creation and delivery while real-time streaming analytics with Storm creates immediate insights at a massive scale.
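The outlier detection mentioned above can be sketched as a simple windowed computation of the kind a Storm topology might run over the stream. This is an illustrative example, not HDF's implementation: the sliding-window size and the 3-sigma rule are assumptions chosen for clarity.

```python
from collections import deque
from statistics import mean, stdev

# Illustrative windowed outlier detector (not HDF's implementation):
# flag a value that deviates from the recent window by > 3 standard
# deviations. Window size and sigma multiplier are assumptions.
class StreamOutlierDetector:
    def __init__(self, window_size=20, sigmas=3.0):
        self.window = deque(maxlen=window_size)
        self.sigmas = sigmas

    def observe(self, value):
        """Return True if value is an outlier relative to the window."""
        is_outlier = False
        if len(self.window) >= 2:
            mu, sd = mean(self.window), stdev(self.window)
            is_outlier = sd > 0 and abs(value - mu) > self.sigmas * sd
        self.window.append(value)
        return is_outlier

detector = StreamOutlierDetector()
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 10.3, 9.8, 10.1, 100.0]
flags = [detector.observe(r) for r in readings]
# Only the final spike (100.0) is flagged
```

In practice such logic would run inside a Storm bolt fed by Kafka, with NiFi handling delivery to and from the stream.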
Processing Streaming Data Learn More Real-Time Event Processing in Hadoop with NiFi, Kafka and Storm Learn More
Create New Insights By Securely Sharing Select Pieces of Data
HDF provides secure access to and control of data to enable informed business decisions. By enabling the sharing of specific pieces of information, rather than role-based data access that grants blanket access to all data at once, HDF allows enterprises to securely and dynamically share selected pieces of data and derive new business insights.
See how HDF reduces risk by enabling data democratization Watch Video
Optimize Log Analytics with Content-Based Routing at the Edge
Real-time edge analytics eases integration with log analytics systems such as Splunk, SumoLogic, Graylog, LogStash, etc. for easy, secure and comprehensive data ingest of log files. By cost-effectively increasing volumes of data collected with content based routing, enterprises can accelerate trouble-shooting and improve anomaly detection with a more comprehensive view across all available machine data.
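Content-based routing, as described above, means forwarding only the log lines that match configured patterns to the (expensive) analytics system while handling the rest cheaply. The sketch below is illustrative, not a Splunk or NiFi configuration; the patterns and destination names are hypothetical.

```python
import re

# Hypothetical content-based routing table: pattern -> destination.
# High-value events go to the log-analytics system; warnings are
# archived cheaply; everything else is dropped at the edge.
ROUTES = [
    (re.compile(r"\b(ERROR|FATAL)\b"), "analytics"),
    (re.compile(r"\bWARN\b"), "archive"),
]

def route_log_line(line):
    """Return the destination for a log line, or 'drop' if no match."""
    for pattern, destination in ROUTES:
        if pattern.search(line):
            return destination
    return "drop"

lines = [
    "2017-05-01 12:00:01 ERROR disk failure on /dev/sda",
    "2017-05-01 12:00:02 WARN high memory usage",
    "2017-05-01 12:00:03 INFO heartbeat ok",
]
routed = [route_log_line(line) for line in lines]
# routed == ["analytics", "archive", "drop"]
```

Because the filtering happens at the edge, the analytics system's indexing costs stay flat even as the total volume of machine data collected grows.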
A Scalable, Extensible Platform for the IoAT, Including IoT
Equally well designed to run on the small-scale data sources that make up the Internet of Things and on large-scale clusters in today's enterprise data centers, HDF is built to support the Internet of AnyThing. HDF securely moves data from wherever it is to wherever it needs to go, regardless of size, shape, or speed, dynamically adapting to the needs of the source, the connection, and the destination.
Use HDF's visual user interface to drag and drop data flows and encrypt data, send it to Kafka, and configure buffering and back-pressure management so that data can be dynamically prioritized and sent securely from source to destination, with immediate feedback on the fluctuating conditions common at the edge.
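The dynamic prioritization and back-pressure described above can be sketched as a bounded buffer that always delivers the highest-priority event first and, when full, sheds the lowest-priority one. This is a conceptual illustration, not NiFi's queue implementation; the capacity and priority values are hypothetical.

```python
import heapq

# Illustrative prioritized buffer with back-pressure (not NiFi's queue):
# a bounded queue that delivers highest priority first and, when full,
# drops the lowest-priority buffered item.
class PrioritizedBuffer:
    def __init__(self, capacity=3):
        self.capacity = capacity
        self._heap = []   # min-heap on (-priority, sequence, payload)
        self._seq = 0

    def offer(self, priority, payload):
        """Enqueue; if over capacity, shed the lowest-priority item."""
        heapq.heappush(self._heap, (-priority, self._seq, payload))
        self._seq += 1
        if len(self._heap) > self.capacity:
            lowest = max(self._heap)   # largest -priority = lowest priority
            self._heap.remove(lowest)
            heapq.heapify(self._heap)

    def poll(self):
        """Dequeue the highest-priority item, or None if empty."""
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]

buf = PrioritizedBuffer(capacity=3)
for prio, msg in [(1, "telemetry"), (5, "alarm"),
                  (2, "status"), (9, "critical")]:
    buf.offer(prio, msg)
# "telemetry" (priority 1) was shed under pressure;
# "critical" is delivered first, then "alarm", then "status".
```

In HDF this policy is configured per connection in the NiFi UI rather than coded, but the behavior at the edge, where links fluctuate, is the same: important data gets through first, and congestion sheds the least valuable data rather than blocking the flow.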
See how a real-time data flow is encrypted and routed to Kafka in minutes Watch Video
What’s New in HDF
New UI: More intuitive interface for easy and time-saving dataflow creation, management, tuning and control of real-time data flows
Increased productivity: Multiple actors can manage different parts of a single dataflow shared across an entire enterprise, connecting data silos and teams with fine grained component level control
Enterprise Readiness: Ease of deployment and management of NiFi, Kafka and Storm through integrated Ambari interface and integration with Ranger for authorization management
MiNiFi: New MiNiFi architecture option in the JVM for expanded scalability that addresses the challenges of secure, first-mile and bi-directional communication with millions of edge devices (IoT) with a footprint of less than 40 MB
THE VALUE OF STREAM PROCESSING AND STREAMING INTEGRATION
Apache Hadoop is the default choice for processing and analyzing large volumes of data at rest, but what about data in motion? Learn how to generate maximum value from data streams and execute business decisions in real time. Download the 451 Research Report to find out: Beyond big data – what users want to do with data…
Easily, securely move data from the Internet of Anything with Hortonworks DataFlow, powered by Apache™ NiFi. Download the white paper now to find out how to: Accelerate data collection and movement for increased big data ROI Secure collection and transport from the Internet of Anything Provide real-time operational visibility, feedback, and control Enable real-time decision…
Join us for a live 30 minute webinar to see how you can make data collection from the Internet of Anything fast, easy and secure. Designed to accelerate big data ROI from streaming analytics systems such as Spark and Storm, Hortonworks DataFlow delivers data from anywhere it originates to anywhere it needs to go. We…
Google Cloud Dataflow, Hortonworks DataFlow – What's in a Name? Are They the Same?
People have been asking us – is Google Cloud Dataflow the same thing as Hortonworks DataFlow (HDF)? So we thought we'd take the opportunity to share with you how these two products work together. Both have "dataflow" in their name, and both systems are rooted in the premise of dataflow programming,…
Clearsense: Maximum Healthcare Transformation, Minimal Investment
Clearsense, based in Jacksonville, Florida, develops cloud-based applications based upon Hortonworks 100% open-source Connected Data Platforms. Its customers are hospitals and healthcare systems, and its mission is to save people's lives by giving providers and medical practitioners advanced notice of a patient’s deteriorating health. Clearsense achieves its mission through the open source power of Hortonworks…
Expressway Authorities do Hadoop
Every day, Expressway Authorities must make critical decisions -- oftentimes without sufficiently accurate and transparent data. At the same time, they may be losing revenue due to reporting latency and the inability to respond when toll plaza sensors are down. Hortonworks DataFlow (HDF™) and Hortonworks Data Platform (HDP®) can help resolve these…
Danske Bank, headquartered in Copenhagen, is the largest bank in Denmark. It’s also one of the major retail banks in the northern European region, with over 5 million retail customers. Data is mission critical to Danske Bank as it provides them with actionable intelligence to help minimize risk and maximize opportunities. In our latest video,…
Open Energi on the Business Value of Hortonworks Technical Expertise
Open Energi is a UK clean tech company working with businesses to intelligently optimize their electricity demand and to deliver both revenue and cost savings. The company pioneers in its market in terms of services, real-time analytics, machine learning, and Internet of Things (IoT). Through Hortonworks Data Platform (HDP) and DataFlow (HDF), Open Energi developed…
Be First: How Clearsense Leverages Big Data Analytics in Healthcare
Clearsense, based in Jacksonville, Florida, develops cloud-based applications based upon Hortonworks 100% open-source Connected Data Platforms. Its customers are hospitals and health systems, and its mission is to save people's lives by giving providers and medical practitioners advanced notice of a patient’s deteriorating health. Clearsense’s flagship product, Inception, is “designed specifically for the needs of…
Part 2 of HDF Blog Series: A Shared Schema Registry: What is it and Why is it Important?
In Part 1 of this series, we discussed how data-in-motion solutions require both flow management and stream analytics capabilities. Also, we introduced an exciting new technology that Hortonworks is in the process of releasing that helps users build streaming analytics apps faster and caters to three different personas in the enterprise: app developer, operations teams and the…
TELUS: Bringing Real-Time to the Enterprise with HDF
With the San Jose DataWorks Summit (June 13-15) just one month away, we’re busy finalizing the lineup of an impressive array of speakers and business use cases. This year our Enterprise Adoption Track will feature Cavan Loughran, Solution and Big Data Architect of TELUS, and Oliver Meyn, Solutions Architect of T4G. TELUS is a Canadian national telecommunications company…
Part 1: Hortonworks Thoughts on Building a Successful Streaming Analytics Platform
As part of the product management leadership team at Hortonworks, there is nothing more valuable than talking directly with customers and learning about their successes, challenges, and struggles implementing their big data and analytics use cases with HDP and HDF. These conversations provide more insight than any analyst report, white paper, or market study. In…
Danske Bank, headquartered in Copenhagen, is the largest bank in Denmark. It’s also one of the major retail banks in the northern European region, with over 5 million retail customers. Danske Bank is leveraging Hortonworks for actionable intelligence to help minimize risk and maximize opportunities. Three weeks ago, at the DataWorks Summit in Munich, we announced…
Four Trends in Artificial Intelligence That Affect Enterprises
Andrew Ng, the renowned data scientist, has said that artificial intelligence (AI) needs to be a company-wide strategic decision. Companies that don't strategically invest in AI will slowly lose market share to companies whose core businesses are built around AI. AI enables the prediction, planning and automation of a variety of tasks, and for enterprises,…
Applied Healthcare Informatics: A Healthcare Data Ecosystem Constructed on HDP and Utilizing HDF
This is a guest blog post by Charles Boicey, Chief Innovation Officer at Clearsense. Clearsense was born out of a passion for helping healthcare organizations realize the promise of their data and its ability to help them make better, faster clinical decisions—to meet the challenges of value-based care, drive research, improve patient care, and ultimately…
Apache, Hadoop, Falcon, Atlas, Tez, Sqoop, Flume, Kafka, Pig, Hive, HBase, Accumulo, Storm, Solr, Spark, Ranger, Knox, Ambari, ZooKeeper, Oozie, Metron and the Hadoop elephant and Apache project logos are either registered trademarks or trademarks of the Apache Software Foundation in the United States or other countries.