HDS unveils next-generation HSP for big data

Hitachi Data Systems Corporation (HDS) today unveiled the next-generation Hitachi Hyper Scale-Out Platform (HSP), which now offers native integration with the Pentaho Enterprise Platform to deliver a sophisticated, software-defined, hyper-converged platform for big data deployments. Combining compute, storage and virtualization capabilities, the HSP 400 series delivers seamless infrastructure to support big data blending, embedded business analytics and simplified data management.

Modern enterprises increasingly need to derive value from massive volumes of data generated by information technology (IT), operational technology (OT), the Internet of Things (IoT) and machine sources in their environments. HSP offers a software-defined architecture to centralize and support easy storing and processing of these large datasets with high availability, simplified management and a pay-as-you-grow model. Delivered as a fully configured, turnkey appliance, HSP takes hours instead of months to install and support production workloads, and simplifies creation of an elastic data lake that helps customers easily integrate disparate datasets and run advanced analytic workloads.

HSP’s scale-out architecture provides simplified, scalable and enterprise-ready infrastructure for big data. The architecture also includes a centralized, easy-to-use user interface to automate the deployment and management of virtualized environments for leading open source big data frameworks, including Apache Hadoop, Apache Spark, and commercial open source stacks like the Hortonworks Data Platform (HDP).
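To make the kind of workload concrete, the sketch below shows a generic PySpark job of the sort such a data lake is meant to host: blending an IT dataset (application events) with an OT dataset (machine telemetry) and aggregating the result. This is purely illustrative; the file paths, dataset names and columns are hypothetical, and nothing here is an HSP- or Pentaho-specific API.

```python
# Illustrative only: a generic Spark "data blending" job, not an HSP/Pentaho API.
# Paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("it-ot-blend-example")
         .getOrCreate())

# Two disparate datasets landed in a shared data lake (hypothetical HDFS paths).
it_events = spark.read.parquet("hdfs:///datalake/it/app_events")        # e.g. application logs
ot_sensors = spark.read.csv("hdfs:///datalake/ot/sensor_readings.csv",
                            header=True, inferSchema=True)              # e.g. machine telemetry

# Blend IT and OT data on a shared asset identifier and aggregate per asset.
blended = (it_events.join(ot_sensors, on="asset_id", how="inner")
           .groupBy("asset_id")
           .agg(F.count("event_id").alias("event_count"),
                F.avg("temperature").alias("avg_temperature")))

# Write the curated result back to the lake for downstream analytics or BI tools.
blended.write.mode("overwrite").parquet("hdfs:///datalake/curated/asset_summary")

spark.stop()
```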

“Many enterprises don’t possess the internal expertise to perform big data analytics at scale with complex data sources in production environments. Most want to avoid the pitfalls of experimentation with still-nascent technologies, seeking a clear path to deriving real value from their data without the risk and complexity,” said Nik Rouda, Senior Analyst at Enterprise Strategy Group (ESG). “Enterprise customers stand to benefit from turnkey systems like the Hitachi Hyper Scale-Out Platform, which address primary adoption barriers to big data deployments by delivering faster time to insight and value, accelerating the path to digital transformation.”

“Modern enterprises must merge their IT and OT environments to extend the value of their investments. HSP is a perfect solution to accelerate and simplify IT/OT integration and improve the time to insight and business value of their big data deployments,” said James Dixon, chief technology officer at Pentaho. “The HSP-Pentaho appliance gives customers an affordable, enterprise-class option to unify all their disparate datasets and workloads—including legacy applications and data warehouses—via a modern, scalable and hyper-converged platform that eliminates complexity. We’re pleased to be working with HDS to deliver a simplified, all-in-the-box solution that combines compute, analytics and data management functions in a plug-and-play, future-ready architecture. The Hitachi Hyper Scale-Out Platform 400 is a great first step in simplifying the entire analytic process.”

With HSP, Hitachi continues to deliver on the promise of the software-defined data center to simplify the delivery of IT services through greater abstraction of infrastructure, and improved data access and automation. While its initial focus is on big data analytics use cases, the company’s long-term direction for HSP is to deliver best-in-class total cost of ownership (TCO) for a variety of IT workloads. Hitachi will offer HSP in two configurations to support a broad range of enterprise applications and performance requirements: Serial Attached SCSI (SAS) disk drives, generally available now, and all-flash, expected to ship in mid-2016.

“We consistently hear from our enterprise customers that data silos and complexity are major pain points—and this only gets worse in their scale-out and big data deployments. We have solved these problems for our customers for years, but we are now applying that expertise in a new architecture with Hitachi Hyper Scale-Out Platform,” said Sean Moser, senior vice president, global portfolio and product management at Hitachi Data Systems. “Our HSP appliance gives them a cloud- and IoT-ready infrastructure for big data deployments, and a pay-as-you-go model that scales with business growth. Seamless integration with the Pentaho Platform will help them put their IT and OT data to work—faster. This is only the first of many synergistic solutions you can expect to see from Hitachi and Pentaho. Together, we are making it easy for our enterprise customers to maximize the value of their IT and OT investments and accelerate their path to digital transformation.”
