Enterprise2.us - Harnessing the Power of Us


Hitachi Data Systems has sharpened its focus on "Big Data Clouds" with today's announcement of the latest version of its Hitachi Content Platform and a new product called the Hitachi Data Ingestor.  Together, they can be used by cloud service providers and distributed IT organizations to make it easier, cheaper, and safer to put large amounts of data, especially unstructured content, into the cloud.

The Hitachi Content Platform is one big honkin' object store that can hold as much as 40 petabytes (one petabyte is 1,024 terabytes) of both structured and unstructured data per physical cluster, presented to both users and applications in a single, unified object view.   The HCP employs what the company calls "intelligent objects" that can "manage themselves given certain conditions" and eliminate the need for a "master control program."
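The article doesn't detail HCP's internals, but the idea of a single, unified object view over one large pool can be sketched in Python. The class and method names below are illustrative assumptions, not HCP's actual API:

```python
# Illustrative sketch of a flat object store with a single, unified
# namespace: every object is addressed by one path-like key, and system
# metadata travels with the data. Names are hypothetical, not HCP's API.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StoredObject:
    data: bytes
    content_type: str = "application/octet-stream"
    ingested_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class ObjectStore:
    def __init__(self):
        self._objects = {}          # key -> StoredObject

    def put(self, key: str, data: bytes, **meta) -> None:
        self._objects[key] = StoredObject(data, **meta)

    def get(self, key: str) -> StoredObject:
        return self._objects[key]

    def list(self, prefix: str = ""):
        # One unified view: users and applications see the same keys.
        return sorted(k for k in self._objects if k.startswith(prefix))

store = ObjectStore()
store.put("tenantA/invoices/2011-09.pdf", b"%PDF...",
          content_type="application/pdf")
store.put("tenantA/logs/app.log", b"started")
print(store.list("tenantA/"))
# ['tenantA/invoices/2011-09.pdf', 'tenantA/logs/app.log']
```

The point of the sketch is the addressing model: users and applications share one key space rather than separate file-system silos.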

It includes a number of features for cloud service providers and IT services organizations, including fine-grained multitenancy support, quota and charge-back tools, compression, and version control.  It also offers a number of high-end data management facilities, including WORM support, HDI-HCP encryption, advanced replication, continuous integrity checking, RAID 6 support, and more.  The HCP is also tightly integrated with Symantec's EnterpriseVault archiving software.
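WORM (write once, read many) support means that once an object is committed under a retention policy, it cannot be overwritten or deleted until its retention period expires. A minimal Python sketch of that rule follows; the retention model and names are assumptions for illustration, not HCP's implementation:

```python
# Minimal WORM (write-once-read-many) retention sketch. Once an object
# is written with a retention deadline, overwrites and deletes are
# rejected until the deadline passes. Hypothetical model, not HCP's.
from datetime import datetime, timedelta, timezone

class RetentionError(Exception):
    pass

class WormStore:
    def __init__(self):
        self._data = {}       # key -> (bytes, retain_until)

    def put(self, key: str, data: bytes, retain_for: timedelta) -> None:
        if key in self._data:
            raise RetentionError(f"{key} already written (WORM)")
        until = datetime.now(timezone.utc) + retain_for
        self._data[key] = (data, until)

    def delete(self, key: str) -> None:
        _, until = self._data[key]
        if datetime.now(timezone.utc) < until:
            raise RetentionError(f"{key} retained until {until:%Y-%m-%d}")
        del self._data[key]

store = WormStore()
store.put("records/audit-2011.log", b"...", retain_for=timedelta(days=2555))
try:
    store.delete("records/audit-2011.log")
except RetentionError as e:
    print("blocked:", e)
```

Continuous integrity checking would sit on top of the same structure, periodically re-hashing stored bytes against a checksum recorded at ingest time.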

The Hitachi Data Ingestor is a sort of distributed front end for the HCP that the company describes like this:

"HDI acts as an on-premise intelligent storage cache for distributed sites to adapt users and applications to a cloud-based or centralized Hitachi Content Platform. The new on ramp can reside at remote or branch offices or public cloud locations, to provide bottomless, backup-free storage by connecting to the multitenant, multipurpose HCP."

The HDI provides read/write CIFS and NFS access to local and remote HCP systems, enables secure multitenant access to segments of a single storage pool, provides local quota and chargeback accounting, and supports both Active Directory and LDAP authentication.
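Quota and chargeback accounting of the kind described here boils down to metering each tenant's consumption against its quota and pricing the usage. A toy Python illustration follows; the rate and quota-enforcement policy are invented for the example and are not Hitachi's pricing or behavior:

```python
# Toy per-tenant quota and chargeback accounting. The rate and the
# quota-enforcement policy are invented for illustration only.
class TenantAccount:
    def __init__(self, quota_gb: float, rate_per_gb: float):
        self.quota_gb = quota_gb
        self.rate_per_gb = rate_per_gb
        self.used_gb = 0.0

    def ingest(self, size_gb: float) -> None:
        # Enforce the tenant's quota before accepting new data.
        if self.used_gb + size_gb > self.quota_gb:
            raise ValueError("quota exceeded")
        self.used_gb += size_gb

    def monthly_charge(self) -> float:
        # Simple metered model: pay only for what is stored.
        return round(self.used_gb * self.rate_per_gb, 2)

branch = TenantAccount(quota_gb=500, rate_per_gb=0.12)
branch.ingest(120)
branch.ingest(80)
print(branch.monthly_charge())   # 24.0
```

In a multitenant pool like the one HDI fronts, each branch office would map to an account like this, giving central IT both a hard ceiling per tenant and a usage figure to bill against.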

The two solutions are being aimed at several different targets, including medium to large enterprises with branch office locations and/or distributed IT facilities, organizations looking to consolidate data assets in a centralized private cloud, and telcos, service providers, and systems integrators who want to build integrated infrastructure for hybrid and public cloud services.

Hitachi Data Systems asserts that the new offerings will:
  • Simplify IT by halting the proliferation of storage silos
  • Reduce cost by eliminating tape-based backup and improving data utilization
  • Lower risks with improved compliance, governance, and data location control
  • Speed cloud adoption by transparently supporting existing applications and policies

“Arming customers with solutions that are standards-based and easily plug into their existing environment and helps them to realize the operational benefits of cloud storage infrastructure without a major infrastructure overhaul continues to be one of the best ways to encourage cloud adoption. The new Hitachi Data Ingestor offers a low-risk solution for customers that would like to shift their data to the cloud,” said Terri McClure, senior analyst, Enterprise Strategy Group. “Moreover, as unstructured data continues to grow rapidly, the enhancements to the new Hitachi Content Platform will help customers store, access and view that data more efficiently and easily.”

The new Hitachi Content Platform and Hitachi Data Ingestor are now available worldwide.

More Stories By Tim Negris

Tim Negris is SVP, Marketing & Sales at Yottamine Analytics, a pioneering Big Data machine learning software company. He occasionally authors software industry news analysis and insights on Ulitzer.com and is a 25-year technology industry veteran with expertise in software development, databases, networking, social media, cloud computing, mobile apps, analytics, and other enabling technologies.

He is recognized for his ability to rapidly translate complex technical information and concepts into compelling, actionable knowledge. He is also widely credited with coining the term and co-developing the concept of the “Thin Client” computing model while working for Larry Ellison in the early days of Oracle.

Tim has also held a variety of executive and consulting roles in numerous start-ups and several established companies, including Sybase, Oracle, HP, Dell, and IBM. He is a frequent contributor to a number of publications and sites, focusing on technologies and their applications, and has written a number of advanced software applications for social media, video streaming, and music education.
