Saturday, December 21, 2024

Elevated Infrastructure, Hitachi Vantara Up-Front On Backend Optimization


Backends are unloved, unfairly. From the restaurant kitchen to the retail store warehouse and the concert hall lighting rig, what happens at the backend of any successful operation is often overlooked by those who consume at the frontend. The parallel carries over in fairly robust terms in information technology, i.e. the server room, the IT department, the database division and the software development team are often regarded as functional elements of an organization in the same breath as the water and electricity supply.

But that demotion to utility-level status for any part of the backend isn’t always prudent. In an age where we need to care about power consumption, energy use should be part of the management’s strategic plan for growth, development and profit. Equally, as we have been saying for many years now, the IT function should be seen as a means of gaining operational advantage for firms, rather than something that exists to merely serve and then be consigned to backroom ignominy when not needed.

Hitachi Vantara says it works to champion corporate epiphanies of this kind every day.

Optimize Infrastructure, Then Apps

Pushing to get us thinking IT base layer first and (only then) great apps and user experience second, rather like the Gary Larson “first pants, then shoes” life-truism cartoon, the data storage, infrastructure and hybrid cloud management subsidiary of Hitachi, Ltd. wants to drive home the need for organizations to optimize their IT infrastructure while reducing costs.

According to Octavian Tanase, chief product officer at Hitachi Vantara, modern businesses struggle with the complexity involved in managing and operating traditional datacenters. To compound matters, he reminds us that there’s a growing need to reduce operational overheads, specifically with power, cooling and space requirements, to streamline operations and enhance sustainability efforts. A recent report found that datacenters across the world produce up to 3.7% of global greenhouse gas emissions, while also consuming huge amounts of water for cooling.

Aiming to provide technology solutions that actually work as “solutions” to these realities, Hitachi Vantara has recently announced the development of high-performance, energy-efficient hybrid cloud and database solutions. The AMD Epyc (a brand term often stylized in capitals and used as a play on the word epic) CPU-powered Hitachi Vantara hybrid cloud offerings combine converged and hyperconverged solutions, including the Hitachi Unified Compute Platform.

High-Density & High-Throughput

The company is now focused on using these high-performance processors to help create what it calls “high-density and high-throughput” technologies with 100% data availability guarantees. Although those nods to industry terminology may sound like marketing taglines, they do matter; high-density in this case refers to a server that can pack a large amount of data storage capacity into any given box, while high-throughput is a measure of how many units of application, analytics or other data a system can process in a given measure of time.
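
To make those two metrics concrete, here is a minimal illustrative sketch (not Hitachi Vantara code; the function names and figures are invented for the example) showing how density and throughput might each be expressed for a storage node:

```python
# Illustrative only: hypothetical figures, not Hitachi Vantara specifications.

def storage_density_tb_per_u(raw_capacity_tb: float, rack_units: int) -> float:
    """Density: how much capacity fits into a given physical footprint."""
    return raw_capacity_tb / rack_units

def throughput_gb_per_s(bytes_processed: int, elapsed_seconds: float) -> float:
    """Throughput: how much data the system moves or processes per unit time."""
    return bytes_processed / elapsed_seconds / 1e9

# Example: a hypothetical 2U node holding 1.5 PB that streamed 12 TB in 10 minutes.
print(storage_density_tb_per_u(1500, 2))       # 750 TB per rack unit
print(throughput_gb_per_s(12 * 10**12, 600))   # 20 GB/s sustained
```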

“These solutions offer a suite of key differentiators in the realm of hybrid cloud, databases and high-performance computing, providing high-performance, efficient, and scalable solutions tailored to modern enterprise needs,” said Tanase. “The UCP portfolio also includes the Hitachi UCP for Azure Stack HCI, which helps deliver a consistent hybrid cloud infrastructure across edge, core and public clouds. Hitachi Vantara helps businesses simplify hybrid cloud deployments with a single source of systems, solutions and services that streamline operations while reducing multi-vendor logistics. Organizations benefit from customizable cloud assessment, advisory, and cloud migration services to plan and execute the multi-cloud journey on a schedule that meets their business objectives.”

In terms of deployment scope, the hybrid computing solutions here are claimed to excel at improving processing times for traditional and modern workloads, including general-purpose workloads, databases, high-performance computing and computer-aided engineering. The technology here features modern designs and smaller footprints, which the company says is key to contributing to reduced carbon dioxide emissions and increased energy efficiency. One financial services provider has reported that Hitachi Vantara’s storage can reduce CO2 emissions by up to 96% and datacenter storage footprint by as much as 35%.

Environmental, Social, Governance

The company’s recent sustainability report underlines how it says using infrastructure optimization and management functions can help organizations with their environmental, social and governance principles. The report outlines the company’s past and present initiatives, as it also reflects the collective effort of Hitachi Vantara’s global workforce, emphasizing collaboration and cross-functionality in advancing sustainable business practices. It aligns with internationally recognized accounting standards and the United Nations Sustainable Development Goals.

“Hitachi Vantara is committed to sustainable product design, end-of-life management, and minimizing waste, aligning with circular economy principles,” said Sheila Rohra, CEO of Hitachi Vantara. “[Our] Virtual Storage Platform One data platform products are Energy Star certified, with the automated switching process reducing power consumption and contributing to lowering CO2 emissions by 30-40% from model to model. This certification signifies a commitment to environmental responsibility alongside substantial energy savings for consumers and businesses.”

The company both generates and mitigates energy use across its global sites. Its state-of-the-art Netherlands distribution center’s on-site solar panels produce roughly one-third of its annual electricity consumption, with the remainder covered renewably through verified Energy Attribute Certificates.

Will Infrastructure Epiphanies Happen?

Is this the point when unloved technology infrastructure finally gets a share of the love that is so often directed towards the “cool new app” at the upper user interface levels and, of course, the now-ubiquitous spread of artificial intelligence?

The answer to questions of that type is usually maybe, a little, or no… but in this case the answer is increasingly likely to be yes.

What may matter most of all in this space is just how quickly software engineers start to appreciate the need for infrastructure optimization and control. After all, if a programmer got into coding because they wanted to make great apps, create the next Twitter and change the world, why would they willingly take a step back and embrace the need to take more interest in the lower, backend substrate levels of the IT stack?

Sasan Moaveni, global business lead for AI & high-performance data platforms at Hitachi Vantara, says that this progression point is something that comes up with users all the time.

“What we find is that there are (in general) two groups of customers. The first group are organizations that know what they want to do with AI implementation (many of whom are public sector government departments in fact) and the second group are at a more exploratory stage where the board has agreed it needs AI, but needs more information,” said Moaveni. “Our AI discovery process is a three-week engagement program designed to analyze an organization’s core needs and identify the business value that they might derive from AI. This means first examining whether the firm has access to the right data to feed the AI engine they need to fuel – it also looks to see whether they are keeping six redundant copies of duplicated data that are costing them money, increasing their carbon emissions and slowing them down. This whole procedure is a key part of the realization process that leads engineers to the infrastructure optimization epiphany that they need to have… if indeed it doesn’t just happen naturally.”
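
As a rough illustration of the kind of arithmetic that discovery exercise implies (the function, rates and figures below are hypothetical placeholders, not part of Hitachi Vantara’s methodology), the overhead of holding redundant copies can be sketched like this:

```python
# Hypothetical sketch: all rates are invented placeholders, not vendor figures.

def redundant_copy_overhead(dataset_tb: float,
                            copies: int,
                            cost_per_tb_month: float = 20.0,     # assumed $/TB/month
                            kg_co2_per_tb_month: float = 5.0):   # assumed kg CO2e/TB/month
    """Estimate monthly cost and emissions attributable to copies beyond the first."""
    redundant_tb = dataset_tb * max(copies - 1, 0)
    return {
        "redundant_tb": redundant_tb,
        "monthly_cost_usd": redundant_tb * cost_per_tb_month,
        "monthly_kg_co2e": redundant_tb * kg_co2_per_tb_month,
    }

# Example: a 40 TB dataset kept in six copies rather than one.
print(redundant_copy_overhead(40, 6))
# {'redundant_tb': 200, 'monthly_cost_usd': 4000.0, 'monthly_kg_co2e': 1000.0}
```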

At a practical level, the “efficiency DNA” cultured here can be infectious. Moaveni and team suggest that developers themselves can start to look for optimization channels they can apply further up the IT stack, right through to the number of lines of code used for any software function. Equally, it also filters back downwards to aspects of IT operations as granular as the power transformers used in a datacenter. These unloved blocks of metal can now be used to switch between clean energy sources such as solar, wind and nuclear, or “dirty” non-renewable energy when no clean power is available. With a product stable spanning everything from trains to MRI scanners to computers, the company insists it has the scope to provide a special view into industry-specific use cases for AI in a multiplicity of deployment scenarios.
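
The transformer-switching idea can be pictured with a small, purely illustrative selection routine (the source names, priority order and availability flags below are assumptions for the sketch, not a description of any real control system):

```python
# Purely illustrative: not real datacenter power-control logic.

CLEAN_SOURCES = ["solar", "wind", "nuclear"]   # preferred clean sources, in priority order
FALLBACK_SOURCE = "grid_nonrenewable"          # used only when no clean supply is available

def select_power_source(available: dict[str, bool]) -> str:
    """Pick the first available clean source; fall back to non-renewable power otherwise."""
    for source in CLEAN_SOURCES:
        if available.get(source, False):
            return source
    return FALLBACK_SOURCE

# Example: at night with no wind, the routine falls back to nuclear if it is online.
print(select_power_source({"solar": False, "wind": False, "nuclear": True}))  # nuclear
```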

Hitachi Vantara’s portfolio was rated as extremely positive by analyst firm GigaOm in 2024 (specifically for its ability to manage unstructured data, but the wider point holds) in features including metadata analytics, global content search, big data analytics, data governance and compliance, access control, workload orchestration and data protection. It is these backend, backroom and back-office technologies that (arguably) matter even more now that we work in the modern age of cloud and always-on data and application pipes that feed us from the rear.

Does that mean that data storage and infrastructure management just got sexy and fun? Steady on now, let’s not be silly.
