Artificial Intelligence and Business IT: How to Adopt It?

How can business IT smoothly adopt the change that Artificial Intelligence brings?

No one disputes anymore that Artificial Intelligence will be disruptive for companies, but what will it mean for the enterprise IT architectures that support them?

Here I list six keys to understanding how business IT can adapt to this change.

Business IT and the incorporation of Artificial Intelligence

AI is all about data

New applications will always come along that make us rethink the systems that support them, but AI is especially disruptive. Everyone knew enterprise IT was going to change completely when we first saw ChatGPT last fall. But while we focus heavily on processing, in the end it's all about data.

AI may be a newer application, but its principles are not unknown: the desire to make decisions faster based on the vast amounts of data a company accumulates. However, what companies are building for AI is unlike anything done in the past. The closest thing may be infrastructure for high-performance computing (HPC), Allyson said, but that has rarely been in the domain of enterprise IT and has typically stayed within the confines of academia and research.

Most companies have not even dabbled in HPC. Even for those that have it, it doesn’t usually mix with other workflows; it is treated like a silo and managed like a different beast. If the key promise of AI is to transform workflows across the enterprise by accessing all data, we can learn from HPC solutions, but we can’t copy them.


When you can’t reuse, redesign

Most enterprise infrastructures are not inherently designed for AI, but that’s not the only challenge. Keith Townsend, director of The CTO Advisor, noted that AI infrastructure is not only new, but in many ways it runs counter to most enterprise IT strategies. This is partly because the life cycle of AI applications is more iterative than that of traditional enterprise applications.

The other challenge is that most data centers designed for traditional IT were built around physical and power limitations that AI has upended, with its potentially massive footprint and power consumption. Allyson noted that many older data centers simply were not designed to supply power to these GPU clusters. I saw this when a customer tried to deploy AI workloads in an older data center: power limitations meant they could deploy only two GPU servers per rack, leaving two-thirds of the rack unused.


With data center square footage at a premium and the cost of energy increasing, we need to think carefully about how to address this issue in a sustainable way.

Artificial Intelligence and the “E” in ESG criteria

Many environments were not designed for heavy workloads: a typical rack receives between 5 and 10 kW of power, but that could power just one server in an AI deployment. Workloads like these need between 45 and 100 kW per rack, which creates cooling challenges.
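To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch using the figures above. The per-server wattage is an illustrative assumption, not a vendor specification:

```python
# Rack-power arithmetic for AI deployments (illustrative numbers only).
LEGACY_RACK_KW = 10   # upper end of a typical legacy rack budget (5-10 kW)
AI_RACK_KW = 45       # low end of what AI workloads need (45-100 kW)
GPU_SERVER_KW = 5     # assumed draw of a single GPU server (hypothetical)

def servers_per_rack(rack_kw: float, server_kw: float) -> int:
    """How many GPU servers a rack's power budget can host."""
    return int(rack_kw // server_kw)

legacy_fit = servers_per_rack(LEGACY_RACK_KW, GPU_SERVER_KW)  # -> 2
ai_fit = servers_per_rack(AI_RACK_KW, GPU_SERVER_KW)          # -> 9

# Under these assumptions, a legacy rack tops out at 2 GPU servers,
# consistent with the customer anecdote above.
print(f"Legacy rack: {legacy_fit} GPU servers")
print(f"AI-ready rack: {ai_fit} GPU servers")
```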

Clearly, the “E” in ESG still comes up, and rightly so: geopolitics, climate change, energy constraints, and sustainability goals help make the case for all-flash data centers as the only logical path to follow.


The cloud conundrum: Can AI be outsourced without compromise?

If most data centers are built for general-purpose computing and legacy facilities can't accommodate AI, that leaves companies with few options: make general-purpose architectures scalable and efficient enough for AI, take advantage of the cloud, or both.

The cloud may be great for some use cases, including AI, but we know it is not a panacea. It raises considerations (or drawbacks) around data governance, visibility, and ESG. The cloud can mask the power and cooling figures that businesses need to report, and it creates yet another place for data to reside. Many organizations already struggle to know where their data is, and the cloud can exacerbate that problem. While it remains to be seen how many companies will build entire AI infrastructures instead of leaning on the cloud, change is coming.
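A simple first step against the "where is our data?" problem is a placement inventory. The sketch below, assuming S3-compatible object storage both on-premises and in the cloud, lists buckets per location so placement stays visible; the endpoint URL is hypothetical:

```python
# A minimal data-placement inventory sketch (endpoint names are hypothetical).
import boto3

ENDPOINTS = {
    "on-prem": "https://objects.datacenter.example.com",  # hypothetical S3-compatible endpoint
    "cloud": None,  # None falls back to the default AWS endpoint
}

def inventory_buckets() -> dict:
    """List every bucket per location so data placement stays visible."""
    report = {}
    for location, endpoint in ENDPOINTS.items():
        s3 = boto3.client("s3", endpoint_url=endpoint)
        report[location] = [b["Name"] for b in s3.list_buckets()["Buckets"]]
    return report

if __name__ == "__main__":
    for location, buckets in inventory_buckets().items():
        print(f"{location}: {len(buckets)} buckets -> {buckets}")
```

Even a report this crude makes it harder for data to accumulate in places nobody is accounting for.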


What will the new infrastructures focused on AI look like?

No matter how new systems are designed, the one thing that will always unite them all is the data they consume and share. AI is about data. Storage innovation must come to the fore to enable businesses to take advantage of this technology.

The consensus was that new architectures should be designed not for specialization but for flexibility and disaggregation. They will be less of a vertically integrated silo and more of a set of resources optimized to solve enterprises’ biggest data challenges. That means deploying general-purpose infrastructure, but for a broader set of use cases, such as sophisticated workloads and accelerators.

AI will benefit from the internal sharing of large amounts of data. It will benefit from the ability to feed it from all the different data sets. That's a big push toward using a more general-purpose architecture for AI processing.

A small number of highly scalable platforms can simplify the future of enterprise IT across all workloads: analytics, files, objects, and more. This will allow IT to expand and broaden general-purpose capabilities, making AI less disruptive to enterprise IT than we think.

Pure Storage: A Better Data Storage Platform for AI

With all the different data sets and the need to share them, there has to be consolidation so that all resources in this new composable infrastructure can take advantage of the data. The alternative is excessive waste from a capacity perspective or a data governance and compliance nightmare.

This is where a data storage platform built for AI like Pure Storage FlashBlade//S comes into play. It’s already clear that legacy storage has no place in the next-generation AI-powered data center.

Pure Storage has innovated to make this difficult task easier for the enterprise, creating systems that eliminate legacy complexity and rise to any challenge within an AI data pipeline: from delivering the performance that high-intensity file and object workloads demand to meeting the requirements of large enterprise file and object stores with flash at disk economics.
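To illustrate the consolidation idea, here is a minimal sketch in which an analytics job and a training job read the same dataset from a single S3-compatible object store (FlashBlade supports the S3 protocol alongside file protocols) instead of keeping per-team copies. The endpoint, bucket, and object names are hypothetical:

```python
# Two workloads, one source of truth (names below are hypothetical).
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://flashblade.example.com",  # hypothetical object endpoint
)

BUCKET = "shared-datasets"  # hypothetical single shared bucket

def fetch_dataset(key: str, dest: str) -> None:
    """Any pipeline stage pulls from the same bucket rather than a copy."""
    s3.download_file(BUCKET, key, dest)

fetch_dataset("sales/2023.parquet", "/tmp/analytics-input.parquet")  # analytics job
fetch_dataset("sales/2023.parquet", "/tmp/training-input.parquet")   # model training job
```

One shared store avoids the capacity waste of duplicated datasets and gives governance a single place to enforce policy.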
