We know that artificial intelligence (AI) is not new. Companies have been employing AI for the better part of a decade, applying it to tasks ranging from identifying malware to recommending movies on streaming sites.
What has changed in the last year are the demonstrations showing just how far AI has come and the seemingly limitless possibilities it holds. Some of that progress is due to innovations in AI technology itself, such as the large language models (LLMs) that power mainstream generative AI applications, and some to advancements in supporting technology, like the increased availability of high-powered computing through the cloud and immense stores of data. All these factors have drawn renewed and broad attention from business leaders in the last year as they began asking how AI could benefit their businesses.
2024 is the year generative AI graduates from science project to business-critical use case, helping organizations that put their data to work succeed and leaving behind those that don't. A study from the Boston Consulting Group showed that 30% of the businesses it labeled "data champions" are expected to increase revenue by more than 10% by the end of 2024, while less than half as many (13%) of the companies it deemed "data laggards" are expected to do the same.
The business case for leveraging data for AI and analytics is clear: it can help businesses drive operational insights and automate processes. NetApp's 2023 Data Complexity Report found that 72% of companies are already using generative AI in some way. However, many of those generative AI projects are still in the proof-of-concept stage. Companies are still building the basic elements of this technology, such as assembling data sets and learning what AI models can help them do.
To help organizations turn these early demos into fully functional AI and analytics applications that support the business in the year to come, here are three key considerations for every organization moving along its AI journey.
- Stay Flexible to Manage AI Evolution Effectively
As part of this AI renaissance, most tech vendors are offering their own portfolios to support AI training and inferencing. For example, each of the three largest cloud providers offers a range of solutions, from AI-powered services to infrastructure for building your own AI applications from scratch. And that's not even accounting for smaller SaaS vendors offering their own AI services or niche infrastructure solutions. Each vendor has its own requirements for integration and compatibility that IT teams will need to manage in their environments to use these services to their full potential.
At this point, there is no clear winner in the generative AI space and new technologies and solutions will continue cropping up. While the industry is changing so rapidly, it’s hard to predict which AI solutions will be useful and succeed even a few years into the future. Therefore, IT leaders tasked with integrating AI into their businesses need to build flexibility into their operating environments so they can keep up with, and adopt, the latest innovations as they become available.
Building a strong intelligent data infrastructure foundation can go a long way toward ensuring success. By establishing an intelligent data infrastructure that is natively compatible with on-premises systems and the major clouds, organizations will be better positioned to manage their data and leverage it for AI wherever that data lives, now or in the future. In short, they will be able to adapt to the market and take advantage of new innovations without expensive and time-consuming overhauls of their IT environments.
- Think Data Oceans, Not Lakes
With an intelligent data infrastructure in place, organizations need to focus on getting as much value as they can out of their data. An important first step is to remove data silos. Unifying data across every type, workload and environment can help IT environments operate more efficiently. By ensuring all their data is easily accessible, IT teams gain more visibility into how much storage they need. Teams collaborating on AI models and applications can move faster as bottlenecks are removed.
Further, combining different types of data, such as customer, financial, employee or operational data, can help AI or analytics systems surface connections about the business we wouldn't normally consider.
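As a hypothetical sketch of that idea (the customer IDs, field names, and figures below are invented for illustration), joining two previously siloed datasets on a shared key is often all it takes to surface such a connection:

```python
# Operational data (support tickets) and financial data (annual spend),
# previously held in separate silos, keyed by the same customer ID.
support_tickets = {"c1": 7, "c2": 0, "c3": 5}
annual_spend = {"c1": 1200, "c2": 9800, "c3": 1500}

# Join the two silos on the IDs they share.
combined = [
    {"customer": cid, "tickets": support_tickets[cid], "spend": annual_spend[cid]}
    for cid in support_tickets.keys() & annual_spend.keys()
]

# Once unified, a simple comparison can reveal, for example, that
# high-ticket customers spend less -- a signal invisible in either silo alone.
high_ticket_spend = [r["spend"] for r in combined if r["tickets"] > 3]
low_ticket_spend = [r["spend"] for r in combined if r["tickets"] <= 3]
```

In practice this join happens in a warehouse or analytics engine rather than in application code, but the principle is the same: the insight only exists once the silos are connected.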
- Take Only What You Need
Many projects are still in the proof-of-concept stage while organizations feel out the best ways to make AI work for their businesses. As generative AI experiments are spun up, it's tempting to build or use the largest models, with as much data and as many parameters as possible, to account for every possibility.
The problem is that the kinds of LLMs that made headlines in the last year take huge amounts of cloud compute resources to run, let alone build in the first place. And the benefits don’t always match up with those costs. Having more parameters does not always generate more accurate results, and in fact, it can make it harder for the model to interpret any new data it encounters.
Instead, organizations that want to use AI more efficiently should take the time to distill larger AI models. In this process, a large "teacher" model trains a smaller "student" model to reproduce its outputs, so the student keeps only the capacity it needs to generate the right results. The outcome is a faster, less compute-intensive model that produces results of comparable quality.
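The core of distillation can be sketched in a few lines: the student is trained to match the teacher's temperature-softened output distribution, typically by minimizing the KL divergence between the two. This is a minimal, framework-free illustration (real distillation pipelines run a full training loop in a framework such as PyTorch; the logits below are invented):

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to a probability distribution, softened by temperature."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the teacher's softened distribution and the student's.

    A higher temperature exposes the teacher's relative preferences among
    classes ("dark knowledge"), which is what the student learns to mimic.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Toy logits for one input: a student whose outputs track the teacher's
# ranking incurs a small loss; a mismatched student incurs a large one.
teacher = [4.0, 1.5, 0.2]
aligned_student = [3.8, 1.7, 0.1]
mismatched_student = [0.1, 1.7, 3.8]
```

During training, the optimizer would lower this loss across the whole dataset, pulling the smaller model's behavior toward the larger one's at a fraction of the inference cost.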
All of this results in a very different set of requirements for an organization's data environment. Rather than focusing on raw speed, organizations now need to optimally manage the flow of data to these models for inferencing, and back from them to support continuous fine-tuning. Because these models may not be co-located with the data fueling their runtime, spanning on-premises and cloud environments, organizations should seek out intelligent data infrastructures to simplify what could otherwise be a daunting task.
In Summary
This year, companies need to ensure their organization’s data infrastructure can support widespread AI adoption. Entering this new era of tech with intention and care is critical to providing organizations with the tools they need to claim a competitive advantage and generate a return on investment.
With expectations rising from the lines of business they serve, and pressure to keep pace with competitors already adopting and reaping the benefits of AI, IT organizations must quickly find and embrace the right data infrastructure partner to simplify, accelerate and de-risk this journey.