Providers of existing artificial intelligence (AI) platforms are now moving to extend them in a way that promises to reduce the amount of time and effort needed to embrace generative AI.
DataRobot, for example, today committed to integrating customizable large language models (LLMs) into its existing AI platform to make it simpler for organizations to safely use their data within the context of a generative AI application. Organizations will be able to take advantage of vector databases and other tools provided by DataRobot to expose their data in a way that provides results to natural language queries, without that data having to be added to an LLM that might have been created by an external entity, says Jay Schuren, chief customer officer for DataRobot.
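The pattern Schuren describes, often called retrieval-augmented generation, keeps an organization's documents in its own vector store and passes only the retrieved snippets to an LLM as prompt context. The following is a minimal illustrative sketch of that flow, not DataRobot's actual API; the toy bag-of-words "embedding" stands in for a real embedding model, and all class and function names here are hypothetical:

```python
# Illustrative sketch of retrieval-augmented generation: proprietary
# documents stay in a local vector store, and only retrieved context
# (not the whole corpus) is placed into the prompt sent to an LLM.
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a production system would use a
    # trained embedding model instead of word counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorStore:
    def __init__(self):
        self.docs = []  # list of (embedding, original text) pairs

    def add(self, text: str) -> None:
        self.docs.append((embed(text), text))

    def search(self, query: str, k: int = 1) -> list:
        # Rank stored documents by similarity to the query embedding.
        q = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[0]), reverse=True)
        return [text for _, text in ranked[:k]]

def build_prompt(query: str, store: VectorStore) -> str:
    # Only the retrieved snippets leave the organization's boundary;
    # the prompt would then be sent to the external LLM.
    context = "\n".join(store.search(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

In practice the vector store, embedding model and prompt assembly would be handled by platform tooling, but the division of labor is the same: retrieval happens against the organization's own data, and the external model sees only the assembled prompt.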
In addition, DataRobot is providing access to a bot and other tools that make it simpler to build these applications, as well as to observe them once they are deployed.
Finally, the company is also providing a range of training and professional services to help organizations better identify use cases for applying AI at a time when that expertise is hard to find.
The goal is to make it simpler for organizations to use generative AI to build their own chatbots across a range of applications, using LLMs they customize with tools provided by DataRobot, says Schuren.
In general, the financial services industry will most likely be on the leading edge of adoption of generative AI. There will no doubt be regulatory hurdles that will need to be addressed, but there are also many internal workflows that generative AI can be applied to across the financial services sector, notes Schuren.
Those firms will be able to, for example, use generative AI to provide summaries of financial filings that are easier to comprehend. “No one really wants to read a 10K,” says Schuren.
There’s obviously a lot of hype surrounding generative AI today, but in time it will prove warranted as the technology is pervasively applied across innumerable workflows, adds Schuren. The issue organizations are now contending with is the degree to which investment in generative AI will simply enable them to remain competitive, versus providing a sustainable competitive advantage. There is clearly a raft of workflows that just about every organization will automate using AI; the challenge will be defining a workflow, using data no one else has, that provides a unique capability.
The quickest path to achieving that goal may be to rely on a platform to automate most routine tasks while experimenting with LLMs directly to see what the art of the possible might be.
In the meantime, the race to automate processes using AI is clearly on. It’s already clear that organizations that don’t embrace AI, despite a raft of security, privacy and accuracy concerns, will one day soon find themselves irrelevant. That doesn’t mean organizations should ignore those concerns, but the only way to effectively surmount them is to gain some hands-on experience.