
Salesforce today revealed that its Einstein 1 Platform for applying artificial intelligence to processes can now consume unstructured data, enabling users to generate content without having to create prompts.
Announced at a Salesforce+ World Tour event in New York, Einstein 1 is now integrated with a Data Cloud Vector Database that makes it possible to use unstructured data to, for example, automatically generate a sales contract based on the documents and emails loaded into it.
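Salesforce has not detailed the underlying mechanics, but the pattern it describes is retrieval-augmented generation: unstructured documents are converted into embeddings, stored in a vector database, and the most relevant passages are retrieved and handed to an LLM alongside the user's request. The sketch below is a minimal, generic illustration of that pattern in Python; the embed_text helper, VectorStore class and build_contract_prompt function are hypothetical stand-ins, not Salesforce APIs.

```python
import hashlib
import numpy as np

def embed_text(text: str, dim: int = 256) -> np.ndarray:
    """Toy stand-in for an embedding model: a hashed bag-of-words vector."""
    vec = np.zeros(dim)
    for word in text.lower().split():
        vec[int(hashlib.md5(word.encode()).hexdigest(), 16) % dim] += 1.0
    return vec

class VectorStore:
    """Minimal in-memory vector database: store embeddings, return nearest documents."""
    def __init__(self) -> None:
        self.docs: list[str] = []
        self.vectors: list[np.ndarray] = []

    def add(self, doc: str) -> None:
        self.docs.append(doc)
        self.vectors.append(embed_text(doc))

    def search(self, query: str, k: int = 3) -> list[str]:
        # Rank stored documents by cosine similarity to the query.
        q = embed_text(query)
        scores = [float(np.dot(q, v) / ((np.linalg.norm(q) * np.linalg.norm(v)) or 1.0))
                  for v in self.vectors]
        top = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)[:k]
        return [self.docs[i] for i in top]

def build_contract_prompt(store: VectorStore, request: str) -> str:
    """Ground the generation request in the retrieved emails and documents."""
    context = "\n---\n".join(store.search(request))
    return (f"Using only the context below, draft a sales contract.\n"
            f"Context:\n{context}\n\nRequest: {request}")
```

The resulting prompt string would then be passed to whatever LLM an organization uses; the retrieved context is what lets the model draft against the organization's own documents rather than its training data alone.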
In addition, Salesforce has added Einstein Copilot Search, a natural language interface for launching queries against any of the large language models (LLMs) it provides access to via its software-as-a-service (SaaS) platform. Einstein Copilot Search also provides citations to source material using the Einstein Trust Layer that Salesforce has embedded in its platform to validate generative AI output.
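Salesforce has not published how the Einstein Trust Layer attaches citations, but the general technique is to tag each retrieved passage with an identifier, instruct the model to reference those identifiers in its answer and return the mapping alongside the generated text. A minimal, hypothetical sketch of that idea, reusing the VectorStore from the previous example:

```python
def answer_with_citations(store: VectorStore, question: str, k: int = 3) -> tuple[str, dict[int, str]]:
    """Build a prompt that asks the model to cite numbered sources, and return
    the citation map so each [n] marker in the answer can be traced to a document."""
    passages = store.search(question, k)
    sources = {i + 1: doc for i, doc in enumerate(passages)}
    numbered = "\n".join(f"[{i}] {doc}" for i, doc in sources.items())
    prompt = (f"Answer the question using only the numbered sources below, "
              f"citing them as [n].\nSources:\n{numbered}\n\nQuestion: {question}")
    return prompt, sources
```

Returning the source map alongside the answer is what allows a claim in the generated output to be traced back to the document it came from, which is the behavior Einstein Copilot Search is described as providing.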
Due out in February 2024, these capabilities reduce the need for end users to create prompts to fine-tune content generated by an LLM. Instead, the Einstein platform will, for example, use the data entered in the search interface to generate a document. At the same time, organizations will be able to expose that search interface to customers so they can self-service support requests.
That capability doesn’t eliminate the need for prompt engineering specialists who link prompts together to automate repeatable processes, but it does make generative AI more accessible to end users who are not well-versed in how to use prompts to fine-tune the output of an LLM, says Sanjna Parulekar, vice president of product marketing for Salesforce. Rather than having end users recreate prompts, many companies are already starting to build libraries of prompts to automate various tasks, she adds.
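A prompt library in this sense is simply a set of reusable, parameterized templates, optionally chained so the output of one step feeds the next. A minimal, hypothetical sketch of the idea; the template names and the run_llm callable are illustrative, not Salesforce's:

```python
from typing import Callable

# Reusable, parameterized prompt templates maintained centrally
# instead of being retyped by each end user.
PROMPT_LIBRARY = {
    "summarize_case": "Summarize the following support case in three bullets:\n{case_text}",
    "draft_reply": "Using this summary, draft a polite reply to the customer:\n{summary}",
}

def run_chain(run_llm: Callable[[str], str], case_text: str) -> str:
    """Chain two library prompts: summarize the case, then draft a reply from the summary."""
    summary = run_llm(PROMPT_LIBRARY["summarize_case"].format(case_text=case_text))
    return run_llm(PROMPT_LIBRARY["draft_reply"].format(summary=summary))
```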
The most immediate challenge organizations will face in the meantime is organizing their data in a way that delivers the most business value from AI, notes Parulekar. “Organizations that have their data house in order will have an advantage,” she says.
Unfortunately, far too many organizations have not yet implemented data management best practices, but the rise of generative AI will force the issue. Organizations that rely on a platform will achieve that goal faster than those that, for example, decide to deploy their own vector database to access an LLM they built and must then maintain, notes Parulekar.
In fact, most organizations will find that using their data to extend existing LLMs will address most of their requirements, she adds.
Organizations will, of course, need to put guardrails in place to make sure the content generated by those LLMs is accurate, but it’s already becoming apparent that organizations that don’t make these types of capabilities available will soon be viewed as antiquated. Generative AI may provide some level of competitive advantage, but it’s clear that employees and customers who are not given AI-enhanced tools to perform tasks will soon find somewhere else to work and engage.