ServiceNow expanded its efforts to add generative artificial intelligence (AI) with case summarization and text-to-code capabilities driven by large language models (LLMs) it created.

Previously, the company announced it had developed an AI Controller through which it would enable generative AI capabilities using a mix of LLMs. However, in collaboration with Hugging Face, a provider of a hub for collaboratively building AI models based on open source technologies, the company is moving to rely more on proprietary ServiceNow LLMs that are specific to domains such as customer service and IT service management (ITSM). In effect, ServiceNow is using a mix of open source and proprietary technologies to infuse AI across all the workflows its platform enables.

Those capabilities will be delivered via a premium service that ServiceNow will make available as part of the Vancouver release of the Now platform, due out in September.

In addition, ServiceNow plans to allow organizations to bring their own LLMs to the Now platform, if they so choose, notes Jon Sigler, senior vice president of the Now platform at ServiceNow. In fact, as AI continues to evolve, it will become more multimodal as different LLMs are employed to optimize specific processes. The difference is that most of those LLMs will be trained using less data to create AI models that are more accurate than a general-purpose AI model such as ChatGPT.

Regardless of the LLM employed, the result should be an ability to manage a wide range of workflows at unprecedented levels of scale, he adds. “We should see major productivity gains,” says Sigler.

KPMG and Accenture, for example, are both extending their relationships with ServiceNow in anticipation of helping organizations re-engineer those business processes. KPMG will be applying AI in collaboration with ServiceNow to finance, supply chain and procurement processes. Accenture, in collaboration with ServiceNow and NVIDIA, launched an AI Lighthouse initiative to design, develop and implement new generative AI use cases.

It’s still early in terms of how AI will be incorporated into workflows, but declining productivity gains have been a cause for concern for several years now. AI creates an opportunity to re-engineer workflows in ways that are not only more efficient but that also reduce the level of monotonous toil that far too many humans experience daily at work.

Customer service agents, for example, should have more time to allocate to issues that they might otherwise not be motivated to fully resolve, simply because they are measured on time to resolution rather than the actual quality of the customer experience, notes Sigler.

Of course, there undoubtedly will be a significant amount of disruption as workflows become more infused with AI. Roles and functions within organizations are going to change. The issue is not so much whether AI will replace humans as it is determining where humans will add the most value to automated workflows. After all, even in the most optimal use cases, machines are not going to be infallible, so humans will still need to remain in the loop to ensure guardrails are followed.