Stigg, a provider of a monetization platform widely used by software vendors, today revealed it has extended its capabilities to include credits that can be applied in real time as infrastructure resources are consumed.

The Stigg Credit Suite extension is intended to give application publishers an alternative approach to offering access to AI resources: a set of credits that can be assigned to organizations or individuals using a SaaS application, says Stigg CEO Dor Sasson.

Most providers of applications today have embraced a business model based on subscriptions. However, as they add AI capabilities, many of them are seeing increased consumption of AI infrastructure that is costly to deploy and manage. Stigg wants to give publishers a credit mechanism to monetize consumption of the AI capabilities they are infusing into their applications in a way that doesn’t require them to adopt an entirely new business model, says Sasson.

Each application provider will need to determine for itself to what degree to monetize AI capabilities specifically versus treating them much like any other feature added to an application. In most instances, providers might offer a base set of AI capabilities that can be extended with credits the end customer purchases. “The credits provide more flexibility,” says Sasson.

It’s still early days so far as adoption of AI is concerned, but many software providers and their customers are already experiencing sticker shock. The providers of the core AI models that application developers rely on charge a fee for each token used to invoke a large language model (LLM). Tokens are consumed for each input and output, so the total cost of making AI capabilities available can quickly add up. It’s not clear how sustainable that token-based approach to pricing will be long term, but for now many providers of application software are looking for ways to pass those costs along to customers. The challenge, however, is that rivals might opt not to pass those costs along as part of an effort to gain market share.
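To illustrate how per-token fees compound, the sketch below uses purely hypothetical per-token rates and a hypothetical credit conversion (none of the numbers or names come from Stigg or any model provider) to estimate what a single LLM call costs and how that cost might be deducted from a customer’s credit balance.

```python
# Minimal illustration of token-based costs mapped onto a credit balance.
# All rates, names and the credit conversion below are hypothetical.

INPUT_RATE_PER_1K = 0.005    # assumed $ per 1,000 input (prompt) tokens
OUTPUT_RATE_PER_1K = 0.015   # assumed $ per 1,000 output (completion) tokens
DOLLARS_PER_CREDIT = 0.01    # assumed value of one credit sold to the customer


def llm_call_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one LLM call; input and output tokens are billed separately."""
    return (input_tokens / 1000) * INPUT_RATE_PER_1K + (output_tokens / 1000) * OUTPUT_RATE_PER_1K


def credits_consumed(input_tokens: int, output_tokens: int) -> float:
    """Translate the dollar cost of a call into credits deducted from a balance."""
    return llm_call_cost(input_tokens, output_tokens) / DOLLARS_PER_CREDIT


if __name__ == "__main__":
    # One chat turn with a 1,500-token prompt and a 500-token answer...
    per_call = llm_call_cost(1_500, 500)
    # ...repeated 2,000 times a month by one customer's users.
    monthly = per_call * 2_000
    print(f"Cost per call: ${per_call:.4f}")                # $0.0150
    print(f"Monthly cost for 2,000 calls: ${monthly:.2f}")  # $30.00
    print(f"Credits per call: {credits_consumed(1_500, 500):.1f}")  # 1.5 credits
```

Even at these modest assumed rates, usage-driven costs scale with every prompt and response, which is why publishers are looking for a metering unit, such as credits, that end customers can see and budget against.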

It’s also not clear to what extent the end customers of an application provider will embrace credits as a billing method for accessing AI models, especially if they discover that their end users are regularly exceeding the number of credits purchased for any given month. On the plus side, credits provide end customers with a billing model that, thanks to the rise of cloud computing services, many of them already understand, says Sasson.

Regardless of the approach, processing AI prompts today remains an expensive proposition. It may one day become less expensive as additional AI advances are made, but for now the cost of AI is limiting adoption. The challenge now is finding the balance between AI features that are becoming table stakes in an application and those that are truly worth paying extra to invoke, at a price an organization can reasonably expect to afford.
