Generating images using artificial intelligence (AI) requires organizations to navigate copyright issues that can quickly become thorny without proper oversight. Organizations that create content for profit don’t want their images later used to train an AI model that anyone could then employ to generate derivative works. That concern is giving rise to enterprise editions of AI platforms that provide guardrails governing how proprietary intellectual property (IP) is used within the context of an AI model.
A case in point is Invoke, a provider of an open source AI platform for generating images, which is now making available an enterprise edition of the platform. The commercial offering enables enterprise IT organizations to create their own models on a single-tenant instance of the Invoke platform that only their employees can access. It provides role-based access controls (RBAC), single sign-on (SSO) capabilities and support for custom project spaces that can only be accessed by invitation.
That’s critical because not every end user understands how IP might one day find its way into the public domain when it’s used to train an AI platform, such as ChatGPT, that is made available to the general public. As a result, many organizations are restricting usage of public AI platforms out of fear of losing control of their IP, notes Invoke CEO Kent Keirsey.
An enterprise edition of the open source AI platform provides all the benefits of a collaborative approach to training while eliminating any chance that IP will inadvertently be incorporated into a future release of the open source AI platform, says Keirsey.
In addition, that approach enables organizations to train an AI model using their own semantics to describe various business processes. Every organization has its own nomenclature that an open source AI model isn’t going to understand without additional training, says Keirsey. “They need to add their own semantic layers,” he notes.
Finally, organizations also gain the ability to add third-party AI models downloaded from repositories such as Hugging Face as they see fit, says Keirsey.
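As a purely illustrative sketch (not Invoke’s own tooling), the snippet below shows how a third-party image model might be pulled from Hugging Face into a local directory that a self-hosted platform could then import; the repository ID and target folder are assumptions chosen for the example.

```python
# Illustrative only: fetch a third-party image model from Hugging Face
# so it can be imported into a locally hosted image-generation platform.
# The repo_id and local_dir values are assumptions for this example.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="stabilityai/stable-diffusion-xl-base-1.0",  # example public model repository
    local_dir="./models/sdxl-base",                      # hypothetical local model folder
)
print(f"Model files downloaded to {local_path}")
```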
There is, of course, no shortage of options when it comes to using AI models to generate images. The challenge is that not every end user appreciates the implications of exposing IP to those models. Customizing AI models gives enterprise IT organizations more control over how AI models are employed within the context of their proprietary data.
Each organization will need to decide to what degree it is comfortable sharing data with public AI platforms, but historically most organizations have been wary of sharing anything that might result in a loss of control of their IP. Once an AI model has been trained on that IP, it’s all but impossible to recover control of it.
It’s still early days in terms of how enterprise organizations will operationalize AI, but given the sensitivity of the data being used to train AI models, most of them are going to opt to be safe now rather than sorry later.