Operationalizing AI

The Operationalizing AI event is taking place this week in sunny Boca Raton, Florida. Industry experts from all over the planet are converging to ponder and plan how AI and large language models (LLMs) can play an integral role in the world of DevOps.

The event is the brainchild of two thought leaders, John Willis and Patrick Debois. John Willis is one of the founders of the DevOps movement and has been an evangelist since its beginning. He co-founded SocketPlane.io, a company that developed tools for software-defined networking; it was later acquired by Docker. Patrick Debois is the man who coined the term DevOps, indirectly, by creating an event he called DevOpsDays. The name stuck.

During the first morning of Operationalizing AI, Joseph Enoch gave an extended presentation about generative AI and large language models (LLMs). Participants were encouraged to share their thoughts throughout the talk.

In the afternoon, the team broke up into four small working groups, covering the following topics:

  • Governance
  • Pipelines
  • Architecture
  • Data

One concept transcends all four working groups: generative AI.

What is Generative AI?

Generative AI is a form of AI in which a system learns patterns from the data it is trained on and uses those patterns to generate new content. For example, after reading enough novels, a generative AI system could write a new novel; similarly, one trained on artwork could produce new artwork. One specific type of generative AI is the generative pre-trained transformer, or GPT.
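To make the "learn from what it has read, then generate something new" idea concrete, here is a deliberately tiny sketch: a bigram model that learns which word tends to follow which in a training text, then generates a fresh sequence. This is a toy illustration of the generative principle, not how modern LLMs actually work; the corpus and function names are invented for the example.

```python
import random
from collections import defaultdict

def train_bigram_model(text):
    """Record, for each word, the words observed to follow it."""
    words = text.split()
    model = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        model[prev].append(nxt)
    return model

def generate(model, start, length=8, seed=42):
    """Generate new text by repeatedly sampling a likely next word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        choices = model.get(out[-1])
        if not choices:
            break  # dead end: no word ever followed this one
        out.append(rng.choice(choices))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran to the door"
model = train_bigram_model(corpus)
print(generate(model, "the"))  # novel word sequence in the corpus's style
```

The output recombines the training text into sequences that never appeared verbatim, which is the essence of "generative": a real LLM does the same thing, just with a vastly richer model of context.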

What is a GPT?

Over the past year, ChatGPT has made a lot of news. ChatGPT is an example of a generative pre-trained transformer, a specific type of generative AI built on a transformer model. What does that mean? The transformer is one of the latest techniques for training a large language model, based on a groundbreaking paper called “Attention Is All You Need” by Vaswani et al. The idea is for the system to learn relationships between the words in a sequence. As more and more sequences are fed into the model, it reaches a point where it can predict the next word from the words that came before it; the decision is made based on context, in conjunction with the vast amounts of pre-training data fed into the system.
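The heart of the transformer described in “Attention Is All You Need” is scaled dot-product attention: each position in a sequence scores its relationship to every other position, and those scores decide how much each word's representation contributes to the next prediction. Here is a minimal NumPy sketch of that one operation (the shapes and random inputs are illustrative; a real transformer stacks many such layers with learned weights):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax: turn scores into weights summing to 1."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Each row of Q (a query) is compared against every row of K (the keys);
    the resulting weights mix the rows of V (the values), so every position
    sees context from the whole sequence at once.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise similarity between positions
    weights = softmax(scores)        # one probability distribution per query
    return weights @ V, weights

# Toy example: a 3-token sequence with 4-dimensional embeddings.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
output, weights = scaled_dot_product_attention(Q, K, V)
print(output.shape)         # (3, 4): one contextualized vector per token
print(weights.sum(axis=-1)) # each row sums to 1.0
```

The "attention weights" are exactly the learned relationships between words the paragraph above describes: a high weight means one word's representation leans heavily on another's when the model decides what comes next.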

OAI: Operationalizing AI

The goal of this event is to bring AI into the DevOps world and ultimately to create a new field its organizers are calling OAI: Operationalizing AI. Together this week, this group of top thinkers in the industry will bring their ideas together to launch it. Stay tuned in the weeks and months to come to see what grows from this.