Synopsis: In this AI Leadership Insights video interview, Scott Anderson, senior vice president of product management and business operations at Couchbase, explains how generative artificial intelligence (AI) will transform how databases are managed and the way applications are built and deployed.

Mike Vizard: Hello and welcome to the latest edition of the Techstrong.ai video series. I'm your host, Mike Vizard. Today we're with Scott Anderson, who's senior vice president of product management at Couchbase, and we're talking about how copilots in the form of generative AI are going to change the way we think about managing databases. Scott, welcome to the show.

Scott Anderson: Thank you very much, Mike. It’s great to be here with you today.

Mike Vizard: You guys have launched an offering in this space. I’m going to let you describe how it works, but essentially we’re bringing all the joys of copilots to databases, and that’s going to, I assume, make things more accessible for all kinds of folks, and maybe I don’t have to be such a tremendous database wizard to manage all this stuff, and we’ll see what the implications of that are from here. Scott, dive in, what exactly are you guys doing here?

Scott Anderson: Yeah, so we recently announced general availability of a new capability called Capella iQ, which is really our chatbot or coding assistant for developers using Capella, our fully managed database as a service. What iQ does is leverage large language models to let developers perform common tasks and interact with Capella and the underlying Couchbase database using natural language, to do things like create a database, create an index, or write a query using SQL++, our query language within Couchbase. In addition to that, it also lets developers generate code in the development language of their choice.
And one of the things that we're really, really excited about is that we've done integrations with both VS Code and the JetBrains IDEs. People can go to those marketplaces, download a plugin for Capella iQ, and if they have credentials for our Capella database as a service, they can interact with iQ directly in a side window within their IDE. They can work in the context they're familiar with and don't have to come into the Capella UI to use that functionality. We got great feedback from the preview, which I believe I discussed with you at the end of August, and we've used that feedback from developers to keep enhancing iQ's capabilities. Probably the biggest piece of feedback we got was the request for that IDE integration I just mentioned.
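To make the workflow concrete for readers, here is a minimal sketch of the kind of code a developer might ask Capella iQ to generate, assuming the Couchbase Python SDK; the endpoint, credentials and query are placeholders rather than anything specific to the interview.

    # Hypothetical example of assistant-generated code; endpoint and credentials are placeholders.
    from couchbase.auth import PasswordAuthenticator
    from couchbase.cluster import Cluster
    from couchbase.options import ClusterOptions

    # Capella requires TLS, hence the couchbases:// connection scheme.
    auth = PasswordAuthenticator("db_user", "db_password")
    cluster = Cluster("couchbases://example.cloud.couchbase.com", ClusterOptions(auth))

    # Run a SQL++ query against the travel-sample data set that ships with Couchbase.
    result = cluster.query(
        "SELECT name, country FROM `travel-sample`.inventory.airline LIMIT 5"
    )
    for row in result:
        print(row)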

Mike Vizard: So what are the implications for all of this? Are we going to be able to build and deploy more applications faster, run larger numbers of databases? How do you see the downstream impact of all of this?


Scott Anderson: Yeah, I think there are really two areas. The first is building an application and writing queries that are optimized, and being able to do that whether you're an experienced SQL developer coming to Couchbase or somebody earlier in their career. It's really, as you mentioned earlier, about democratizing databases, making them more accessible and allowing developers to program much more efficiently. And I think that's the first thing we've done, because iQ works in the context the developer is already in, inside Couchbase, where we understand the schema of the data in their JSON documents, so we're able to guide them and they can build much more quickly. As we go into the future, though, there are other things we can add to iQ, such as capabilities around initial configuration.
That's more about operational functions: "I want to create replication between two different Couchbase clusters." Being able to ask iQ how to do that and, most importantly, not just get the advice but take action directly in the context of iQ. It's similar to what we do for developers today: a developer asks, "How do I create the index?", we show them what that code is, and they can go ahead and execute it directly in iQ or via their IDE. So a lot of this is not just about giving the answer, which is really, really important, and giving a very concrete answer, but then allowing the developer, with the click of a button, to take the requisite action without having to leave the context of where they are.
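As an illustration of the "show the code, then run it" flow Anderson describes, the snippet below is the sort of secondary index suggestion an assistant might hand back for one-click execution; the bucket and field names are invented, and `cluster` is assumed to be an already connected Couchbase Python SDK cluster object.

    # Hypothetical suggested index; names are placeholders and `cluster` is already connected.
    statement = "CREATE INDEX idx_orders_customer ON `orders`(customer_id, created_at)"

    # Iterating the result forces the lazily executed DDL statement to run.
    list(cluster.query(statement))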

Mike Vizard: I get that this will help us build new applications faster, but we have a massive amount of legacy applications. So do you think more folks will be inclined to take on an application modernization effort using Couchbase as the foundation than they might have been previously?

Scott Anderson: I think so, because you're really removing some of the barriers or friction points. If I've got code written against a relational database using SQL and a common development language, then with our SQL++ compatibility the context is very similar for that developer. We're removing points of friction, first with the database as a service so you can get up and running as quickly as possible, but you can also imagine a world where you translate code from your historical database directly into code that works in the context of Couchbase itself. One of the comments I've seen in the industry is that we're simply going to have more applications as developers become more productive using coding assistants such as Capella iQ. I run product management, so I've got a lot of requirements, and I know businesses have a lot of requirements. As developers become more productive using these tools, I just see the number of applications and new capabilities growing rapidly as we go into the future.
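As a rough illustration of that kind of translation, the hypothetical example below pairs a relational-style query with a SQL++ equivalent over JSON documents, where the orders a relational schema would keep in a separate table are embedded as an array inside each customer document; the schema and names are invented, and `cluster` is again an already connected Couchbase Python SDK object.

    # Relational original (two tables joined on a foreign key):
    #   SELECT c.name, o.total
    #   FROM customers c JOIN orders o ON o.customer_id = c.id
    #   WHERE o.total > 100;
    #
    # Hypothetical SQL++ equivalent where each customer document embeds its orders as an array:
    sqlpp = """
        SELECT c.name, o.total
        FROM `customers` AS c
        UNNEST c.orders AS o
        WHERE o.total > 100
    """
    for row in cluster.query(sqlpp):
        print(row)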

Mike Vizard: In effect, is the cost of switching dropping? It seems to me one of the reasons a lot of legacy platforms are entrenched in organizations is that rewriting the code takes too much effort, time and energy, so nobody has the motivation. Are we getting to the point now where we can move forward without necessarily feeling the weight of the past tying us down every time we want to do something?

Scott Anderson: Yeah, when I talk to customers about modernization and where they're going, there are a couple of elements to that. One is the cost, and really the time; there's an opportunity cost when you're rewriting things. The other is risk, and I think these coding assistants address both. You can move much more quickly, so your time to value is reduced, and the resources required to do that rewriting or modernization of the application can be used much more efficiently. And using coding assistants to ensure the code you're writing is going to get the expected result reduces the risk barrier as well. So I think we're just in the early days, but I'm incredibly optimistic about how this technology will make it much easier for people to modernize their applications and, most importantly, their database infrastructure.

Mike Vizard: How smart can smart get? I mean, where are we on that curve? What does the future look like? A lot of folks are managing multiple databases at scale. Some of them are even challenged with fun tasks like sharding, although I don't know if that's relevant with a document database here. But how much of the drudgery can we eliminate?

Scott Anderson: I think a lot. In just the last year or 14 months since ChatGPT 3.5 was launched, we've seen incredible acceleration in large language models. What we're seeing right now is the early adoption and use of that technology by vendors like ourselves and some of the other players in the data management space. What we've been really focused on is the pre-prompt engineering and the context awareness. We launched this in preview about four or five months ago, we're now in GA, and this is an area we're going to keep investing in, as I mentioned before, to add new capabilities. One we're calling Ask iQ, which covers some of those operational tasks, best practices and general Q&A, where we're just going to expand things. And the thing we can do in something like Capella is that we understand the context of the user, we know the schema of the data, and we know what page they're on in the UI, which allows us to be much more deterministic in the answers we provide.
And then we'll learn: as we use reinforcement learning and understand how users rate the answers they get, those answers are going to get better and more deterministic for users. So I think we're in that first inning or two, and we have a long way to go, which is really, really exciting, because the feedback we've gotten from developers with our preview and initial GA has been incredibly strong. We'll keep investing in this area, and I think the best is yet to come as we continue to move forward and understand user behavior and how we can assist them based on the feedback we get from developers.
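Couchbase has not published the internals of that pre-prompt engineering, but the general pattern Anderson alludes to, prepending the shape of the schema and the user's current context to the question before it reaches the model, can be sketched roughly as follows; every name here is hypothetical and this is not Couchbase's actual implementation.

    # Rough, hypothetical sketch of context-aware prompt assembly.
    def build_prompt(user_question, collection_name, sample_document, current_page):
        # Share only field names and inferred types, never the document contents.
        schema = "\n".join(
            f"  {field}: {type(value).__name__}" for field, value in sample_document.items()
        )
        return (
            "You are a SQL++ assistant for a Couchbase database.\n"
            f"The user is on the '{current_page}' page, working with the "
            f"'{collection_name}' collection. Its documents have this shape:\n"
            f"{schema}\n\n"
            f"User question: {user_question}\n"
        )

    prompt = build_prompt(
        "Show me the ten largest orders from last week",
        "orders",
        {"customer_id": "c-1001", "total": 129.99, "created_at": "2024-01-05T12:00:00Z"},
        "Query Workbench",
    )
    # The assembled prompt would then be sent to whichever large language model backs the assistant.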

Mike Vizard: There seems to be some debate about to what degree I'll need to hire somebody who is a prompt engineer versus how much of this is just a skill that every developer and DBA is eventually going to have in their arsenal, because it's how you interact with machines and hiring somebody else to do that doesn't make sense.

Scott Anderson: I think we're on a pretty steep learning curve. I'll speak for myself: when I first played with ChatGPT, for example, I was not very good at it, and I read a bunch of Twitter posts to figure out the types of prompts I could use to get better results. I think everybody's been on that steep learning curve for the last year. So as this technology becomes more widely used and people become more familiar with it, I think it's just a general tool that developers are going to use, product managers are going to use, people across the industry are going to use; it becomes part of our normal toolkit. And our focus is that if we can understand the context of the user's data, not the data itself but the shape of the schema they have in their documents, we can do a lot of that pre-prompt engineering. We do that today with iQ because we understand the context of where they are in the UI and what they're trying to do.
And one of the things that's really important, which I mentioned before, is that it's not just about getting the answer. It reminds me of old reporting tools I worked on in the backup space 10 or 15 years ago: we gave people a recommendation, and the thing they wanted was a button to implement the recommendation. We've done that with iQ. Here's a sample set of documents, okay, hit the button, create the documents and insert them into a bucket within Couchbase. Here's the recommended index, great, I don't want to go into the query workbench and create that index, just give me the button. We've provided that to developers, and I think that's incredibly powerful in terms of the level of productivity developers will get when using Capella iQ.
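The "hit the button" flow Anderson describes, generating sample documents and inserting them into a bucket, ultimately comes down to ordinary key-value writes; a minimal sketch with the Couchbase Python SDK, using invented names and document contents, might look like this.

    # Hypothetical sketch of inserting generated sample documents; names are placeholders
    # and `cluster` is an already connected Couchbase Cluster object.
    collection = cluster.bucket("demo-bucket").scope("_default").collection("_default")

    sample_docs = {
        "order::1001": {"customer_id": "c-1001", "total": 129.99, "status": "shipped"},
        "order::1002": {"customer_id": "c-1002", "total": 54.50, "status": "pending"},
    }
    for key, doc in sample_docs.items():
        collection.upsert(key, doc)  # create the document, or overwrite it if it already exists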

Mike Vizard: How bidirectional is the relationship going to be with something like Capella iQ? And I’m asking the question because there are days when I walk in the office and all I really want is for somebody to tell me the three things I need to pay attention to that are likely to get me fired. So will the machines just start sending me the alerts that I need based on things that it observes?

Scott Anderson: Yeah, and I think that's on the predictive AI front, where we're just going to get more intelligent. I look at it as the needle in the haystack: what are the anomalies, or, as you mentioned, the three things, not the hundred things, that I need to pay attention to and act on? That's a big focus we have in Capella. We'll be releasing an updated alerting framework that allows you to configure and customize alerts. But I'd look at this as us giving calls to action versus just giving data and insight. I think that's where you're going with this: what are those three things, and what do I need to do about them? Do I need to take immediate action?
What should that action be, and how do I go ahead and take it? Or is this something I just need to monitor because it's at a yellow level, whereas at orange I need to take a specific action? So I think this is about how we take information and not just present it to people so it's consumable; that call to action is critically important. What should I do with my information to make sure my database is performing at an optimal level?

Mike Vizard: You have a copilot for your platform, and there are lots of copilots for different platforms. At the end of the day, is my copilot going to call your copilot to figure something out in order for us to work together? How do you think this is all going to get stitched together?

Scott Anderson: As I mentioned before, it's early, but I can see the interplay between these things, because each of these copilots has a different context. The thing we need to figure out is privacy and what data is shared, and that's really up to the user. Can one plus one equal three or five, because shared context among these copilots gives more deterministic answers for the end user? I see that evolving over time, and in my view it should be within the developer's control where they want those interactions to occur. But the end goal is giving that deterministic answer: more context and understanding about the data the user has and the tasks they're trying to complete is going to give a better answer and result for the end user.

Mike Vizard: You’ve been at this a while. What’s your advice for folks to help them get started on this? ‘Cause I think some folks are just a little overwhelmed and maybe intimidated by the whole thing, but based on your experience, what do you know now that you wish you knew a year ago?

Scott Anderson: Well, jump in with both feet is kind of my advice. With any new technology there are cohorts of people: a group who jump in and want to play and experiment, and in some cases people who wait to see how it's going to settle out. This is a new thing and it may be a little challenging to get started, but there's an incredible amount of resources out there, be it on X, in documentation, blogs and so forth. And you learn by doing at the end of the day. So jump in, experiment, talk to friends and coworkers, share that insight and accelerate your learning. What we've seen over the last year is that for most users, in almost every single use case, this provides more insight and more productivity for developers. So jump in, get started, talk to your friends and coworkers, learn the tips and tricks, and this is going to accelerate your productivity and what you're going to be able to deliver for yourself or your organization.

Mike Vizard: All right, folks. You heard it here, as usual, there’s no substitute for a little hands-on experience because that’s how we all learn. Scott, thanks for being on the show.

Scott Anderson: Thank you very much, Mike. Really appreciate it.

Mike Vizard: And thank you all for watching the latest episode of the Techstrong.ai Leadership series. You can find this episode and others on our website. We invite you to check them all out. Until then, we’ll see you next time.
