
The level of expertise required to launch queries against a SQL database is about to drop substantially thanks to the rise of generative artificial intelligence. Kinetica today announced it has developed a large language model for its real-time analytics database that automatically converts a query written in natural language to SQL.

The overall goal is to make it simpler for anyone to interrogate data residing in the Kinetica database by leveraging native support for vector data, which converts data into a format that can be used to invoke a large language model (LLM), says Kinetica CEO Nima Negahban.
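Kinetica has not published the internals of this feature, but the general pattern behind pairing vector data with an LLM can be sketched: table descriptions are embedded as vectors, and the most relevant ones are retrieved to give the model context for a question. The sketch below assumes OpenAI's embedding API; the table names and descriptions are hypothetical, and a production system would store these vectors in the database's native vector columns rather than in application memory.

```python
# Minimal sketch: embed table descriptions as vectors, then retrieve the
# tables most relevant to a natural-language question via cosine similarity.
# Table names and descriptions are hypothetical, for illustration only.
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

table_docs = {
    "shipments": "shipment id, origin, destination, departure and arrival timestamps",
    "vehicles": "vehicle id, type, capacity, current GPS position",
    "customers": "customer id, name, region, account manager",
}

def embed(text: str) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return np.array(resp.data[0].embedding)

def most_relevant_tables(question: str, k: int = 2) -> list[str]:
    q = embed(question)
    scored = []
    for name, doc in table_docs.items():
        v = embed(f"{name}: {doc}")
        score = float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
        scored.append((score, name))
    return [name for _, name in sorted(scored, reverse=True)[:k]]

print(most_relevant_tables("Which trucks are currently closest to Denver?"))
```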

In this instance, Kinetica is enabling this capability via an LLM it developed to optimize SQL queries, he notes.

Previously, the company added an integration with the LLMs developed by OpenAI to make it possible to launch queries against a general-purpose LLM. Kinetica is also promising to add integrations with other LLM platforms, such as those provided by NVIDIA, later this year.
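The article does not show how that OpenAI integration works under the hood, but the basic natural-language-to-SQL flow can be illustrated with a short sketch using OpenAI's chat completions API. The schema and question below are hypothetical; in Kinetica's case the translation happens inside the database rather than in application code like this.

```python
# Minimal sketch of natural-language-to-SQL with a general-purpose LLM.
# The schema and question are hypothetical, for illustration only.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

schema = """
CREATE TABLE shipments (
    shipment_id INT,
    origin VARCHAR(64),
    destination VARCHAR(64),
    departed TIMESTAMP,
    arrived TIMESTAMP
);
"""

question = "How many shipments arrived in Chicago last week?"

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "Translate the user's question into a single SQL query "
                    "for the schema below. Return only SQL.\n" + schema},
        {"role": "user", "content": question},
    ],
    temperature=0,
)

sql = resp.choices[0].message.content.strip()
print(sql)  # the generated query would then be executed against the database
```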

Collectively, the two approaches show how SQL can be used to invoke external LLMs as well as be invoked via a natural language interface, says Negahban. This approach provides the added benefit of being able to fine-tune queries in a way that generates more accurate results when LLMs are involved, he adds.


Optimizing the processing of those queries requires a database that was designed from the ground up to make the most efficient use of costly graphics processing units (GPUs), noted Negahban.

It’s not entirely clear what impact generative AI will have on the role of database administrators (DBAs), but as generative AI continues to be applied to specific domains, many IT tasks that once required specialized expertise are for all intents and purposes being democratized. More end users, for example, should be able to launch queries without needing a DBA to craft an optimal SQL query on their behalf.

In theory, that should give DBAs more time to manage more complex tasks spanning, for example, all the various data types that are now supported by databases.

Launching SQL queries is only one of many IT tasks that generative AI is on the cusp of democratizing. Historically, the single largest cost of IT has been the labor required to manage it. Generative AI should make it possible to build and deploy more applications while reducing the cost of labor—or, at the very least, keeping labor costs relatively stable. Today, deploying additional applications in IT environments is often limited by the fact that there are not enough IT professionals available to manage them alongside the hundreds of applications that have already been deployed. Generative AI will change the labor equation on which many IT decisions are based.

Just as importantly, the end user experience will improve as it becomes possible to automate more tasks and resolve issues faster. End users will soon view any IT experience that is not augmented by generative AI technologies as antiquated.

Naturally, these types of advances will cause as much consternation as they do excitement. A lot of the toil that makes working in IT less appealing than it might otherwise be could be eliminated in the months and years ahead. The challenge and opportunity is to determine which processes are best left to machines to automate versus the ones that still require a level of insight that only humans can provide.
