
Cribl this week added another generative artificial intelligence (AI) capability, available at no additional cost, that automatically normalizes telemetry data.
Nick Heudecker, head of marketing and business development for Cribl, said Copilot Editor has been trained to map schemas and translate logs from one format to another. That ability to understand log structure and semantics makes it simpler to build telemetry data pipelines on the Cribl Stream platform, which DevOps engineers, IT managers and cybersecurity teams use to route telemetry data to a specific analytics engine.
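The kind of format translation described here can be illustrated with a minimal sketch: parsing a key=value log line, a style common to many firewalls, into JSON. This is only an illustrative example in Python with hypothetical field names; Copilot Editor generates such translations inside Cribl pipelines, and its actual output will differ.

```python
# Minimal sketch: translating a space-delimited key=value log line
# into JSON, the kind of format translation Copilot Editor automates.
import json

def kv_to_json(line: str) -> str:
    """Parse a key=value log line into a JSON string with sorted keys."""
    fields = {}
    for token in line.split():
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value
    return json.dumps(fields, sort_keys=True)

raw = "src=10.0.0.5 dst=10.0.0.9 action=allow proto=tcp"
print(kv_to_json(raw))
# {"action": "allow", "dst": "10.0.0.9", "proto": "tcp", "src": "10.0.0.5"}
```

In practice a pipeline would also have to handle quoted values, nested structures and malformed lines, which is where automating the mapping saves engineers the most effort.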
The overall goal is to reduce the manual effort that would otherwise be required to set up these pipelines, so that telemetry data can be analyzed in a matter of minutes, added Heudecker.
In addition to making it simpler to onboard additional sources of telemetry data, Copilot Editor also reduces dependency on any given vendor by making it simpler to pivot between platforms, he added.
Cribl first added a set of Copilot tools last year that made it simpler to query telemetry data using a natural language interface. Copilot Editor adds the ability to normalize that data into an industry-standard format. That’s critical because most IT and security platforms generate telemetry data in a unique format that makes it challenging to correlate events across multiple platforms.
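Normalization of this kind boils down to renaming each vendor's unique field names to a single shared schema so events from different tools can be correlated. The sketch below shows the idea in Python; the field map and event shapes are hypothetical, not an actual standard mapping or Cribl's implementation.

```python
# Minimal sketch of telemetry normalization: renaming vendor-specific
# keys to a common schema so events from different tools line up.
# All field names here are hypothetical.

FIELD_MAP = {           # vendor-specific key -> normalized key
    "src": "source.ip",
    "srcip": "source.ip",
    "dst": "destination.ip",
    "dest_ip": "destination.ip",
}

def normalize(event: dict) -> dict:
    """Rename known vendor keys; pass unknown keys through unchanged."""
    return {FIELD_MAP.get(key, key): value for key, value in event.items()}

# Two firewalls reporting the same connection in different dialects:
firewall_a = {"src": "10.0.0.5", "dst": "10.0.0.9", "action": "allow"}
firewall_b = {"srcip": "10.0.0.5", "dest_ip": "10.0.0.9", "verdict": "permit"}

print(normalize(firewall_a))
print(normalize(firewall_b))
```

After normalization, both events carry `source.ip` and `destination.ip`, so a downstream analytics engine can join them without per-vendor logic; building and maintaining such mappings by hand is the manual work Copilot Editor is meant to remove.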
Additionally, Cribl earlier this year added a data lake to its portfolio that is specifically designed for storing and automatically normalizing telemetry data in a way that reduces total costs by streamlining workflows. The Cribl Lake cloud service makes it simpler to aggregate and query highly distributed telemetry datasets in real time. That capability eliminates the complex extract, transform and load (ETL) pipelines that data engineers would otherwise need to construct across, for example, time series databases and cloud data warehouses.
Most organizations that collect telemetry data are being overwhelmed by its sheer volume, resulting in increased costs as they find themselves storing massive amounts of data that may never be needed.
In addition to reducing the cost of storing that data, Cribl is also leveraging generative AI tools to make it simpler to build the pipelines needed to analyze normalized telemetry data. It then routes and stores that data in the optimal format to further reduce costs.
In general, it’s becoming simpler to apply AI to a wider range of IT operations, otherwise known as AIOps. The challenge now is moving beyond machine learning algorithms to embrace large language models (LLMs) that will make it possible for AI copilots and agents to reason across a series of events in near real time, said Heudecker. That’s critical because, given the rate of change in IT environments, there is a need for an approach that doesn’t require every AI model to be trained on how each unique IT environment is constructed before it starts to deliver value, he added.
It’s still early days for using AI to capture, manage and ultimately analyze telemetry data, but one thing is clear: the volume of this data is already overwhelming IT and cybersecurity teams. The challenge, and the opportunity, now is to apply AI to all that data in a way that makes collecting and storing it worth the effort.