
Microsoft this week unfurled a bevy of tools, platforms and services that promise to accelerate the pace at which artificial intelligence (AI) applications can be built.

Unveiled at the Microsoft Build 2024 conference, the latest additions to the Microsoft portfolio include a copilot, available in private preview, that uses templates to guide AI development teams at startups that have joined the Microsoft Founders Hub to gain free access to Azure services.

In addition, Microsoft plans to soon preview a custom generative model that, starting with a single document, guides the user through the schema definition and model creation process with minimal labeling. It uses large language models (LLMs) to extract fields and is designed to adapt as additional documents are added.

Microsoft is also previewing additions to the Microsoft Azure AI Speech service to automate workflows and provide video dubbing capabilities. In addition, a message analysis tool for WhatsApp, now in preview on the Azure OpenAI Service via Azure Communication Services, will enable businesses to extract meaningful insights from those messages.

At the same time, Microsoft announced it is making the GPT-4o LLM trained by OpenAI available in preview via the Azure AI Studio platform for building AI applications, along with a copilot for Microsoft Teams. Phi-3-vision, a small language model (SLM) designed to run on personal devices, is also now available in preview. In addition, Microsoft is making available a Copilot Runtime for Windows 11, along with support for tools such as PyTorch.

Microsoft also announced that the Microsoft Azure AI Studio tools for building AI applications are now generally available and pledged to add a Custom Categories feature to the Microsoft Azure AI Content Safety service to make it simpler to create custom filters for specific types of content.

Additionally, Microsoft added support for instances of graphics processing units (GPUs) from AMD to the Azure Cloud.

Microsoft also extended Microsoft Azure AI Search to provide connectors to the Azure data lake, along with additional vector capabilities for customizing AI models using retrieval-augmented generation (RAG) techniques.
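The RAG pattern those vector capabilities support boils down to retrieving the documents most similar to a query and prepending them to the prompt sent to an LLM. The following is a minimal, self-contained sketch of that idea; the bag-of-words "embedding" and every function name here are toy stand-ins for illustration only, not Azure AI Search APIs, and a real pipeline would use an embedding model and a vector index instead.

```python
from collections import Counter
import math

def embed(text):
    # Toy bag-of-words "embedding"; a real RAG pipeline would call an
    # embedding model and store the resulting vectors in a vector index.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    # Augment the prompt with retrieved context before calling an LLM.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Azure AI Search indexes documents for retrieval.",
    "Cosmos DB is a managed NoSQL database.",
    "RAG grounds model answers in retrieved documents.",
]
print(build_prompt("How does RAG ground answers?", docs))
```

The design point is that the model never sees the whole corpus: retrieval narrows it to a few relevant passages, which is what lets RAG ground answers in an organization's own data.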

On the data management front, Microsoft has extended its implementation of the open source Postgres database to add support for LLMs and previewed vector capabilities in the managed Microsoft Azure Cosmos DB service. Microsoft also added the ability to process data in real time to Microsoft Fabric, an analytics service that, like every other Microsoft service, will have a copilot along with other AI tools for processing data. Fabric also now supports the Apache Iceberg open source native table format.

Microsoft CEO Satya Nadella told conference attendees the company is pursuing an end-to-end approach to AI that includes providing access to the most advanced AI accelerators possible.

Paul Nashawaty, practice lead for application development and modernization at the Futurum Group, noted that as tools such as Azure AI Studio democratize AI model development, platforms such as Azure AI Search will enable organizations to unlock actionable insights with unparalleled efficiency.

There is, of course, no shortage of options when it comes to building AI applications, but Microsoft has clearly gotten a head start. The only issue left to be resolved is determining to what degree organizations are actually able to operationalize AI in the months and years ahead.