Recent News in Analytics and AI: February 2026 Edition

6th March 2026. By Michael A

February brought another wave of meaningful progress across the analytics and AI landscape, with Microsoft delivering updates that continue to narrow the gap between intelligence and day‑to‑day productivity. Power BI matured with smarter filtering, richer visuals and clearer modelling paths, while Fabric advanced its security, engineering and cross‑platform integration capabilities. Copilot saw steps toward more autonomous workflow execution, supported by improved connectors and stronger governance for agentic experiences. Azure developments focused on performance, scale and enterprise‑ready model deployment. Open‑source tools and industry insights rounded out the month, signalling where data‑driven organisations are heading next.

Read on and get up to speed.

Power BI


  • The new input slicer brings a more intuitive way to filter Power BI reports by accepting typed values, partial text and pasted entries. This makes it easier to work with detailed datasets where users often know what they want to find but do not want to browse through long lists. With matching modes including exact and contains, the slicer gives users better control over analysis and speeds up the discovery process. Its move to general availability signals Power BI’s investment in reducing friction for everyday tasks. The feature matters because it gives analysts faster, more natural filtering in complex report scenarios. Learn more.

  • A detailed exploration of composite semantic models delves into how Direct Lake and import tables work together to streamline modelling in Power BI. Direct Lake offers immediate access to OneLake data without importing it, while import tables suit scenarios where curated transformations or historical snapshots are essential. The article highlights how combining these approaches reduces engineering overhead and improves performance for demanding analytical workloads. By walking through the mechanics and benefits of this mixed‑mode design, the piece shows why composite models are becoming central to enterprise‑scale reporting strategies and how they enable more adaptable, high‑performing solutions. Learn more.

  • Microsoft is retiring the old Excel and CSV import experience in Power BI Service by 31 May 2026. Users who previously relied on the Create page workflow will need to transition to the newer import methods, although Excel and CSV files remain fully supported as data sources. The change affects only those using the legacy experience and does not impact report creation in Power BI Desktop. This update matters because it signals a consolidation of data‑ingestion paths, helping users adopt more secure, consistent and well‑maintained workflows. It also reduces fragmentation and aligns Power BI Service with the platform’s modern capabilities. Learn more.

  • Power BI’s February 2026 update brings several enhancements that make day‑to‑day reporting faster and more intuitive. Smarter Copilot and AI experiences take centre stage by simplifying analysis and streamlining common tasks. Input slicer reaches general availability, giving users more flexible ways to filter reports without scrolling through long lists. Visuals receive meaningful polish, including improvements to cards and Azure Maps, while new DAX functions such as TABLEOF add modelling power. The update also includes deprecations like upcoming hierarchy removal in scorecards, encouraging teams to modernise their reporting approach. Overall, this release strengthens productivity and offers practical, user‑focused improvements. Learn more.
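
The matching modes mentioned above for the input slicer can be illustrated with a minimal Python sketch. This is not Power BI's implementation, just a plain-language analogy for how "exact" and "contains" filtering behave against a column of values:

```python
def match_values(values, query, mode="contains"):
    """Filter a list of values the way an input slicer might:
    'exact' keeps only identical entries (case-insensitive),
    'contains' keeps any value containing the query as a substring."""
    q = query.strip().lower()
    if mode == "exact":
        return [v for v in values if v.lower() == q]
    if mode == "contains":
        return [v for v in values if q in v.lower()]
    raise ValueError(f"unknown mode: {mode}")

products = ["Contoso Bike", "Contoso Helmet", "Fabrikam Bike"]
print(match_values(products, "bike"))                        # contains: both bikes
print(match_values(products, "contoso bike", mode="exact"))  # exact: one match
```

Typed or pasted entries map naturally onto `query`, which is why the feature suits reports where users already know roughly what they are looking for.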



Microsoft Fabric


  • Microsoft outlines a vision where interoperable security becomes the foundation of modern data protection, and OneLake sits at the centre of this shift. The approach allows organisations to define a single set of security rules and have them consistently enforced across different processing engines. This removes fragmentation and gives teams confidence that governance remains intact regardless of how or where data is accessed. Fine‑grained controls such as row and column‑level security ensure that sensitive information stays protected while still supporting flexible architectures. This matters because it simplifies enterprise governance and enables secure analytics without forcing compromises in performance or design. Learn more.

  • Mirroring Azure Databricks catalogues behind private endpoints is now generally available in Microsoft Fabric, giving organisations a secure and streamlined way to combine Databricks and Fabric workloads. The feature mirrors catalogue data into OneLake as a read‑only replica, keeping it continuously up to date without requiring manual pipelines. This helps analytics teams work with Databricks data inside Fabric while maintaining strong network isolation. The development is important because it simplifies cross‑platform data integration, reduces operational friction and allows organisations to build more secure, unified analytical environments. Learn more.

  • Semantic Link is now generally available and brings a shared semantic layer that connects AI, BI and data engineering across Microsoft Fabric. The capability allows semantic models to be accessed directly in notebooks, enabling data scientists to work with trusted business definitions without manual duplication. Over time, it has expanded into a cross‑Fabric feature that streamlines collaboration between engineering, analytics and admin teams. This matters because it reduces friction in multi‑discipline workflows, accelerates model development and ensures consistent use of business logic. The result is faster, more reliable data‑driven work powered by a unified semantic foundation. Learn more.

  • Microsoft Fabric’s Native Execution Engine modernises Spark processing by shifting execution to a high‑performance, C++‑based runtime. This vectorised engine accelerates data engineering jobs, cuts processing time, and removes overhead traditionally associated with JVM‑based Spark execution. Crucially, teams can achieve these improvements without modifying existing code, making performance gains instantly accessible. The approach blends Spark’s familiar developer experience with an enhanced execution path designed for scale and efficiency. This advancement is important for organisations managing growing data volumes, as it delivers faster insights, reduced compute usage, and more predictable performance across Fabric workloads. Learn more.

  • Microsoft has officially adopted fabric‑cicd, the open‑source Python deployment library designed to automate CI/CD across Fabric workspaces. Developed in collaboration with engineering teams, MVPs, enterprise users, and the wider community, the tool simplifies deployments by offering a modern, code‑first interface for managing workspace items. This official backing signals Microsoft’s commitment to robust DevOps workflows within Fabric, making it easier for teams to streamline version control, automate delivery, and reduce manual deployment tasks. The announcement is important because it strengthens operational reliability and gives organisations a supported, scalable path for managing Fabric environments efficiently. Learn more.

  • Microsoft Fabric demonstrates how machine learning can be integrated directly into Power BI reports by building a churn‑prediction model on top of a governed semantic dataset. Predictions are delivered through batch or real‑time scoring, allowing business users to interact with insights inside standard reports. This workflow meets the growing need for analytics that look ahead rather than backwards, enabling organisations to highlight risks, anticipate customer behaviour, and improve strategic planning. The significance lies in making advanced analytical capabilities accessible within everyday reporting tools, reducing the barrier to adopting machine learning across the business. Learn more.
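
The batch-scoring step in a churn workflow like the one above can be sketched in plain Python. The coefficients, feature names and threshold below are invented for illustration; in Fabric they would come from a model trained on the governed semantic dataset rather than being hard-coded:

```python
import math

# Hypothetical coefficients a churn model might learn (illustrative only).
WEIGHTS = {"months_inactive": 0.9, "support_tickets": 0.4, "tenure_years": -0.6}
BIAS = -1.5

def churn_probability(customer):
    """Logistic function over weighted features: the core of batch scoring."""
    z = BIAS + sum(WEIGHTS[f] * customer[f] for f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

customers = [
    {"id": "C1", "months_inactive": 4, "support_tickets": 3, "tenure_years": 1},
    {"id": "C2", "months_inactive": 0, "support_tickets": 0, "tenure_years": 5},
]
scores = {c["id"]: round(churn_probability(c), 3) for c in customers}
at_risk = [cid for cid, p in scores.items() if p > 0.5]
```

Surfacing `at_risk` as a column in a report is what turns a backward-looking dashboard into one that highlights where churn is likely next.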



Microsoft 365 Copilot and Copilot Studio


  • Copilot Tasks represents Microsoft’s shift from conversational responses to practical action by enabling AI to complete multi‑step workflows rather than stopping at suggestions or drafts. By drawing from emails, chats and meetings, it can automatically extract action items and organise them into a coherent to‑do list. This matters because it reduces manual effort and removes the friction that usually follows knowledge work. Instead of users being responsible for administering their workload, Copilot Tasks handles much of the orchestration, freeing time for higher‑value contributions. For organisations, the shift signals a meaningful step towards operational efficiency driven by AI automation. Learn more.

  • Microsoft’s Sales Development Agent (SDA) provides an autonomous way to generate qualified pipeline by managing outreach and lead qualification at scale. It follows structured guidance, ensuring every prospect receives consistent, timely engagement while human sellers focus on advancing promising conversations. This matters because sales teams often struggle to maintain volume without sacrificing personalisation or quality. SDA addresses that tension by reliably handling the most repetitive elements of the process. For organisations wanting to grow pipeline without proportionally increasing staffing, it offers a practical route to higher throughput. It also reflects a trend towards AI‑supported sales operations becoming central to modern revenue strategies. Learn more.

  • The growth of Copilot connectors introduces a more complete data ecosystem for Microsoft 365 users by securely linking external systems into Copilot’s reasoning environment. With the library now exceeding 100 connectors, teams gain the ability to search, analyse and act on information previously siloed in line‑of‑business platforms. This is important because meaningful AI assistance relies on high‑quality context, and connectors make that context accessible. The expansion also offers developers more flexibility to build tailored integrations. Overall, the update represents a major step towards enterprise‑wide AI, helping organisations streamline operations by placing broader data intelligence directly inside their everyday tools. Learn more.

  • Computer‑using agents in Copilot Studio now provide stronger security and governance while offering broader model choice for UI automation. These agents visually navigate web and desktop applications, enabling organisations to automate work in environments that lack APIs. Enhanced protection and improved controls make it safer to scale automation across business processes. This update matters because it moves traditional RPA closer to AI‑driven autonomy, allowing teams to automate more complex workflows with confidence. As adoption grows, organisations gain a practical route to modernising legacy systems while reducing manual effort and operational risk. Learn more.

  • New and redesigned guidance hubs from Microsoft give organisations end‑to‑end support for creating enterprise‑ready agents. The resources help teams navigate planning, build processes and operational considerations, offering clarity on design choices, governance expectations and practical implementation steps. This update is important because many organisations face fragmented documentation when adopting AI agents, often leading to inconsistent approaches. With a consolidated, authoritative set of materials, teams can work more confidently and produce agents that scale effectively across the business. The enhanced guidance ultimately improves quality, reduces risk and accelerates the path to meaningful agent‑driven transformation. Learn more.
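
The action-item extraction that Copilot Tasks performs, described above, can be caricatured in a few lines of Python. The real feature relies on language models over emails, chats and meetings; this stdlib sketch uses a naive regular expression purely to make the input-to-task shape concrete:

```python
import re

# Commitment-style phrases that often precede an action item (illustrative).
ACTION_PATTERN = re.compile(
    r"\b(?:I will|I'll|please|can you|need to)\s+(.+?)(?:[.!?]|$)",
    re.IGNORECASE,
)

def extract_action_items(messages):
    """Scan (sender, text) pairs and collect candidate tasks per owner."""
    todos = []
    for sender, text in messages:
        for match in ACTION_PATTERN.finditer(text):
            todos.append({"owner": sender, "task": match.group(1).strip()})
    return todos

messages = [
    ("Dana", "Thanks all. I will send the revised budget by Friday."),
    ("Sam", "Can you review the draft slides before Monday?"),
    ("Ali", "No actions from me."),
]
todos = extract_action_items(messages)
```

The orchestration value lies in the step after this one: keeping the resulting list current as new messages arrive, which is exactly the administrative load the feature takes off users.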



Azure


  • Claude Opus 4.6 is now available in Microsoft Foundry on Azure, bringing advanced reasoning and multi‑agent coordination to enterprise workloads. It is designed for teams building sophisticated coding assistants and operational agents, with capabilities such as long‑context handling, robust orchestration and proactive agent steering. Its strong performance on complex automation tasks helps organisations run more resilient and efficient workflows. By joining Azure’s growing roster of foundation models, Opus 4.6 provides greater flexibility for teams looking to modernise development and operational processes. Its availability enhances Azure’s position as a leading platform for agent‑driven enterprise AI. Learn more.

  • Microsoft is introducing agentic cloud operations as a modern operating model that blends AI and cloud management into a seamless system. Azure Copilot anchors this experience by coordinating agents that analyse signals from the environment and take context‑aware actions across planning, launching and optimisation. These agents help reduce operational overhead by handling routine tasks while remaining under human governance. The model prioritises resilience, efficiency and confidence when managing dynamic cloud environments. Its significance comes from enabling teams to scale operations without increasing complexity, offering a more intelligent and future‑ready way to oversee enterprise cloud workloads. Learn more.

  • The Microsoft Learn MCP Server was launched to improve how AI agents consume Microsoft Learn documentation by offering a consistent, reliable access point. Built in 2025, it focuses on scalable architecture, durable tooling and lessons gained from operating a production‑grade agent interface. The team explains how careful design decisions help agents retrieve trusted information with minimal friction. This matters because high‑quality documentation access underpins the performance and accuracy of agentic solutions, allowing organisations to build more dependable AI systems using Microsoft’s learning resources. Learn more.

  • Microsoft Foundry’s February 2026 update introduces major advances in model capability and enterprise deployment. Anthropic’s Claude Opus 4.6 and Sonnet 4.6 arrive with expanded one‑million‑token context windows and adaptive reasoning, enabling deeper analytical workflows and cost‑efficient scaling for production teams. These improvements are backed by Azure’s secure architecture, which strengthens agentic systems that learn from and act on business processes. New additions to the Foundry Local portfolio allow large multimodal models to operate inside sovereign private cloud environments, an important step for organisations requiring strict data residency and high security. Collectively, these upgrades enhance flexibility, performance and enterprise readiness. Learn more.

  • Spark Declarative Pipelines aim to transform data engineering by elevating entire pipelines to a fully declarative model. Instead of focusing on execution mechanics, teams simply define the intended transformations while Spark plans the end‑to‑end workflow. This approach enhances reliability, consistency and maintainability, reducing operational overhead that often slows delivery. By extending declarative principles, which traditionally apply to single queries, to complete pipelines, organisations can streamline development and reduce errors caused by manual orchestration. The shift matters because it allows analytics and AI teams to spend more time modelling data and less time managing infrastructure, resulting in faster, safer and more scalable data operations. Learn more.
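
The declarative idea behind pipelines like those above can be sketched without Spark at all. In this stdlib Python toy (not the actual Spark API), each step declares what it depends on, and a small planner derives the execution order instead of the author wiring it by hand:

```python
from graphlib import TopologicalSorter

PIPELINE = {}

def step(name, depends_on=()):
    """Register a transformation and its declared dependencies."""
    def register(fn):
        PIPELINE[name] = (fn, tuple(depends_on))
        return fn
    return register

@step("raw_orders")
def load_orders(inputs):
    return [{"order": 1, "amount": 120}, {"order": 2, "amount": 80}]

@step("clean_orders", depends_on=["raw_orders"])
def clean(inputs):
    return [r for r in inputs["raw_orders"] if r["amount"] > 0]

@step("daily_revenue", depends_on=["clean_orders"])
def revenue(inputs):
    return sum(r["amount"] for r in inputs["clean_orders"])

def run_pipeline():
    """The 'engine': topologically sort declared steps, then execute."""
    graph = {name: deps for name, (_, deps) in PIPELINE.items()}
    results = {}
    for name in TopologicalSorter(graph).static_order():
        fn, deps = PIPELINE[name]
        results[name] = fn({d: results[d] for d in deps})
    return results

results = run_pipeline()
```

Because ordering, and in a real engine retries and incremental refresh, is derived from declarations rather than hand-written orchestration code, the error class the bullet mentions largely disappears.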



Open-Source


  • LakeBench v1.0.0 introduces a multi‑modal Python framework designed to benchmark lakehouse compute engines across real‑world ELT scenarios. It enables teams to evaluate engine performance through workloads such as bulk loads, incremental merges, transformations and maintenance tasks, offering a rounded view of system behaviour beyond traditional query benchmarks. The release signals an important shift towards standardising how organisations validate lakehouse performance at scale. By providing consistent, extensible workloads, LakeBench helps data and analytics teams compare technologies with greater confidence and transparency, which is increasingly valuable as lakehouse architectures continue to mature and diversify. Learn more.

  • Delta Lake’s introduction of catalog‑managed tables marks a major shift in how open table formats are governed. Instead of coordinating transactions through the underlying filesystem, table state is now controlled directly by the catalogue, which becomes the authoritative system for identity, discovery and access control. This change simplifies governance and reduces the risk of inconsistencies that can arise when multiple engines interact with the same data. As ecosystems standardise around catalogue‑centric architectures, this update strengthens reliability for multi‑engine environments and aligns Delta Lake with the broader industry movement towards unified metadata layers that support scalable and secure data operations. Learn more.

  • Polars outlines how Apache Airflow can be used to schedule and orchestrate Polars Cloud queries, enabling teams to automate workloads ranging from quick fire‑and‑forget jobs to sophisticated multi‑stage pipelines. The guidance explains how Airflow manages job submission, parallel execution and monitoring while keeping service account credentials isolated from DAG code. This integration is important because it allows analytics teams to scale Polars workloads through a mature orchestration layer that supports dependency management and operational oversight. As cloud‑native data processing grows, combining Polars’ performance with Airflow’s orchestration capabilities offers a flexible approach to building reliable production pipelines. Learn more.

  • Daft expands its multimodal data capabilities with daft.File, a feature designed to let teams work seamlessly with any file type at scale. Instead of loading entire files into memory, daft.File passes lightweight references through a distributed execution engine, opening files only when required. This approach supports efficient parallel processing and enables handling of large or complex datasets without memory strain. The introduction of complementary types such as daft.VideoFile and daft.AudioFile widens support for increasingly diverse analytics workloads. These improvements matter because they streamline file-heavy pipelines, reduce infrastructure overheads and give data teams more flexibility when building advanced AI workflows. Learn more.

  • Ibis positions itself as a system that does not literally understand SQL text but fully understands the relational operations users wish to perform. Through its dataframe‑style API, users describe transformations in Python, and Ibis translates these into SQL suited to the target backend. This gives analysts the familiarity of SQL‑like semantics without the burden of writing or adapting raw SQL for each engine. The capability is important because organisations increasingly operate multiple data platforms, and Ibis offers a way to maintain consistent logic across them. By compiling intent rather than syntax, Ibis simplifies complex pipelines and improves long‑term portability. Learn more.
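
Ibis's "compiling intent rather than syntax" idea can be made concrete with a deliberately tiny toy, which is not Ibis's real API: the same filtered-aggregate expression, built once in Python, renders to SQL with backend-specific quoting:

```python
class Table:
    """A minimal expression builder holding a table name and filters."""
    def __init__(self, name):
        self.name = name
        self.filters = []

    def filter(self, condition):
        self.filters.append(condition)
        return self

    def count(self):
        return Count(self)

class Count:
    def __init__(self, table):
        self.table = table

    def to_sql(self, dialect="duckdb"):
        # Dialect differences live in the compiler, not in user code;
        # e.g. Spark SQL quotes identifiers with backticks.
        quote = '"' if dialect == "duckdb" else "`"
        where = ""
        if self.table.filters:
            where = " WHERE " + " AND ".join(self.table.filters)
        return f"SELECT COUNT(*) FROM {quote}{self.table.name}{quote}{where}"

expr = Table("orders").filter("amount > 100").count()
print(expr.to_sql("duckdb"))  # double-quoted identifier
print(expr.to_sql("spark"))   # backtick-quoted identifier
```

In Ibis proper the expression tree is far richer, but the portability argument is the same: logic is written once against the tree, and each backend gets SQL it actually accepts.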



Industry


  • According to Gartner, a substantial proportion of generative AI projects are abandoned because companies fail to bridge the gap between experimentation and scalable deployment. Frequent blockers include inadequate data readiness, unclear business outcomes and increased operational costs that erode confidence in long‑term value. Gartner argues that organisations can improve success rates by strengthening governance, investing in high‑quality data and defining benefits early. This perspective is important for leaders seeking to avoid the pattern of short‑lived pilots by building GenAI programmes that demonstrate tangible impact and can ultimately support more advanced, enterprise‑grade capabilities. Learn more.

  • Google outlines how integrating human oversight into generative AI workflows leads to higher‑quality outputs and more dependable results. Humans guide the process by providing context, validating information and adjusting responses where necessary, allowing AI to operate as a supportive collaborator rather than an autonomous decision‑maker. This human‑centred framework helps maintain accuracy, encourages better use of domain expertise and strengthens user confidence in AI‑assisted work. Its significance lies in ensuring that rapid automation does not compromise standards, making it easier for organisations to adopt GenAI responsibly across diverse teams and functions. Learn more.

  • Google announces the availability of Gemini 3.1 Pro across Google Cloud, offering developers and business teams access through Gemini CLI, Gemini Enterprise and Vertex AI. The release strengthens the platform’s intelligence capabilities, enabling more advanced reasoning and richer multimodal experiences. Its broad availability helps organisations embed the latest model improvements directly into development workflows and enterprise applications. This matters because access to higher‑performance models enables faster innovation, more capable automation and better integration between AI tools and existing cloud infrastructure, accelerating the delivery of production‑ready GenAI solutions. Learn more.

  • Claude Sonnet 4.6 arrives as a major capability uplift, rolled out across all Anthropic platforms from Claude Cowork to Claude Code and cloud integrations. The free tier has been significantly improved, now including advanced features like file creation, connectors, skills and compaction by default. These additions make it easier for users to handle complex tasks such as structured analysis and workflow orchestration. This release is important because it expands access to powerful tools that previously sat behind paid tiers and highlights Anthropic’s intention to accelerate practical, safe and widely accessible AI adoption across industries. Learn more.

  • OpenAI and Microsoft reaffirm their partnership, emphasising a strong and central relationship built on years of shared research, engineering and product progress. Both teams continue to collaborate closely while a non-binding memorandum of understanding shapes the next stage of cooperation, with detailed agreements still being finalised. Their intellectual property arrangements remain intact, indicating long-term continuity. This announcement is significant because it signals strategic alignment at a time of rapid AI growth, providing clarity for organisations investing in the Microsoft and OpenAI ecosystem for dependable, enterprise-ready innovation. Learn more.


Taken together, this month’s advancements reflect a sector moving beyond experimentation towards structural transformation. Power BI and Fabric are becoming more tightly integrated, Copilot is stepping into operational roles once reserved for humans, and Azure is setting the tempo for scalable, secure model deployment. Open‑source projects are simultaneously raising expectations around portability and performance. These shifts challenge leaders to think not only about what AI can automate today but how their organisations will function when intelligence underpins every workflow. We will be back next month to explore what comes next.

Stay in the Know


Get notified when we post something new by following us on X and LinkedIn.