Data Intelligence: entering the era of understanding
Speed, security, profitability, agility… Data intelligence platforms break down silos and transform data into strategic intelligence. A quantum leap from “data-driven” to “intelligence-driven”.
For a decade, organisations have worked to overcome the limits of traditional databases by dismantling the silos inherited from legacy SQL architectures.
Today, they operate on solid, proven foundations commonly referred to as the “Modern Data Stack”. Major platforms such as Databricks, Snowflake, Microsoft Fabric, AWS and Google Cloud now provide environments capable of centralising and processing massive volumes of data.
However, business value remains well below expectations. Decision-makers have access to indicators, yet they struggle to obtain an actionable view of their business. The cause is structural: their data platforms do not fundamentally understand the data within their organisations, nor how it is used.
Data intelligence platforms are changing the game thanks to AI. They capture the semantics of enterprise data and make it directly usable by business units. By unifying governance, advanced analytics, generative AI and automation, they transform data into an active agent. This is the principle of “Data as an Agent”. From now on, data anticipates. It recommends. Better still, it acts.
The stakes are therefore high for data-driven companies preparing to take the leap towards becoming intelligence-driven. This transition now determines operational performance, decision-making speed and competitiveness.
The observation: technical maturity seeking new standards
The data market has undergone significant consolidation in recent years. Hyperscalers such as AWS, Azure and Google Cloud now offer comprehensive and integrated solutions. The era of “best-of-breed”, where each company chose its tools separately, is coming to an end.
Today, organisations are instead adopting comprehensive and integrated ecosystems capable of managing the entire data lifecycle.
A case in point: Fivetran, a data integration specialist, and dbt Labs, publisher of the dbt transformation tool, have announced their merger¹.
The stated ambition was clear: to “simplify enterprise data management with a unified platform that powers analytics and AI at scale”². However, some fundamental challenges remain.
Data observability and quality: ensuring reliability for AI
It is important to note that an integrated platform is not enough: data quality and observability remain major challenges. Practices such as Data Mesh³ and Data Products⁴ were developed precisely to address them.
Data Mesh distributes data management among business teams to speed up data availability.
Data Products package reliable, well-documented datasets so that teams can consume them directly.
Despite these developments, trust in data remains the main barrier to AI adoption. Admittedly, we have the pipelines, but we do not yet have standardised quality “sensors”.
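To make the idea concrete, here is a minimal sketch of what such a quality “sensor” could look like: two declarative checks, freshness and null rate, run against a data product before it is exposed. The thresholds and structures are assumptions for illustration, not an emerging standard.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str


def check_freshness(last_loaded_at: datetime, max_age: timedelta) -> CheckResult:
    """Fail if the dataset has not been refreshed within the allowed window."""
    age = datetime.now(timezone.utc) - last_loaded_at
    return CheckResult("freshness", age <= max_age, f"age={age}")


def check_null_rate(rows: list[dict], column: str, max_rate: float) -> CheckResult:
    """Fail if the share of NULLs in a critical column exceeds the threshold."""
    nulls = sum(1 for row in rows if row.get(column) is None)
    rate = nulls / len(rows) if rows else 1.0
    return CheckResult("null_rate", rate <= max_rate, f"null rate {rate:.1%}")
```

Run systematically on every data product, checks of this kind are what turn “we have pipelines” into “we can trust what flows through them”.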
The disconnect between applications and data: a barrier to data exploitation
In many companies, the teams integrating applications (transactional systems) and the analytics teams still work in silos.
This disconnect significantly hinders data exploitation. It is the main bottleneck for companies seeking to make their data platforms operational in real time.
The explosion of operational and real-time workloads
Data platforms, long confined to batch processing, are now entering the operational arena. To support these workloads, they now incorporate high-performance storage capabilities, such as scalable relational SQL databases like PostgreSQL.
The challenge therefore remains rationalisation. Why maintain separate systems? The data platform can serve both the strategic dashboard and the real-time customer application.
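As a minimal sketch, assuming a PostgreSQL instance inside the platform, a psycopg connection and a hypothetical orders table, the same engine can answer both workloads:

```python
import psycopg  # psycopg 3

DSN = "postgresql://app:secret@data-platform:5432/core"  # assumed connection string

with psycopg.connect(DSN) as conn:
    # Analytical workload: monthly revenue feeding the strategic dashboard.
    dashboard_rows = conn.execute(
        "SELECT date_trunc('month', ordered_at) AS month, sum(amount) "
        "FROM orders GROUP BY 1 ORDER BY 1"
    ).fetchall()

    # Operational workload: a single-row lookup serving the customer application.
    order_status = conn.execute(
        "SELECT status, eta FROM orders WHERE order_id = %s", (42,)
    ).fetchone()
```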
This convergence is forcing a rethink of integration choices within the information system. The aim is to bring greater fluidity between the core business and analysis.
Data Intelligence: making data intelligible for AI
Moving from a “Data & AI platform” to a “Data Intelligence Platform” means bringing intelligence closer to the data.
The Data as a Product approach marked a crucial step in holding business domains accountable and ensuring reliable, packaged data.
With the rise of generative AI, the focus is shifting. Organisations now want data that is self-explanatory and acts autonomously. The data agent (“Data as an Agent”) embodies this ambition.
The principle is to equip each data product with an agentic interface capable of reasoning, interacting and acting within an ecosystem of agents. In other words, transforming passive data into a conversational entity. How can this be achieved? Three dimensions structure this approach.
The data agent: your conversational assistant for data
The data agent goes beyond simply displaying SQL tables. It brings data into the field of language. Powered by its domain knowledge, it establishes a direct dialogue with information in natural language, without an intermediary. To put it simply, the data becomes the user’s interlocutor.
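A heavily simplified sketch of that dialogue loop, where `llm` and `run_sql` stand in for any LLM client and query engine (both injected, both assumptions):

```python
from typing import Callable

# Assumed schema description; a real agent would pull this from the catalogue.
SCHEMA_CONTEXT = """
Table orders(order_id, customer_id, amount, ordered_at, status)
'amount' is in euros; 'status' is one of ('open', 'shipped', 'cancelled').
"""


def ask_data_agent(
    question: str,
    llm: Callable[[str], str],        # any LLM completion client
    run_sql: Callable[[str], list],   # the platform's query engine
) -> str:
    # 1. Ground the question in the domain: schema plus business definitions.
    sql = llm(f"{SCHEMA_CONTEXT}\nTranslate into one SQL query: {question}")
    # 2. Execute against the platform, then answer in natural language.
    rows = run_sql(sql)
    return llm(f"Question: {question}\nRows: {rows}\nAnswer briefly in plain language.")
```

The user never sees SQL: a question goes in and an answer comes back in natural language, which is what “data as the user’s interlocutor” means in practice.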
Native intelligence and ontologies: understanding data to act more effectively
A data agent’s intelligence is native to the data: it relies directly on the meaning and business context of the information. For AI to understand data, the data must be described using rich semantic models, known as ontologies, which formalise key concepts and their relationships. It is this framework that data agents draw on to correctly interpret business nuances.
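A minimal sketch of such an ontology, using the rdflib library and an invented example.com namespace; a production model would use richer vocabularies (OWL, SKOS) and business-validated definitions:

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

BIZ = Namespace("http://example.com/ontology/")  # assumed namespace
g = Graph()

# Concepts: formalise what a Customer and an Order mean in this business.
g.add((BIZ.Customer, RDF.type, RDFS.Class))
g.add((BIZ.Order, RDF.type, RDFS.Class))
g.add((BIZ.Order, RDFS.comment,
       Literal("A confirmed purchase; 'amount' is net of VAT.")))

# Relationship: every Order is placed by a Customer.
g.add((BIZ.placedBy, RDF.type, RDF.Property))
g.add((BIZ.placedBy, RDFS.domain, BIZ.Order))
g.add((BIZ.placedBy, RDFS.range, BIZ.Customer))

print(g.serialize(format="turtle"))
```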
A2A (Agent-to-Agent) protocols: communication protocols for agent interoperability
Emerging standards make these agents interoperable: MCP (Model Context Protocol) connects an agent to data and tools, while A2A (Agent2Agent) lets agents communicate with one another.
A “Sales” agent can independently question a “Logistics” agent to resolve a delivery issue. The platform then acts as the conductor of these interactions.
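The real A2A and MCP specifications define transport, discovery and security in detail; the toy sketch below only illustrates the choreography, with every name invented:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Agent:
    """Toy agent: a name plus the skills it exposes to other agents."""
    name: str
    skills: Dict[str, Callable[[dict], dict]] = field(default_factory=dict)

    def handle(self, skill: str, payload: dict) -> dict:
        return self.skills[skill](payload)


# The platform acts as conductor: a registry through which agents find each other.
registry: Dict[str, Agent] = {}

registry["logistics"] = Agent("logistics", {
    "delivery_status": lambda p: {"order": p["order"], "eta": "2025-11-28"},
})

# The Sales agent autonomously questions Logistics about a late delivery.
answer = registry["logistics"].handle("delivery_status", {"order": 4217})
print(answer)  # {'order': 4217, 'eta': '2025-11-28'}
```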
Semantics: a prerequisite for AI visibility
Once data has been quality-assured, it must be made understandable to AI. Data that is not described by a semantic model is invisible to the agents of tomorrow.
New imperatives: AI FinOps and ethics
This new architecture cannot emerge without economic efficiency, trust and ethics. Analyst firms such as Gartner and Forrester now consider these factors critical.
Economic efficiency (AI FinOps): making AI a controlled lever of value
The proliferation of AI agents and calls to large language models (LLMs) can lead to a sharp rise in costs. Each request consumes compute resources and generates a measurable expense. To remain sustainable, a Data Intelligence Platform must incorporate effective cost governance. It must optimise every use of AI.
This approach, known as AI FinOps, applies the principles of financial management already proven in the cloud to AI. The idea is to use the right model at the right time for the right purpose. In many cases, smaller, specialised models (SLMs) offer comparable performance at a significantly lower cost. They should be preferred when the complexity of the task does not justify the use of an LLM.
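A minimal sketch of that routing principle; model names, prices and the complexity heuristic are all assumptions:

```python
from dataclasses import dataclass

# Assumed cost per 1,000 tokens, for illustration only.
PRICES = {
    "slm-specialised": 0.0002,  # small, task-specific model
    "llm-general": 0.0150,      # large, general-purpose model
}


@dataclass
class Usage:
    model: str
    tokens: int

    @property
    def cost(self) -> float:
        return self.tokens / 1000 * PRICES[self.model]


def pick_model(task: str) -> str:
    """Crude routing heuristic: reserve the LLM for genuinely complex requests."""
    complex_markers = ("explain", "reason about", "compare", "why")
    needs_llm = any(marker in task.lower() for marker in complex_markers)
    return "llm-general" if needs_llm else "slm-specialised"


ledger = [
    Usage(pick_model("classify this support ticket"), tokens=350),
    Usage(pick_model("explain why churn rose in Q3"), tokens=2_200),
]
print(f"total spend: ${sum(u.cost for u in ledger):.4f}")  # attributable per request
```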
Trust and ethics
As it becomes autonomous, the data agent must inherently comply with confidentiality and ethical rules. This involves automatically filtering sensitive data and maintaining full traceability (lineage) of the decisions AI makes.
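A minimal sketch of both mechanisms, masking plus an audit trail; the regex is deliberately naive and the log structure is an assumption:

```python
import re
from datetime import datetime, timezone

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")  # naive pattern, illustration only

audit_log: list[dict] = []


def redact(text: str) -> str:
    """Mask sensitive identifiers before they reach (or leave) an AI agent."""
    return EMAIL.sub("[REDACTED_EMAIL]", text)


def traced_answer(agent: str, question: str, answer: str, sources: list[str]) -> str:
    """Record which agent answered what, from which datasets, and when."""
    audit_log.append({
        "agent": agent,
        "question": redact(question),
        "sources": sources,  # upstream datasets: the lineage of the decision
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return redact(answer)
```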
Data as an intelligence system
Ultimately, moving from a “Data & AI Platform” to a “Data Intelligence Platform” means bringing intelligence closer to data.
For organisations, the roadmap is based on three key areas.
Reuniting application integration and data
This involves removing organisational barriers to support operational use cases and reduce latency.
Investing in the “Semantic Layer”
The goal is no longer just to clean data, but to model it so that a machine can understand it.
Adopting agentic design under strict cost and ethical control
This means anticipating architectures where data is consumed by agents (via protocols such as MCP) rather than through simple SQL queries, as the sketch below illustrates.
This approach requires strict cost management (AI FinOps) and mechanisms to control ethics, security and traceability.
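To make the third area concrete, here is a sketch of a data product exposed to agents as an MCP tool rather than a raw SQL endpoint. It assumes the FastMCP interface of the official MCP Python SDK; the tool, its docstring and the figures are invented.

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("sales-data-product")


@mcp.tool()
def monthly_revenue(month: str) -> float:
    """Net revenue (EUR, ex-VAT) for a month given as 'YYYY-MM'."""
    # In a real platform this would query the governed data product, with
    # semantic definitions, cost metering and access policies applied around it.
    return {"2025-09": 1_240_500.0}.get(month, 0.0)


if __name__ == "__main__":
    mcp.run()
```

An agent calling this tool receives not just rows but a documented, governed capability, which is the practical difference between a table and a data product.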
References
1. “Fivetran and dbt Labs Unite to Set the Standard for Open Data Infrastructure”, fivetran.com, October 13, 2025.
2. Ibid.
3. Christophe Heng, Maria José Lopez, Gontran Pubez, “Le Data Mesh, effet de mode ou véritable outil de transformation de votre business model ?”, groupeonepoint.com, 2025.
4. Grégory Lecointe, “Obtenir un avantage concurrentiel grâce aux Data Products : 4 leviers d’optimisation clés”, groupeonepoint.com, 2025.