AI and data integration: an indispensable symbiosis

To fully realize the benefits of AI, companies must first lay a solid foundation, argues Pim Simons, Integration Domain Lead at Codit. Breaking through data silos via data integration is the cornerstone for powerful AI applications.

Pim Simons

AI continues to be a defining factor this year, with forecasts pointing to a $250 billion market and generative AI as the growth engine. In their quest for increased efficiency, better decision-making and higher customer satisfaction, organizations are exploring the full potential of AI for their processes, with more than half of companies expecting to increase their AI spending in 2025.


“There can be no AI without data integration. The more qualitative, consistent and comprehensive the data, the better the output of AI applications.”

Pim Simons, Integration Domain Lead at Codit

Integration lays the foundation

Data integration is essential for powerful AI applications, emphasizes Pim Simons, Integration Domain Lead at Codit. “The more qualitative and consistent the data, the better the AI output.” Integration brings together data from different systems and turns it into valuable insights.

“In large companies with diverse IT systems, integration is the heart of the business,” Francis Defauw, Chief Portfolio & Marketing Officer at Codit, says. “Without an integration layer, the business does not function. Use cases rely on these data flows.”

Francis Defauw

A concrete example: a supplier of fresh fruit and vegetables optimizes its ordering and delivery processes by accurately predicting demand, which enables more efficient personnel planning and less waste. Data integration is crucial here.

Accurate demand forecasting requires data from CRM, inventory and transportation systems. That data sits in separate applications, known as data silos. Integration is needed to extract value from this data and make better forecasts. With data integration, companies operate more efficiently and respond better to market movements and customer needs, adding strategic value.
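To make this concrete, here is a minimal sketch in Python, with entirely hypothetical system exports and figures, of how records from CRM, inventory and transport silos might be merged into one view per product and date: the kind of combined dataset a demand forecast would be built on.

```python
from collections import defaultdict

# Hypothetical, already-extracted records from three separate systems (data silos).
crm_orders = [
    {"product": "strawberries", "date": "2025-06-02", "ordered_kg": 120},
    {"product": "lettuce", "date": "2025-06-02", "ordered_kg": 80},
]
inventory_levels = [
    {"product": "strawberries", "date": "2025-06-02", "stock_kg": 40},
    {"product": "lettuce", "date": "2025-06-02", "stock_kg": 95},
]
transport_slots = [
    {"product": "strawberries", "date": "2025-06-02", "capacity_kg": 150},
    {"product": "lettuce", "date": "2025-06-02", "capacity_kg": 150},
]

def combine(*sources):
    """Merge records from the silos into one view per (product, date)."""
    combined = defaultdict(dict)
    for source in sources:
        for record in source:
            combined[(record["product"], record["date"])].update(record)
    return combined

# The combined view is what a demand-forecasting model would consume.
for key, row in combine(crm_orders, inventory_levels, transport_slots).items():
    print(key, row)
```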

What makes data integration complex?


Data integration goes beyond mere data collection; it is about bringing data from different systems together into a coherent whole, following a clear reference architecture. “Instead of linking systems directly, we bring all the data together through an integration layer,” Pim explains. “This makes the data flows more manageable and provides the flexibility to add new functionality quickly.”

At many companies, data resides in data silos spread across different applications such as ERP, CRM and WMS. Those silos make it difficult to combine data. “That alone makes it very complex and again underscores the importance of data integration,” Pim continues. “The integration layer breaks through all those silos and brings the data together in a central environment. That provides a single source of truth, without the company having to pull data from different applications each time.”
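By way of illustration, the sketch below shows the principle in Python: instead of point-to-point links, every system publishes to one central integration layer that stores each message and passes it on to interested subscribers. The class, message types and system names are illustrative assumptions, not a description of Codit's actual platform.

```python
import json
from datetime import datetime, timezone

class IntegrationLayer:
    """Minimal illustration of a central integration layer: systems publish
    messages here instead of calling each other directly."""

    def __init__(self):
        self.store = []          # stands in for a central data store (single source of truth)
        self.subscribers = {}    # message type -> list of handlers

    def subscribe(self, message_type, handler):
        self.subscribers.setdefault(message_type, []).append(handler)

    def publish(self, source_system, message_type, payload):
        envelope = {
            "source": source_system,
            "type": message_type,
            "received_at": datetime.now(timezone.utc).isoformat(),
            "payload": payload,
        }
        self.store.append(envelope)                       # keep every message centrally
        for handler in self.subscribers.get(message_type, []):
            handler(envelope)                             # fan out to subscribed systems

layer = IntegrationLayer()
layer.subscribe("order.created", lambda msg: print("WMS picks:", json.dumps(msg["payload"])))
layer.publish("ERP", "order.created", {"order_id": "A-1001", "product": "lettuce", "qty_kg": 80})
```

Adding a new application then means subscribing it to the layer, rather than wiring it to every existing system.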

“That integration is the heart of the business; take away the integration layer and the business can no longer function.”

Francis Defauw, Chief Portfolio & Marketing Officer at Codit

Scalability and cost

Scalability is essential for data integration, especially as data volumes grow. Those looking to store massive amounts of data efficiently quickly turn to cloud-based solutions such as Microsoft Azure, which offer almost unlimited scalability. The challenge is to keep a close eye on costs.

“Those costs are usually not so much in the storage of data, but rather in the tools you use to develop and implement AI applications,” Pim explains. In addition, companies often do not know what data they will need in the future. “It is only later that they discover that they are missing certain datasets. Storing raw data right from the beginning lets you build a history that can be valuable later.”
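One possible way to act on that advice is sketched below: every incoming message is archived unchanged, partitioned by source system and date, so a history builds up even for data nobody has asked for yet. The function name and folder layout are assumptions for illustration, not a prescribed setup.

```python
import json
import uuid
from datetime import datetime, timezone
from pathlib import Path

def archive_raw(payload: bytes, source: str, root: Path = Path("raw-zone")) -> Path:
    """Store each incoming message unchanged, partitioned by source and date."""
    now = datetime.now(timezone.utc)
    target = root / source / now.strftime("%Y/%m/%d") / f"{now:%H%M%S}-{uuid.uuid4().hex}.json"
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_bytes(payload)
    return target

# Example: archive a raw CRM event before any transformation is applied.
path = archive_raw(json.dumps({"event": "order.created", "qty_kg": 80}).encode(), source="crm")
print("archived to", path)
```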

But there are practical challenges there too. “As more applications come into use, the number of data formats also grows,” Pim says. “But working with common, readable data formats makes it easier to perform more complex tasks with the data later on. Data integration is therefore preferably done in consistent formats, such as JSON or XML.”
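To show what such normalization could look like, the sketch below converts hypothetical XML and CSV exports from two silos into one consistent JSON structure. The field names and source formats are assumptions chosen purely for illustration.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

# Hypothetical silo outputs: the WMS exports XML, the CRM exports CSV.
wms_xml = "<stock><item sku='LET-01' qty='95'/><item sku='STR-01' qty='40'/></stock>"
crm_csv = "sku,ordered_kg\nLET-01,80\nSTR-01,120\n"

def wms_to_canonical(xml_text):
    """Map the WMS XML export onto the shared record shape."""
    return [{"sku": i.get("sku"), "stock_kg": int(i.get("qty"))}
            for i in ET.fromstring(xml_text).iter("item")]

def crm_to_canonical(csv_text):
    """Map the CRM CSV export onto the shared record shape."""
    return [{"sku": r["sku"], "ordered_kg": int(r["ordered_kg"])}
            for r in csv.DictReader(io.StringIO(csv_text))]

# One consistent JSON structure, whatever the original format was.
print(json.dumps(wms_to_canonical(wms_xml) + crm_to_canonical(crm_csv), indent=2))
```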

Don't forget data governance

A strong data integration strategy requires more than technology alone: it needs a combination of governance, advanced tools and automation to ensure scalability and reliability.

“Data governance is essential to ensure data quality and privacy, especially now that AI models are using business-sensitive data more often,” Pim explains. “A tool like Microsoft Purview aligns nicely with DevOps processes, providing support for data governance within large-scale data integration.”

The rise of Agentic AI

After GenAI comes the technological wave of Agentic AI, in which applications communicate with each other autonomously without human intervention. “The ultimate goal is for AI applications to make their own suggestions, such as placing orders automatically, which customers only need to confirm,” Francis outlines.

To realize that vision, the importance of data integration increases even further. “The quality and consistency of data determine whether Agentic AI works well,” Pim concludes. “Errors in the data stream can lead to wrong suggestions or decisions, possibly with unwanted consequences. That kind of autonomous system can't work without a solid integration layer.”


Pim Simons is Integration Domain Lead at Codit, a subsidiary of Proximus NXT. He specializes in cloud-based data integration. Together with his team, Pim helps organizations build scalable, future-proof data platforms.

Francis Defauw is Chief Portfolio & Marketing Officer at Codit, a subsidiary of Proximus NXT. He’s responsible for managing and optimizing advanced data integration and cloud solutions. He ensures innovation and strategic growth of Codit’s service portfolio.
