IAG finds a "middle ground" in its pursuit of data mesh

Uses cloud and data streaming building blocks stood up over several years.

IAG is a year into implementing a data mesh architecture as it continues to elevate the role of data in group operations and transformation programs.

IAG's Burak Hoban (L) with Confluent's Rhett Pearson (R) on-stage in Melbourne.

Executive manager of data platforms - data and risk, Burak Hoban, told last month’s Confluent Data in Motion Tour 2024 event in Melbourne that IAG “can’t go to the very end state of what data mesh represents, but we can find a middle ground”.

“That’s sort of what we’ve done,” Hoban said. 

“We’ve already implemented that, we’re a year into that journey, [and] my role is to really help accelerate [that].”

Two of the building blocks in IAG’s data mesh are its strategic data and analytics platform, based on Google Cloud services including BigQuery, and Confluent Cloud and connectors.

“We’re going all-in on GCP and using that as a foundational platform to build our data mesh architecture, and then we have Confluent Kafka supporting that as well as part of our journey,” Hoban said.

Both Google Cloud and Confluent are established proponents of data mesh architecture.

Data meshes are built around several core principles: treating data as a product; bringing data into a central resource while it remains owned and curated by “the domain team that is most familiar” with the dataset; allowing self-service access to data; and applying overarching standards to all data being worked with.
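
As a rough illustration of how those principles can translate into practice, a domain team might describe each dataset with a small, machine-readable contract. The sketch below is a hypothetical Python example of such a descriptor, not IAG's actual model; every name and field in it is invented.

```python
from dataclasses import dataclass

# Illustrative only: this shows how the four principles above might
# surface in code. A dataset is treated as a product, owned by the
# domain team closest to it, exposed for self-service consumption,
# and tagged against group-wide governance standards.
@dataclass
class DataProduct:
    name: str                 # e.g. "motor-claims-daily"
    domain: str               # owning business division
    owner_team: str           # the domain team that curates the data
    output_port: str          # where self-service consumers read it
    schema_version: str       # a contract consumers can depend on
    classification: str       # overarching standard, e.g. "PII"
    freshness_sla_hours: int  # what self-service consumers can expect

claims = DataProduct(
    name="motor-claims-daily",
    domain="motor",
    owner_team="motor-data-squad",
    output_port="bq://analytics.motor.claims_daily",
    schema_version="1.2.0",
    classification="PII",
    freshness_sla_hours=24,
)
```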

Hoban said that IAG had spent several years putting in place some of the building blocks considered to be foundational to a data mesh. 

He also said that IAG’s divisional organisational structure meant that aspects of a mesh architecture could work well.

“If we look at our business divisions, they’ve already got data teams, so part of that is to get them … onto a centralised platform that’s well governed,” Hoban said.

Real-time data uses grow

IAG is starting to see more use cases for real-time streaming data emerge across the organisation.

Hoban said that the potential of real-time streaming and business eventing was recognised early, in 2017-18, but at the time there were few good use cases to apply it to or to build internal capability around.

While some use cases did eventually emerge, Hoban said that “in the grand scheme of things, I think we probably were a little bit early.”

Still, Hoban said the work proved useful, given the direction in which IAG, and the broader industry, was heading.

“We knew that on the horizon IAG had transformational programs planned to shift to cloud, and we knew that more broadly in the industry we were starting to see use cases around eventing and streaming, and they looked like things we could explore as a business,” he said.

Early streaming implementations were built and managed internally with Apache Kafka, but with signs that usage of Kafka would increase, the insurer moved to a managed Kafka model, through Confluent Cloud.

Most early use cases were migrated across, and Confluent Cloud now powers other real-time streaming use cases internally.

Using the managed service freed Hoban and his team to work with internal development and engineering teams, “getting them more comfortable with Kafka” and driving up adoption.

That has led to rapid growth in usage, with Kafka supporting transformation work around customer migration, policy migration, payment notifications and two-way synchronisation of data between business tools, as well as the creation of real-time data products.
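
To make the payment-notification example concrete: an event of that kind is typically published to a Kafka topic by a small producer. The sketch below uses the confluent-kafka Python client against Confluent Cloud; the topic name, credentials and payload fields are assumptions for illustration, not details IAG has shared.

```python
import json
from confluent_kafka import Producer

# Confluent Cloud connection settings; the bootstrap address and
# API key/secret here are placeholders, not real credentials.
producer = Producer({
    "bootstrap.servers": "<cluster>.gcp.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",
    "sasl.password": "<api-secret>",
})

def on_delivery(err, msg):
    # Invoked once per message to confirm the write or surface the error.
    if err is not None:
        print(f"delivery failed: {err}")

# Hypothetical payment-notification event, keyed by policy number so
# all events for one policy land on the same partition, in order.
event = {"policy_id": "POL-12345", "amount": 820.50, "status": "RECEIVED"}
producer.produce(
    "payment-notifications",
    key=event["policy_id"],
    value=json.dumps(event),
    callback=on_delivery,
)
producer.flush()  # block until the broker has acknowledged the message
```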

“Kafka’s starting to play a critical role as part of our strategic platforms moving forward,” Hoban said.

“It’s already to the point where if we had any downtime or issues with Kafka, it impacts our ability to price quotes or to write policies, so it’s already mission critical.”

Hoban also foreshadowed increased use of Confluent's pre-built connectors to access and stream data from a variety of source systems.
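
Such connectors are declared as configuration rather than code. As a hedged illustration, the snippet below registers a hypothetical Debezium PostgreSQL source connector through the standard Kafka Connect REST API; Confluent Cloud's fully managed connectors are instead configured via its console, CLI or API, but the name-plus-config payload takes the same shape. Every value shown is invented.

```python
import json
import requests

# Hypothetical connector definition: stream change events from a policy
# database into Kafka topics prefixed with "policy". Not IAG's config.
connector = {
    "name": "policy-db-source",
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "database.hostname": "policy-db.internal",
        "database.port": "5432",
        "database.user": "replicator",
        "database.password": "<secret>",
        "database.dbname": "policies",
        "topic.prefix": "policy",
    },
}

resp = requests.post(
    "http://connect.internal:8083/connectors",  # Kafka Connect REST endpoint
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
resp.raise_for_status()  # 201 Created when the connector is accepted
```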

All-in on GCP

Separately, at a Google Cloud Summit held in Sydney in May, IAG's executive general manager of data, risk and resilience, David Abrahams, provided more details of the insurer's GCP-based data platform.

The insurer had previously only spoken briefly about its use of GCP and not about its architecture or decision-making process.

“We have a lot of data and it’s pretty important to our business, but because of our legacy, our data has become quite fragmented and siloed,” Abrahams said.

“What we needed to do was rationalise that data and make it more useful, and build a better understanding of who our customers are and what it is that they value and want us to protect. 

“We started this journey about six years ago where we first tried to solve this problem ourselves, but eventually we reached the limits of just how far and how much engineering we could really run and put against running our own data platform, so we needed to evolve to a platform that could scale and perform at the levels we needed now and into our future.”

Abrahams said IAG set Google Cloud as the foundation of its “strategic data and analytics platform”.

He added that the strategic data platform “now leverages Google Data Platform with advanced analytics using Vertex AI and machine learning on BigQuery.”
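
“Machine learning on BigQuery” generally refers to BigQuery ML, which trains models with plain SQL inside the warehouse. As a rough sketch of that capability via the Python client (the dataset, table and column names below are invented, not IAG's):

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default GCP credentials

# Train a logistic regression model directly in BigQuery; .result()
# blocks until the training job completes.
client.query(
    """
    CREATE OR REPLACE MODEL `analytics.churn_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
    SELECT tenure_years, product_count, claim_count, churned
    FROM `analytics.customer_features`
    """
).result()
```

A model trained this way can then be queried from SQL with ML.PREDICT, which is the kind of self-service deployment Abrahams describes below.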

Among the results produced by the platform is an ability to “create more personalised experiences that help customers navigate the complex world of insurance.”

There had been internal-facing benefits as well, he said, highlighting capabilities relevant to the data mesh discussion.

“It helped us realise our goal of empowering our business to be more self-sufficient when using data,” Abrahams said.

“Business teams are now deploying new models into production themselves without any additional tech support.”
