Axa’s Head of Investment Data Platform Discusses Data, Analytics and Innovation

Yves Chauvin, Head of Investment Data Platform at Axa, explored the use of data analytics and machine learning technologies to drive innovation during his keynote presentation to Argyle’s CIO membership at the 2018 CIO Leadership Forum: Data Strategy & Innovation in San Francisco on Feb. 13. In his presentation, “Forecast of Financial Measures: Form Opinions Using Machine Learning,” Chauvin provided insights into the increasing importance of data and analytics.

Data analysis can make or break a business, regardless of its size or industry. A company equipped to analyze its data can generate actionable insights that improve customer relationships, streamline everyday processes, and increase revenue.

The sheer volume of data available to today’s businesses is overwhelming. Yet companies must allocate time and resources to collect and analyze both structured and unstructured data; if they fail to extract value from that data, they risk missing out on valuable customer and business insights.

Now, artificial intelligence (AI) is transforming the way that businesses perform data analysis. AI helps companies streamline data analysis, and as such, may prove to be exceedingly valuable to businesses across all industries in the years to come.

“The world is changing, and AI is part of that change,” Chauvin stated. “There is a huge amount of data, and we’ll try new solutions [to analyze this information] that may depend on AI.”

Machine learning has transformed the way that businesses analyze data too.

With machine learning, businesses can empower machines to collect and analyze information day after day. The technology speeds up and automates data analysis, allowing companies to surface in-depth insights on an ongoing basis.
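
To make this concrete, the following is a minimal sketch of the kind of automated analysis described above, in the spirit of the talk’s “Forecast of Financial Measures” theme. The file name, feature columns, and model choice are illustrative assumptions, not details from Chauvin’s presentation.

```python
# Minimal sketch: training a model to forecast a financial measure.
# The data file, feature names, and model choice are illustrative assumptions,
# not a description of Axa's actual platform.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Historical observations collected by the data platform (hypothetical file).
history = pd.read_csv("financial_measures.csv", parse_dates=["date"])

features = ["revenue_growth", "volatility", "sector_index"]  # illustrative inputs
target = "next_quarter_margin"                               # illustrative output

model = GradientBoostingRegressor()
model.fit(history[features], history[target])

# Score the newest record as it arrives, automating part of the analysis.
latest = history.sort_values("date").tail(1)
print("Forecast:", model.predict(latest[features])[0])
```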

Yet AI and machine learning alone are insufficient for businesses. Instead, companies must understand how these technologies work to realize their full value.

“We’re trying to put the knowledge of expert people into code,” Chauvin said. “You pick the brains of people … understand how they function and code it.”

Although businesses can quickly access massive amounts of data from multiple sources, these companies must have processes in place to perform comprehensive data analysis. That way, organizations can identify patterns and trends hidden within data sets and maximize the value of data.

With a thorough approach to data analysis, organizations can distinguish poor data sources from rich ones, and then apply AI and machine learning tools to collect data consistently from the best sources.

“Data quality is a constant struggle. Every piece of data counts, and we need to look carefully at all of our data,” Chauvin indicated.
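
One simple way to compare candidate sources is to compute the same data-quality report for each feed. The sketch below, with illustrative column names and checks for completeness, duplicate keys, and freshness, is an assumption about how such checks might look rather than a description of Axa’s tooling.

```python
# Sketch: basic data-quality checks for comparing sources.
# Column names and the choice of checks are illustrative assumptions.
import pandas as pd

def quality_report(df: pd.DataFrame, key: str, timestamp: str) -> dict:
    return {
        "completeness": 1.0 - df.isna().mean().mean(),     # share of filled cells
        "duplicate_keys": int(df[key].duplicated().sum()),  # repeated identifiers
        "latest_record": df[timestamp].max(),               # how fresh the feed is
    }

# Compare two candidate sources on the same checks (hypothetical frames):
# report_a = quality_report(source_a, key="security_id", timestamp="as_of")
# report_b = quality_report(source_b, key="security_id", timestamp="as_of")
```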

Data analysis is also an ongoing effort: a single data set collected from a single source is unlikely to give a company the insights it needs to improve its decision-making.

For today’s organizations, it is important to use technologies and systems to manage large data sets. If organizations perform regular data analysis, they may be better equipped than ever before to keep pace with customers in a rapidly evolving global marketplace.

“Data is not stationary in a statistical sense. Things change, and we need to account for that over time,” Chauvin said.
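
One common way to account for non-stationary data, sketched below as a hypothetical example rather than Axa’s method, is to refit a model on a rolling window of recent observations so that older behavior gradually drops out.

```python
# Sketch: retraining on a rolling window so the model tracks non-stationary data.
# The window length and column names are assumptions for illustration.
import pandas as pd
from sklearn.linear_model import LinearRegression

def rolling_forecasts(history: pd.DataFrame, features, target, window=252):
    """Refit on the most recent `window` rows before each new prediction."""
    preds = []
    for end in range(window, len(history)):
        train = history.iloc[end - window:end]  # most recent observations only
        model = LinearRegression().fit(train[features], train[target])
        preds.append(model.predict(history.iloc[[end]][features])[0])
    return preds
```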

Furthermore, businesses must understand the true value of the data at their disposal; an organization that is unsure of what its data represents and why it matters is unlikely to obtain meaningful insights from it.

“We need to explain what the machines do and explain what the code does. We have to consider the sources of data, and if we cannot, then we’re just lucky,” Chauvin noted. “The explanatory power of data is a very high priority for us.”
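
Explaining what a model does can start with standard inspection tools. The sketch below applies permutation importance to synthetic data with illustrative feature names; it shows the general idea, not the specific explainability methods used at Axa.

```python
# Sketch: reporting which inputs drive a fitted model's predictions.
# Synthetic data and feature names are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # three illustrative inputs
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

model = RandomForestRegressor(random_state=0).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)

for name, score in zip(["revenue_growth", "volatility", "sector_index"],
                       result.importances_mean):
    print(f"{name}: {score:.3f}")
```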

Deploying an effective data analysis strategy requires hard work and patience.

An organization that prioritizes data collection and analysis can build substantial data sets over time. It can then make those data sets available to employees and teach them how to perform data analysis, helping them reach insights they would struggle to obtain elsewhere.

“Analytics improve with time and people,” Chauvin stated. “People may go, but the code they use stays, and we know what the code does.”

A consistent process for collecting and managing data is essential. That process must account for data from a wide range of sources, and it should be updated so the organization can capitalize on new data analysis technologies as they become available.

“We get data, we clean it and we integrate it,” Chauvin pointed out. “We put all of the data into a universal schema, and this allows us to see where data comes from in the beginning and until the end of the process.”
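
A toy version of that “universal schema” idea appears below: each source-specific record is cleaned and mapped onto one common record type that carries a source field, so lineage stays visible from ingestion onward. The schema and field names are illustrative assumptions, not Axa’s actual design.

```python
# Sketch: normalizing records from different sources into one universal schema
# while tagging each record with its origin. Field names are illustrative.
from dataclasses import dataclass
from datetime import date

@dataclass
class UniversalRecord:
    entity: str
    measure: str
    value: float
    as_of: date
    source: str  # lineage: where the record came from

def from_vendor_a(row: dict) -> UniversalRecord:
    # Clean and map vendor-specific fields onto the common schema.
    return UniversalRecord(entity=row["name"].strip(),
                           measure=row["metric"],
                           value=float(row["val"]),
                           as_of=date.fromisoformat(row["asOfDate"]),
                           source="vendor_a")

def from_vendor_b(row: dict) -> UniversalRecord:
    return UniversalRecord(entity=row["company"].strip(),
                           measure=row["measure_name"],
                           value=float(row["amount"]),
                           as_of=date.fromisoformat(row["date"]),
                           source="vendor_b")

# Integrated view: every record carries the same fields plus its source.
records = [from_vendor_a({"name": " Acme ", "metric": "ebitda",
                          "val": "12.5", "asOfDate": "2018-02-13"}),
           from_vendor_b({"company": "Acme", "measure_name": "revenue",
                          "amount": "40.1", "date": "2018-02-13"})]
```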
