Commonwealth Bank CTO details how crap data, legacy can kill you


"Every single one of us went through grief, anger, denial, sorrow, humility”.

“I remember the day in August… August 17 of 2018. Matt [Comyn], who was head of the retail bank at the time, walked into the executive team and said, ‘we've got a big problem’. I remember the hour.”

That’s the first-hand description of the moment the Commonwealth Bank of Australia’s chief technology officer Matt Pancino got the news that sent Australia’s largest financial institution into a $700 million world of regulatory pain and a diet of humble pie stretching to the horizon.

Pancino has become intimate with regulatory demands over the past 18 months because he’s the data guy.

Not the big data guy, not the cool data guy, definitely not the marketing-hipster, new-app-waving ‘change your life’ data guy (though for the record he loves the CBA’s new mobile app and everything it delivers).

Pancino is the “old and grumpy” data guy who gets to suture reputations and systems back together after they get put through the shredder by air-punching ‘yes’ people who live in the moment and prefer spreadsheets and PowerPoint to architecture and standards.

And he’s still in his job, unlike Ian Narev, Andrew Thorburn, Ken Henry, a lot of AMP… and the ever-growing list of Royal Commission-inspired decapitations that send even the most seasoned bank executives to Seek to perform their penance.

The CBA’s CTO has a comparatively simple theory about survival, regulatory and fiscal: it essentially boils down to not building on decaying or unsound foundations.

And despite an abundance of technical passion, Pancino can simplify things with a jolting clarity.

“This is not going to be a discussion around big data and all of the great things you can do with AI. This is going to be the things that you can learn from the perspective of being in IT, being a technologist. I think that's what I'd like to cover today,” Pancino tells a packed room at the FST Future of Financial Services conference in Sydney.

The first and biggest lesson from CBA’s CTO is that Australia’s collective IT shop – call it tech, innovation, digital, etc – can’t just shrug off responsibility for the regulatory train wreck of the last 18 months.

“How did tech play a partially contributing role into the state that we're in, both from an institutional and industry point of view? More importantly, how can we convert what we've learned into becoming a simpler and better bank with the technology that we use?” Pancino asks.

“I can assure you every single one of us went through grief, anger, denial, sorrow, humility,” Pancino said.

He says that as he and 500 other staff read the Royal Commission findings, the APRA findings and the output of other probes, one common thread jumped out.

“When we came out… and decided what we were going to do to fix it, the reflection I got was, if you do a find on the APRA report, on the PDF, [the word] data comes up everywhere,” Pancino said.

“Data absolutely comes up as a systemic problem that the company is facing.”

When Pancino says systemic, he means right across both the local financial services sector and the technology sector that supports it.

A root cause of the current pain is that, as the upside of big data and analytics became obvious, there was a buying binge that never addressed some foundational issues.

“We built data lakes. We built big warehouses. We bought appliances from vendors ... I know some of you are in here, we saw you before... We bought storage. We bought as much software as we could buy, and we employed data scientists, tens of them, and hundreds of them. We spent hundreds of millions of dollars,” Pancino said.

“But my question is, did we take enough time to understand that the data we were pumping into the lakes and all of the warehouses came from our core systems, our legacy platforms, our heterogeneous capabilities? Did we understand where the data was taken from? Did we understand who was collecting it? Was it accurate?” Pancino’s list goes on.

“You'd have to argue that we didn't. Because when the crisis hit, you're into this strange situation where data in the enterprise in the IT systems is still hard to wrangle, and yet we've spent millions of dollars,” he says.
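Pancino’s unanswered questions are, in architecture terms, questions of data lineage: provenance metadata that should travel with every dataset into the lake. Here is a minimal sketch of what capturing and enforcing it could look like; the field names and the gating rule are assumptions for illustration, not CBA’s actual schema or tooling:

```python
from datetime import datetime, timezone

# Hypothetical provenance record attached to an extract before it lands in the
# lake - answering up front where the data came from, who collected it, and
# whether its accuracy has been verified.
provenance = {
    "dataset": "customer_accounts_daily",
    "source_system": "legacy-core-banking",
    "collected_by": "nightly-batch-extract",
    "extracted_at": datetime.now(timezone.utc).isoformat(),
    "quality_checked": False,
}

# A gate the load job could enforce: unverified data never reaches the lake.
def admit_to_lake(record: dict) -> bool:
    return bool(record.get("quality_checked")) and record.get("source_system") is not None

print(admit_to_lake(provenance))  # False: this extract hasn't been quality-checked
```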

Like that of the regulators who supervise him, Pancino’s patience for repeated errors is wearing visibly thin. He points to the global financial crisis of a decade ago and the consequences of raw ambition overriding systemic integrity.

“We should have learnt as a community,” Pancino said.

“If you speak to any IT financial services executive on the East Coast of the USA who managed to keep their job during the global financial crisis, they will tell you about the importance of IT controls and data controls in your IT ecosystem.

“There are extraordinary stories ... we have some of those gentlemen and women in our organisation. They talk about some of the largest financial institutions in the United States [not] being [able to] generate a balance sheet for up to eight weeks.”

“Stories of customers, when the music stopped, actually having to generate [and] assemble customer statements with spreadsheets. Companies literally decades, hundreds of years old vanished on the spot in the global financial crisis, not because of their big data teams, but because of the data they had inside IT systems they didn't have the right controls over the top of,” Pancino said.

Which is a bit more existential than the Royal Commission’s regulatory Kabuki theatre (our words, not his).

“The thing that frustrates me about this is our inability to learn,” Pancino said.

He said he had a lightbulb moment at an investment bank presentation in the US, when a senior executive outlined what real execution on data integrity and governance actually entailed.

It came down to determining which authoritative sources of data needed to be kept, identifying the key systems of reference, investing solidly in what was retained, and then forcibly decommissioning the systems that held bad data.

It took that bank five years to pare back 8000 systems to 4000. The same bank estimated they would need to keep going for another ten years.
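To make the mechanics concrete: the approach the executive described amounts to keeping a register of systems, marking which is the authoritative system of reference for each data domain, and treating everything else as a decommissioning candidate. A minimal sketch in Python follows; the system names and fields are invented for illustration, not drawn from any bank’s actual catalogue:

```python
from dataclasses import dataclass

@dataclass
class System:
    name: str
    domains: list[str]     # data domains the system holds, e.g. "customer"
    authoritative: bool    # is this the agreed system of reference?
    data_quality_ok: bool  # has its data passed quality assessment?

# Hypothetical register - in practice this lives in a CMDB or data catalogue.
register = [
    System("core-ledger", ["accounts", "balances"], authoritative=True, data_quality_ok=True),
    System("mdm-hub", ["customer"], authoritative=True, data_quality_ok=True),
    System("legacy-crm", ["customer"], authoritative=False, data_quality_ok=False),
]

# Invest in authoritative systems with sound data; forcibly retire the rest.
keep = [s.name for s in register if s.authoritative and s.data_quality_ok]
decommission = [s.name for s in register if s.name not in keep]

print("invest in:", keep)             # invest in: ['core-ledger', 'mdm-hub']
print("decommission:", decommission)  # decommission: ['legacy-crm']
```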

Pancino said an overlooked piece of post-GFC data architecture governance work, which has now guided the CBA, was the creation of a global standards body in the form of the Enterprise Data Management Council.

“Guess what? They built an enterprise data management framework, which has very clear policies that are appropriate for how you manage the data in your IT systems.

“As a result of all the challenges, and as part of our response to the APRA inquiry, we have rewritten our policies front to back, based on the global enterprise data management standards.”

This included chasing down “manual controls”.

“It’s a spreadsheet,” Pancino said of the “manual controls” euphemism.

“Clearly, these are ineffective, unacceptable. And it is the job of technology and architecture to actually drive these manual controls out.

“We have to actually drive [this] from an architecture perspective,” Pancino said.
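In practice, driving a manual control out means turning the spreadsheet check into code that runs automatically as part of the pipeline. A minimal sketch of what such an automated control could look like is below; the function, figures and tolerance are hypothetical illustrations, not anything CBA has described:

```python
# A hypothetical automated control: reconcile a source-system total against the
# warehouse total in code, rather than eyeballing both in a spreadsheet.
def reconcile(source_total: float, warehouse_total: float, tolerance: float = 0.01) -> None:
    """Raise if the two totals diverge beyond the tolerance."""
    if abs(source_total - warehouse_total) > tolerance:
        raise RuntimeError(
            f"Control failed: source {source_total} vs warehouse {warehouse_total}"
        )

# Wired into a scheduled job, a breach halts the run and raises an alert,
# instead of being quietly pencilled over in someone's spreadsheet.
reconcile(source_total=1_000_000.00, warehouse_total=1_000_000.00)
```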
